Methods and systems for selecting and presenting content on a first system based on user preferences learned on a second system

Information

  • Patent Grant
  • Patent Number
    10,984,037
  • Date Filed
    Friday, August 29, 2014
  • Date Issued
    Tuesday, April 20, 2021
Abstract
A method of selecting and presenting content on a first system based on user preferences learned on a second system is provided. The method includes receiving a user's input for identifying items of the second content system and, in response thereto, presenting a subset of items of the second content system and receiving the user's selection actions thereof. The method includes analyzing the selected items to learn the user's content preferences for the content of the second content system and determining a relationship between the content of the first and second content systems to determine preferences relevant to items of the first content system. The method includes, in response to subsequent user input for items of the first content system, selecting and ordering a collection of items of the first content system based on the user's learned content preferences determined to be relevant to the items of the first content system.
Description
BACKGROUND OF THE INVENTION

1. Field of Invention


This invention generally relates to learning user preferences and, more specifically, to using those preferences to personalize the user's interaction with various service providers and with content query systems, e.g., to better find results to queries provided by the user and to order the results for presentation to the user.


2. Description of Related Art


Personalization strategies to improve user experience can be chronologically classified into two categories: (1) collaborative filtering and (2) content reordering. Each is summarized in turn.


Collaborative Filtering was used in the late 1990s to generate recommendations for users. The term collaborative filtering refers to clustering users with similar interests and offering recommendations to users in the cluster based on the habits of the other users. Two distinct filtering techniques—user-based and item-based—are used.


In U.S. Patent App. Pub. No. 2005/0240580, Zamir et al. describe a personalization approach for reordering search queries based on the user's preferences. The application describes a technique for learning the user's preferences and increasing the promotion level of a search result based on personalization. Zamir et al. create a user profile, which is a list of keywords and categories listing the user preferences. The profile is generated from multiple sources, such as (1) information provided by the user at the time the user registers a login, (2) information from queries that the user has submitted in the past, and (3) information from web pages the user has selected.


Some systems directed to reordering content in the context of television schedules define categories and sub-categories according to an accepted standard. User preferences are gathered using various models such as (1) user input, (2) stereotypical user models, and (3) unobtrusive observation of user viewing habits. In some implementations, these models operate in parallel and collect the user preference information.


In other systems, a set of fixed attributes is defined and all media content and all user preferences are classified using these attributes. A vector of attribute weights captures the media content and the user preferences. The systems then determine the vector product between the content vector and the user preferences vector. The system suggests content to users where the values of the vector products exceed a predetermined threshold.
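
Such an attribute-weight scheme can be sketched in a few lines of Python. The attribute set, vectors, and threshold below are illustrative assumptions, not values from any particular prior system:

# Hypothetical sketch of the fixed-attribute approach described above.
# Attribute order: [sports, news, comedy, drama]; the threshold is assumed.
THRESHOLD = 0.5

def dot(u, v):
    # Vector product between a content vector and the user preference vector.
    return sum(a * b for a, b in zip(u, v))

def suggest(content_vectors, user_prefs):
    # Suggest content whose vector product exceeds the predetermined threshold.
    return [title for title, vec in content_vectors.items()
            if dot(vec, user_prefs) > THRESHOLD]

content = {"Game Recap": [0.9, 0.1, 0.0, 0.0],
           "Evening News": [0.0, 1.0, 0.0, 0.0]}
user_prefs = [0.7, 0.2, 0.1, 0.0]
print(suggest(content, user_prefs))  # ['Game Recap']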


BRIEF SUMMARY OF THE DISCLOSURE

The invention provides methods and systems for selecting and presenting content on a first system based on user preferences learned on a second system.


Under an aspect of the invention, a user-interface method of selecting and presenting a collection of content items of a first content system in which the presentation is ordered at least in part based on content preferences of the user learned from the user selecting content of a second content system includes receiving incremental input entered by the user for incrementally identifying desired content items of the second content system, wherein each content item has at least one associated descriptive term to describe the content item. The method also includes, in response to the incremental input entered by the user, presenting a subset of content items of the second content system and receiving selection actions of content items of the subset from the user. The method further includes analyzing the descriptive terms of the selected content items to learn the content preferences of the user for the content of the second content system and determining a relationship between the content items of the first content system and the content items of the second content system. The relationship defines which learned user content preferences are relevant to the content items of the first content system. The method includes, in response to receiving subsequent incremental input entered by the user for incrementally identifying desired content items of the first content system, selecting and ordering a collection of content items of the first content system based on the learned content preferences of the user determined to be relevant to the content items of the first content system.


Under another aspect of the invention, the relationship between the content items of the first content system and the content items of the second content system is determined prior to at least one of receiving selection actions from the user and receiving subsequent incremental input entered by the user.


Under a further aspect of the invention, subsequent incremental input entered by the user is the first interaction between the user and the first content system. Under yet another aspect of the invention, the first content system has not characterized the content preferences of the user.


Under an aspect of the invention, the method also includes recording information associated with at least one of the selection actions from the user and including the information in the collection of content items.


Under a yet further aspect of the invention, the first content system and second content system are different systems. The first content system can be on a server system remote from the user and the second content system can be on a user client device. The second content system can be on a server system remote from the user and the first content system can be on a user client device.


Under an aspect of the invention, a user-interface method of selecting and presenting to a first user a collection of content items of a first content system in which the presentation is ordered at least in part based on content preferences of a second user learned from the second user selecting content of a second content system includes receiving incremental input entered by the second user for incrementally identifying desired content items of the second content system, wherein each content item has at least one associated descriptive term to describe the content item. The method also includes, in response to the incremental input entered by the second user, presenting a subset of content items of the second content system and receiving selection actions of content items of the subset from the second user. The method further includes analyzing the descriptive terms of the selected content items to learn the content preferences of the second user for the content of the second content system. The method includes determining a relationship between the content items of the first content system and the content items of the second content system. The relationship defines which learned user content preferences are relevant to the content items of the first content system. The method further includes, in response to receiving subsequent incremental input entered by the first user for incrementally identifying desired content items of the first content system, selecting and ordering a collection of content items of the first content system based on the learned content preferences of the second user determined to be relevant to the content items of the first content system.


Under another aspect of the invention, the relationship between the content items of the first content system and the content items of the second content system is determined prior to at least one of receiving selection actions from the user and receiving subsequent incremental input entered by the user.


Under a further aspect of the invention, subsequent incremental input entered by the first user is the first interaction between the first user and the first content system.


Under a yet further aspect of the invention, the first content system and second content system are the same systems.


Under yet another aspect of the invention, the selecting and ordering the collection of content items is further based on popularity values associated with the content items. Each popularity value indicates a relative measure of a likelihood that the corresponding content item is desired by the user.


Under a further aspect of the invention, the set of content items includes at least one of television program items, movie items, audio/video media items, music items, contact information items, personal schedule items, web content items, and purchasable product items. The descriptive terms can include at least one of title, cast, director, content description, and keywords associated with the content.


Under yet a further aspect of the invention, the set of content items is contained on at least one of a cable television system, a video-on-demand system, an IPTV system, and a personal video recorder.


These and other features will become readily apparent from the following detailed description wherein embodiments of the invention are shown and described by way of illustration.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

For a more complete understanding of various embodiments of the present invention, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:



FIG. 1 illustrates two modules of a learning engine.



FIG. 2 illustrates a collection of signatures of a user.



FIG. 3 illustrates a collection of signatures of a user for a single dataspace in a single location.



FIG. 4 illustrates orthogonal periodicities.



FIG. 5 illustrates overlapping periodicities.



FIG. 6 illustrates sample vectors in three vector spaces.



FIG. 7 illustrates seminormalization of signature probabilities.



FIG. 8 illustrates an example of a seminormalized signature.



FIG. 9 illustrates an example of detecting an increased level of activity associated with a content item.



FIG. 10 illustrates a context-specific personal preference information service.



FIG. 11 illustrates the local tracking and strengthening of the personal preference signatures based on user activity and the content on a mobile device.



FIG. 12 illustrates the information flow when a user device makes a request to a service provider.



FIG. 13 illustrates an alternate information flow when a user device makes a request to a service provider.



FIG. 14 illustrates examples of services that benefit from the context-sensitive personal preference service.



FIG. 15 illustrates possible user device configurations for use with the learning system and the context-sensitive personal preference service.





DETAILED DESCRIPTION

Preferred embodiments of the invention capture user preferences for a single user or a family of users based on historical observations of the users' activities and by use of a statistical learning model (also referred to as a learning engine). In an illustrative embodiment, the users of the family are members of a household using a single interface device. The learning model identifies a signature, or set of signatures, for the entire household as a stochastic signature. This stochastic signature is used to predict future activities in which members of the family may engage. For the sake of simplicity, the description will refer to signatures and activities of a single user, when in fact, the same applies to a group of users using a single interface device.


One benefit of the learning engine is to enhance the user's experience by increasing the accuracy of the results of a user's query for content and organizing these results so as to put the most likely desired content items at the top of the results list. This increases the user's speed and efficiency in accessing the content he or she desires. In addition, the signatures can be used to identify clusters of users with similar interests for use with collaborative filtering techniques. The learning system can be one aspect of an overall system that provides content and selectable actions to the user, or the learning system can be a standalone system that monitors the user's actions across multiple systems and provides learned user preferences to external systems based on the context of the user's interaction with a particular external system.


Information Captured by Signatures


The stochastic signature is a probabilistic model, and it is an identity that can be used for validation, prediction, etc. While this type of signature can be used to identify a user according to his or her preferences, it is distinguished from a unique signature that a user may employ to authenticate an item or guarantee the source of the item, for example. A stochastic signature may be created based on various types of activities, for example, watching television programs, dialing telephone numbers, listening to music, etc. Thus, embodiments of the invention are useful in a wide variety of contexts. In applications where there are multiple dataspaces to be searched, the system will use a collection of stochastic signatures, with a set of signatures related to each dataspace. For example, when searching a personal address book, calendar, or airline schedules, the system can use the set of signatures related to that particular dataspace. In addition, the system can also learn correlated activities across dataspaces. This allows the system to learn how the user interacts with related dataspaces and use that learned behavior to personalize the presentation of content to the user. However, for the sake of simplicity, certain embodiments of the invention will be described with reference to a single dataspace interface, e.g., a television system interface.


In the context of a user device with limited input capability, for example, a television remote control, the stochastic signature is particularly useful because it can be difficult and/or time consuming to enter queries on such a limited input device. The stochastic signature increases the likelihood that the desired search results will be found based on limited query input. For example, if a particular household has generally watched a certain program at a given time in the past, stochastic signatures can be used to predict that the household will watch the program at the given time in the future. Thus, instead of requiring a member of the household to enter the title of the program, the learning system can predict that the member wishes to watch the program based on only a few button presses.


Embodiments of the present invention build on techniques, systems and methods disclosed in earlier filed applications, including but not limited to U.S. patent application Ser. No. 11/136,261, entitled Method and System For Performing Searches For Television Programming Using Reduced Text Input, filed on May 24, 2005, U.S. patent application Ser. No. 11/246,432, entitled Method And System For Incremental Search With Reduced Text Entry Where The Relevance Of Results Is A Dynamically Computed Function of User Input Search String Character Count, filed on Oct. 7, 2005, and U.S. patent application Ser. No. 11/235,928, entitled Method and System For Processing Ambiguous, Multiterm Search Queries, filed on Sep. 27, 2005, the contents of which are hereby incorporated by reference. Those applications taught specific ways to perform incremental searches using ambiguous text input and methods of ordering the search results. The present techniques, however, are not limited to systems and methods disclosed in the incorporated patent applications. Thus, while reference to such systems and applications may be helpful, it is not believed necessary to understand the present embodiments or inventions.



FIG. 1 shows the architecture of an illustrative learning engine 100. There are two distinct modules to learning engine 100—a data collection module 105 and a signature computation module 110. Data collection module 105 monitors the user activity for channel tuning, DVR recording, etc. and captures the relevant statistics of the activity, for example, the duration a TV channel was watched, as well as the genres and microgenres (discussed below) of the program that was watched. In the case of a mobile device, additional information is collected, such as the type of dataspace being visited (e.g., phone book, calendar, and downloadable media content), geographic location of the mobile device, etc. Data collection module 105 can reside in a client device, where it gathers data about the users' activities and sends this data to signature computation module 110. In the alternative, data collection module 105 can reside on a remote server that serves content to the client device. In this case, the remote server collects data about the content requested by the users and passes this data to computation module 110.


As mentioned, the learning engine gathers information about channels, genres, and microgenres that the user has watched. Herein, the term “channel” refers to a tunable entity in a television system. A channel can be identified by its name (CBS, ABC, CNBC, etc.).


The term “genre” refers to the overall theme of an item. In some systems, every retrievable item is categorized into a genre. The collection of genres is system-definable and can be as coarse or as fine-grained as necessary. In addition, the genres can be defined independent of the retrievable items and can be defined ahead of populating a content system with retrievable items. In one implementation, a function g(x) returns a subset of the set of genres for a given item. Thus, g(x) is a function with a domain space of a set of retrievable items and a range space of the set of all subsets of genres. This is so because any retrievable item may belong to more than one genre; e.g., the movie Sleepless in Seattle has the genres of movie and romance.
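
As a minimal sketch, g(x) can be modeled as a lookup from items to genre subsets; the table entries are illustrative assumptions:

# Sketch of the genre function g(x): retrievable item -> subset of genres.
GENRES = {
    "Sleepless in Seattle": {"movie", "romance"},
    "Monday Night Football": {"sports", "football"},
}

def g(item):
    # Any retrievable item may belong to more than one genre.
    return GENRES.get(item, set())

print(g("Sleepless in Seattle"))  # {'movie', 'romance'}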


The term “microgenre” refers to a very refined, unambiguous theme or descriptor for a given item. For example, New England Patriots as a search item has a microgenre of NFL Football and genres of football and sports. As with genres, a search item can have multiple microgenres. While genres are “macro” themes, microgenres are “micro”, unambiguous themes; these themes come from descriptive terms and metadata within the search items. Thus, the microgenres for New England Patriots also include Tom Brady. Microgenres are not limited to a set of predetermined descriptors, as are genres in the prior art, but can be any word that describes the item. Whether a particular theme is a genre or microgenre depends on the particular item with which the theme is associated and the configuration of the content system. Thus, microgenres are dynamic and generated “on-the-fly”, while genres are static and system defined.


In dataspaces other than the television content space, the channel, genre, and microgenre approach to characterizing items is modified to reflect the attributes of the content items in that particular dataspace. Thus, for a telephone directory dataspace, the channel statistics are replaced with statistics related to the person or entity called. The genre statistics are replaced by statistics related to the type of entity called, for example, individual or business. The microgenre statistics are replaced by statistics related to key secondary attributes of the item, such as home, office, and mobile telephone numbers, as well as telephone numbers of persons related to the persons called.


Computation module 110 is periodically sent the current day's data and determines the users' signatures. In so doing, computation module 110 combines this current data with historical data using exponential smoothing or other smoothing techniques (discussed below) so that the signatures adapt over time to the users' changing preferences. Computation module 110 also performs other computations involving the signatures, for example, combining the individual signatures to obtain aggregate signatures that predict the viewing preferences of a large collection of individuals or creating signatures that capture the average activity level associated with a particular program (described in greater detail below). In one embodiment of the system, computation module 110 resides in one or more servers, to exploit the computational power of larger processors. However, in some implementations, e.g., where privacy is an issue, computation module 110 may reside in the client device.
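
The text does not fix a particular smoothing formula; the following sketch assumes simple exponential smoothing, with the smoothing factor ALPHA as an illustrative parameter:

# Sketch of exponentially smoothing the current day's activity into the
# historical signature data. ALPHA is an assumed parameter.
ALPHA = 0.1  # weight given to the current day's observations

def smooth(historical, current):
    # Blend today's per-channel activity into the historical values.
    keys = set(historical) | set(current)
    return {k: (1 - ALPHA) * historical.get(k, 0.0) + ALPHA * current.get(k, 0.0)
            for k in keys}

history = {"ABC": 30.0, "CBS": 20.0}
today = {"ABC": 10.0, "NBC": 60.0}
print(smooth(history, today))  # ABC: 28.0, CBS: 18.0, NBC: 6.0 (key order may vary)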


A particular stochastic signature is a normalized vector of probabilities. The probabilities capture the historical patterns of the user's behavior with respect to a particular set of activities. An example of a signature for use with a television system is {(ABC 0.3), (CBS 0.2), (NBC 0.5)}. This signature captures that over a given time period, when the user was watching television, the user watched ABC 30% of the time, CBS 20% of the time, and NBC 50% of the time. The stochastic nature of the signature says that this is a historical average and not an exact number.
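
A short sketch of how such a signature arises from raw activity data, reproducing the {(ABC 0.3), (CBS 0.2), (NBC 0.5)} example (the raw minutes are illustrative):

# Sketch of building a normalized channel signature from watch durations.
def normalize(durations):
    # Convert raw durations into a normalized vector of probabilities.
    total = sum(durations.values())
    return {ch: d / total for ch, d in durations.items()}

minutes_watched = {"ABC": 90, "CBS": 60, "NBC": 150}
print(normalize(minutes_watched))  # {'ABC': 0.3, 'CBS': 0.2, 'NBC': 0.5}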


Because the system captures the user's behavior and preferences across multiple dataspaces, taking into account the geographic location of the user, or the user's client device, the multiple signatures can be represented as a set with three indices. Thus, the convention signature(t, g, s) represents a signature in geographic location g at time t for dataspace s. This allows the system to use different subspace projections to utilize the information contained in the entire set. For example, the system may utilize the user's preferences based on activity across all geographic locations or based on a composite of multiple times for a given dataspace and given location. The composite signature is described in greater detail below.


Although time is obviously a continuous variable, for the purpose of learning the user's preferences and activities, a coarse, or discretized, view of time is used to capture all activity. Thus, the system divides time into discrete quantities and represents time as an integer from one to the number of discrete quantities in a day. For example, time can be divided into the number of minutes in a day, thereby representing time as a number 1 to 1440. In addition, this discrete representation of time can be further subdivided into time slots that encompass multiple minutes, as discussed below. The duration, and therefore the number, of time slots is selected based on the nature of the activity of the particular dataspace. For example, in the television dataspace it is appropriate to divide time into 30-minute time slots to correspond to the program boundaries. In other dataspaces, the coarseness can vary. Although it is not necessary to have the same time division in all dataspaces, the examples set forth below assume identical time slot durations for the sake of simplicity. Similarly, geographic location, though continuous, is discretized and represented by character strings. For example, the geographic location identifiers can be a postal code, a major metropolitan area, or an area of a given size with the latitude and longitude of its center being the location identifier.
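
A minimal sketch of this discretization for the television dataspace, using the 48 half-hour slots discussed below:

# Sketch of mapping a timestamp to a 30-minute time slot index (1..48).
from datetime import datetime

SLOT_MINUTES = 30  # television dataspace slot size

def time_slot(ts):
    # Return the index of the time slot containing the timestamp.
    minute_of_day = ts.hour * 60 + ts.minute
    return minute_of_day // SLOT_MINUTES + 1

print(time_slot(datetime(2021, 4, 20, 10, 15)))  # 21 (covers 10:00-10:29 AM)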


There are many possible collections of signatures that capture the activities of the user or family of users at various levels of granularity. FIG. 2 shows a sample signature hierarchy for the multiple dataspace learning model, with n locations 200, m dataspaces 210, and k time slots 220. At the first level, the figure illustrates activities in each location 200. Within each location 200, the system captures dataspace-specific activities in individual signatures. Inside each dataspace 210, for each time slot 220, the system obtains a unique signature. Finally, the signature hierarchy captures the nature of the activity within the time slot by appropriate keyword 230, genre 240, and microgenre signatures 250 (or equivalent statistics depending on the dataspace, as described above). The illustrative learning system shown in the figure thus has 3nmk signatures in the collection (three types of signatures for each of the n locations, m dataspaces, and k time slots).


The timeslots shown in FIG. 2 can be further divided according to the particular needs of the learning system. Thus, a top-level time slot can have lower level time slots organized beneath the top-level time slot. For example, a top-level time slot can be a day organized into lower-level time slots of an hour or half-hour increments, each having its own collection of signatures. Similarly, the day time slot can have a collection of composite signatures beneath it that aggregate all of the information of the individual time slots for that given day into a single composite signature.



FIG. 3 shows an illustrative example of the organization of a signature collection 300 for the user in the television program dataspace at a single location. At the top level, the signatures are classified into various periodicities for the user, as discussed in greater detail below. The example in FIG. 3 shows a weekday periodicity 305 and a weekend periodicity 310. Within each periodicity, signature collection 300 is further divided into individual time slots 315 with a composite 320 for each day. Within each such division exist three types of signatures: channel 325, genre 330, and microgenre 335. Thus, there is one of each of these three types of signatures for every weekday time slot and every weekend time slot, and one of each for the weekday composite and the weekend composite. Therefore, the system captures the activities performed by the user in this single dataspace and at this single location as defined by the hierarchy present in signature collection 300.


Because activities vary widely in a multiple dataspace environment, the system can capture the user's activities, according to the signature hierarchy, as duration and/or count. In other words, the system can track the amount of time the user spent performing an activity, track the number of times the user performed a particular activity, or record both. For example, if the system is modeling a DVR recording activity or DVD ordering activity, there is no duration measure associated with it. Thus, in these cases, the system will capture the intensity of the activities by the count (frequency) of the activities. Other activities have duration as a natural measure of intensity (e.g., watching a television program), while still others have both count and duration as natural measures of intensity (e.g., placing a telephone call and conducting the call). To be inclusive of all activities, the system models every activity by both count and duration. Thus, there are two signatures for each keyword, genre, and microgenre division of the hierarchy, and likewise two for each composite. For each time, location, and dataspace, a function ƒ defines the convolution of the two intensity measures into a single signature:

f_tgs: (count, duration) → single measure  (Equation 1)


For the sake of simplicity, this description omits the adjective, count or duration, in referring to signatures, opting for disambiguation based on the context.
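
The text leaves the convolution function f of Equation 1 unspecified; the following sketch assumes a simple weighted combination, with the weight w as an illustrative parameter:

# Sketch of one possible convolution f of count and duration (Equation 1).
def f(count, duration, w=0.5):
    # Combine the two intensity measures into a single measure; w is assumed.
    return w * count + (1 - w) * duration

print(f(count=3, duration=45.0))  # 24.0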


In one embodiment of the invention, signatures capture the television viewing activity of the family of users in a single geographic location, and these signatures are used to identify and organize television program search results. The learning engine divides a day into time slots. For the purposes of this example, there are 48 time slots in a 24-hour day. Thus, one time slot corresponds to the smallest length TV-program, i.e., 30 minutes. In other implementations, time slots may be larger or smaller, for example, in a system using stochastic signatures to identify a user's telephone calling preferences, the time slots may be two to three hours long. During each time slot, the user activity is recorded and the learning system creates a time slot signature for each time slot. In addition, at the end of each day, the learning system creates a composite signature based on the data collected across all time slots within the current day. The signature is said to be composite in that it represents a user's activity across multiple time slots. As discussed in greater detail below, the learning system uses smoothing techniques to create evolving signatures that retain activities in the distant past as well as the most recent activities.


The day is divided into time slots because each family of users has a recurring viewing behavior based on the time of day. Thus, the learning system learns television-viewing preferences from the past behavior during a given time slot. Any queries for television content that originate in that time slot on a later day can use these preferences to identify and organize content by using that time slot's signature.


For example, in an illustrative family of three—husband, wife, and a child—mornings are taken up by soap operas and talk shows; afternoons are taken up by cartoons and children's programming; and evenings are taken up by business, movies, and prime time shows. During these periods, it is likely that queries for current television content also relate to the corresponding past viewing behavior. Thus, signatures that capture this past behavior are used to identify and organize content consistent with this past behavior. However, for more aggregate behavior, independent of time slots, it is desirable to have a coarse-grain view of the day's activity in the household. The time slot activity is aggregated into a day's activity; this is the basis of the composite signature. Thus, at the end of each day, the system has collected and aggregated 49 different signatures: 48 individual time slot signatures and one composite signature.


Composite signatures serve two purposes. First, if the family of users has a slight time-drift in their behavior (e.g., some days a particular user watches a given program at 10:00 AM, while other days at 10:45 AM), the time slot signatures may get shifted by one slot. However, the composite will still capture the overall viewing behavior correctly. Second, a particular user may time-shift deliberately by many time slots. The composite signatures will also correctly capture this behavior.


User Periodicity


The above example implicitly assumes that the user has a recurring behavior with a periodicity of a day. However, the learning system may utilize other periodicities, as explained below. As mentioned above, one benefit of the learning system is to enhance the user's experience by increasing the accuracy of the results of a user's query for content and organizing these results so as to put the most likely desired content items at the top of the results list. This increases the user's speed and efficiency in accessing the content he or she desires.


Towards this end, the learning system infers the periodicity of activities. For example, as discussed above, there is a daily periodicity of activities. However, the daily periodicity model may not always apply, as occurs during a weekend, during which time the users' activities can be quite different from those during the week. To capture this different behavior pattern, for example, the system will utilize two different periodicities. Thus, the weekday periodicity will contain data for the days during the week, while the weekend periodicity will be empty for those days, and vice versa. This is an example of orthogonal periodicity.


The term orthogonal refers to the fact that the periodicity waveforms are orthogonal; i.e., if f(x) is a periodicity function and g(x) is another periodicity function, then f(x) and g(x) have orthogonal periodicity if

f(x)g(x)=0;0≤x≤∞  (Equation 2)


Equation 2 defines strong orthogonality, or pointwise orthogonality, in contrast with the strict mathematical definition of orthogonality of functions (see F. B. Hildebrand, Introduction to Numerical Analysis, second edition, McGraw-Hill Book Company, New York, 1974, hereby incorporated by reference). FIG. 4 illustrates an example of orthogonal periodicity. The figure shows a variety of waveforms that represent the activity captured by a set of signatures for a particular dataspace in a particular location. The Y-axis is the intensity of activity during a particular day, and the X-axis is the day. A weekday waveform 405 captures the activity during the days of the week (i.e., Monday-Friday), whereas a Saturday waveform 410 and a Sunday waveform 415 capture the activity on Saturday and Sunday, respectively. A solid line shows the weekday periodicity waveform 405; a short dashed line shows the Saturday periodicity waveform 410; and a long dashed line shows the Sunday periodicity waveform 415.



FIG. 4 illustrates that the waveforms are orthogonal: the activity level for the weekday waveform is zero during Saturday and Sunday, while the Saturday waveform is zero for all non-Saturday days, and the Sunday waveform is zero for all non-Sunday days. The system captures these orthogonal periodicities by storing the activities in distinct sets of signatures, with one set of signatures for each orthogonal period. As explained above, the set can include both individual time slot signatures as well as a composite signature for the entire orthogonal period. When a user query is submitted within a particular period, the corresponding set of signatures is used in identifying and organizing the search results.


Although the above example is in terms of a week, periodicities can exist within a day or can extend beyond a week. In addition, some periodicities may not be orthogonal. Thus, the system uses a second kind of periodicity, namely overlapping periodicity, to capture this phenomenon.


In overlapping periodicities, the periods overlap; i.e., the same time and day can belong to multiple periods, one having a larger frequency than the other. Thus, the strong orthogonality property of Equation 2 does not apply to overlapping periodicities. FIG. 5 shows an example of overlapping periodicity. In this example, a user watches a recurring program every Wednesday, along with the usual daily programs that she watches. Thus, there is a weekly period 505 with a frequency of once per week and a daily period 510 with a frequency of once per day.


Overlapping periodicities are distinguished by storing the same activities in multiple sets of signatures, one set for each overlapping periodicity. In the example of FIG. 5, the system will store the same Wednesday activity both in daily set 510 and weekly set 505. Notice that weekly set 505 does not contain activities from other days. When a query is submitted on a Wednesday, a union of both signatures is used in identifying and organizing the content results. Both signatures are combined in such a way as to reflect the fact that the weekly signature 505, anchored on Wednesdays, has a greater impact on the query results than does daily signature 510.


As mentioned above, the learning system defines periodicities according to the users' behavior. In one illustrative implementation, the system compares each recorded user action and determines the periodicity of similar actions by measuring the time that elapses between the similar actions. The similarity can be based on, for example, the user watching the same television channel or the user watching television programs of the same genre or microgenre. Therefore, if a user watches a particular television show on the first Tuesday of every month, the system would capture this as a monthly periodicity. Thus, although the system can use predefined periodicities, the system creates periodicities of any time duration as needed by the particular usage case. As mentioned above, capturing the user's viewing preferences in the television dataspace is only one example of an implementation of the learning system. Other examples include learning the user's dialing preferences in the telephone dataspace or tracking the user's buying behavior in the internet dataspace.
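
One way to sketch this inference is to average the elapsed time between successive similar actions; the timestamps below are illustrative assumptions:

# Sketch of inferring a periodicity from the gaps between similar actions.
from datetime import datetime

def inferred_period_days(timestamps):
    # Average gap, in days, between successive occurrences of a similar action.
    times = sorted(timestamps)
    gaps = [(b - a).days for a, b in zip(times, times[1:])]
    return sum(gaps) / len(gaps) if gaps else None

views = [datetime(2021, 4, 7), datetime(2021, 4, 14), datetime(2021, 4, 21)]
print(inferred_period_days(views))  # 7.0, suggesting a weekly periodicity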


Signatures as Multiple Vectors


As explained above, vectors can be used to capture the family of users' behavior. A vector is defined as a linear array of numbers. Every vector belongs to a vector space. In one embodiment of the learning system, the system only operates in vector spaces in R+^n, defined as

R+^n = {(x1, x2, . . . , xn) | xi ≥ 0 for all i}  (Equation 3)


The dimensionality of the vector space depends on the type of vector. For example, the dimensionality of a channel vector space is the number of channels in the television system. The values in the vector also depend on the type of vector; e.g., it can be duration or count or any other metric deemed appropriate to capture the activity of the family of users. FIG. 6 shows an example of a collection of vectors capturing the users' activity between 10:00 AM and 10:30 AM on a weekday.


The vectors in FIG. 6 correspond to three different vector spaces—channel, genre, and microgenre. The dimensions of these vector spaces are the number of channels in the TV system, the number of genres defined in the learning system, and the number of microgenres dynamically created in the learning system, respectively. Only nonzero values are stored in the various vectors. All other values are implicitly zero and are not stored in the system. Thus, the learning system fundamentally stores all vectors as sparse vectors. The technology of sparse vectors and sparse matrix computations eases the burden of working with large vector spaces (see I. S. Duff, A. M. Erisman, and J. K. Reid, Direct Methods for Sparse Matrices, Monographs on Numerical Analysis, Oxford Science Publications, Clarendon Press, Oxford, 1986, for a description of numerical computations using sparse matrices and sparse vectors, hereby incorporated by reference).
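
A minimal sketch of the sparse storage described above, assuming a dictionary-backed vector where absent entries are implicitly zero:

# Sketch of a sparse vector: only nonzero values are stored.
class SparseVector(dict):
    def __missing__(self, key):
        # Entries not stored are implicitly zero.
        return 0.0

channel_vec = SparseVector({"CBS": 10, "ABC": 5, "CNBC": 15})
print(channel_vec["NBC"])  # 0.0 (implicitly zero, not stored)
print(len(channel_vec))    # 3 nonzero entries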


A channel vector 605 in the figure has nonzero values for channels CBS, ABC, and CNBC. The values correspond to the number of minutes the user watched each channel between 10:00 AM and 10:30 AM. Similarly, the program genres are captured in a genre vector 610. In this example, the CBS and CNBC channels were broadcasting programs of type business and ABC was broadcasting a program of type comedy. Finally, the program microgenres are captured in a microgenre vector 615. In the above example, ABC was broadcasting the comedy show Seinfeld, CNBC was broadcasting the business show Squawkbox, and no microgenre was created for the CBS show.


As previously mentioned, the techniques described above can be implemented in data collection modules and signature computation modules that reside on either a client device or a remote server system. Thus, the channel, genre, and microgenre data can be gathered and processed locally by the client device, or this information can be sent to a remote server system for processing. Likewise, the signatures can reside on a client device or on a remote server system for use as described below.


In addition to capturing the user's activities according to keyword (i.e., channel in the television dataspace context), genre, and microgenre, the system also learns the amount of time the user spends in each dataspace independent of location and time slot. This gives rise to yet another signature: the dataspace fraction signature. The dataspace fraction signature (herein “dfs”) has the coordinates of time and location and is represented by dfs(t, g). The signature dfs(t, g) is a normalized probability vector indicating the fraction of time (and/or activity count) the user spent in various dataspaces. For example, dfs(t, g)[s] contains the value indicating the fraction of time and/or count spent in dataspace s, at time t in location g. This two-coordinate signature is used to reorder the search results space when a search across dataspaces is performed. Meanwhile, as described above, a three-coordinate signature is used to reorder the items within each dataspace, e.g., ks(t, g, s) denotes a keyword signature in time slot t, in location g, and in dataspace s; ks(t, g, s)[x] denotes the value of element x in the keyword signature. Therefore, when the user initiates a search across all dataspaces, the system will reorder content items from the multiple dataspaces according to the user's dataspace preferences based on the information contained in the dataspace fraction signature. If the user performs more actions in one particular dataspace relative to another, the results from the more heavily used dataspace would be promoted over the results from the lesser-used dataspace.


The following example is provided to illustrate this aspect of the learning system. A mobile user visited the telephone dataspace 30 times, the television dataspace 20 times, and the web dataspace 10 times while located in Denver during the 10 AM time slot. During these interactions with the system, the user called Sam, Sally, and Stewart, speaking for 5, 15, and 10 minutes respectively. The user watched a television program entitled “Seinfeld” for 30 minutes. In addition, the user browsed the Google webpage for 10 minutes and the Yahoo! webpage for 20 minutes. Using a count measure for the dataspace fraction signature and duration measures for the television, telephone, and web dataspaces, the keyword signature and dataspace fraction signature ensemble will be as follows:


dfs(10, “Denver”)[“phone-space”] = 0.5
dfs(10, “Denver”)[“TV-space”] = 0.33
dfs(10, “Denver”)[“web-space”] = 0.17
ks(10, “Denver”, “phone-space”)[“Sam”] = 0.17
ks(10, “Denver”, “phone-space”)[“Sally”] = 0.50
ks(10, “Denver”, “phone-space”)[“Stewart”] = 0.33
ks(10, “Denver”, “TV-space”)[“Seinfeld”] = 1.0
ks(10, “Denver”, “web-space”)[“Google”] = 0.33
ks(10, “Denver”, “web-space”)[“Yahoo!”] = 0.67


Thus, if the user enters a text query starting with the letter “S”, all results beginning with the letter “S” would be presented to the user. However, the matching results from the phone-space would be promoted over the results from the TV-space and the web-space because the dataspace fraction signature probability for the phone-space is the greatest. This is so even though the probability for the lone TV-space item is greater than any of the phone-space items. Within the phone-space, the individual items would be sorted according to the keyword signature probability values. Therefore, the entry for “Sally” would be promoted over the other phone-space items. This example clearly shows the Bayesian property of the signatures. That is, the probabilities add up to one, conditioned on the fact that the user is in a particular dataspace (see B. W. Lindgren, G. W. McElrath, D. A. Berry, Introduction to Probability and Statistics, Macmillan publishing co., New York, 1978, herein incorporated by reference, for more details on Bayes's theorem).
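
A sketch of this two-level ordering, reproducing the Denver example (the prefix matching is simplified to a startswith test):

# Sketch of ordering matches by dataspace fraction signature first, then
# by keyword signature probability within each dataspace.
dfs = {"phone-space": 0.5, "TV-space": 0.33, "web-space": 0.17}
ks = {
    "phone-space": {"Sam": 0.17, "Sally": 0.50, "Stewart": 0.33},
    "TV-space": {"Seinfeld": 1.0},
    "web-space": {"Google": 0.33, "Yahoo!": 0.67},
}

def order_results(prefix):
    matches = [(space, item)
               for space, items in ks.items()
               for item in items if item.startswith(prefix)]
    # Promote by dataspace preference, then by in-dataspace probability.
    return sorted(matches, key=lambda m: (dfs[m[0]], ks[m[0]][m[1]]), reverse=True)

print(order_results("S"))
# [('phone-space', 'Sally'), ('phone-space', 'Stewart'),
#  ('phone-space', 'Sam'), ('TV-space', 'Seinfeld')]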


As described above, the signatures associated with a particular dataspace (i.e., keyword, genre, and microgenre signatures) capture the probability of the user performing a future action or desiring a future content item based on past activities and selections that took place within that particular dataspace. Thus, the individual dataspace signatures are conditional signatures in that the probabilities they measure are conditioned upon the user operating in the particular dataspace to which those signatures relate.


The dataspace fraction signature probabilities can be used to weight the individual dataspace signature probabilities to provide a measure of the probability of a user performing a given action outside of a particular dataspace. This operation gives rise to an unconditional signature. The unconditional signature measures the probability of the user performing an action outside of any particular dataspace based on the information contained in the individual dataspace signatures and the information contained in the dataspace fraction signatures. The system uses Equation 4, below, to determine the unconditional keyword signature for an activity “A”. Unconditional signatures for genre and microgenre can be determined in the same way.

uks(t,g,s)[A]=ks(t,g,s)[A]*dfs(t,g)[s]  (Equation 4)


The learning system can organize the various dataspaces, content, and selectable actions into a tree hierarchy, which the user can navigate using the client device. The unconditional signatures are used by the learning system to rearrange the various branches of the tree structure so as to present the most favored content to the user based on the probabilities stored in the signatures. In addition, the unconditional probabilities enable the system to present lists of commingled selectable actions and content items based on the most commonly performed actions and most commonly exhibited preferences. For example, the learning system is capable of creating a “My Favorites” list based on the various signatures, or the system could rearrange a content tree hierarchy in order to reduce the effort required of the user to reach certain preferred content.
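
The unconditional weighting of Equation 4 can be sketched directly; the values reuse the Denver example:

# Sketch of the unconditional keyword signature (Equation 4):
# uks(t,g,s)[A] = ks(t,g,s)[A] * dfs(t,g)[s]
def uks(ks_value, dfs_value):
    return ks_value * dfs_value

print(uks(0.50, 0.5))   # 0.25  ("Sally" in the phone-space)
print(uks(1.0, 0.33))   # 0.33  ("Seinfeld" in the TV-space)
# On this single unconditional scale, items from different dataspaces
# become directly comparable, enabling commingled lists.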


Correlated Activity Signatures


The learning system is also capable of learning correlated activities across dataspaces. A correlated activity is an activity performed in a secondary dataspace while starting from a primary dataspace. In general, by capturing correlated activities across dataspaces, the system is learning not only standalone actions and content preferences, but the system is learning chains of actions performed by the user. For example, a user enters the telephone dataspace of his device to make a telephone call. During the telephone call, the user wishes to enter the calendar dataspace to search for and review the date and time of a particular appointment. In this example, the user remains engaged with the primary dataspace, the telephone dataspace, for the duration of the telephone call. The user also performs a correlated action, the act of searching for an appointment, in the secondary dataspace, which is the calendar dataspace.


The purpose of learning the correlated activities is to achieve better ordering of the search query results in the secondary dataspace based on the correlated activities learned by the system. Thus, the correlated activity signatures provide yet another way to learn the preferences of the user and how the user interacts with his or her client device. This additional set of preferences and learned actions further enhances the user experience.


In general, the system has an activity matrix A that is a square N by N matrix, where N is the number of dataspaces. Each entry in the matrix is a signature vector that captures the actions performed in the secondary dataspace while operating from the primary dataspace. Thus, A is in fact a three-dimensional matrix, which can be defined as follows:

A(i, i)[x] := 0; 1 ≤ i ≤ N, for all items x ∈ dataspace i
A(i, j)[x] := average number of accesses of item x in dataspace j while in dataspace i; 1 ≤ i ≤ N; 1 ≤ j ≤ N; i ≠ j; for all items x ∈ dataspace j  (Equation 5)


The matrix determined by Equation 5 captures the correlated activities of the user, and therefore can be used in accordance with the techniques disclosed herein to predict the probability that the user would perform an action in a secondary dataspace while operating from a primary dataspace. In addition, the correlated activity signatures can be used to determine the unconditional probability of the user accessing a keyword item x in dataspace s, location g, and at time t. The probability determination depends in part on the mode of access utilized by the user. As described in Equation 6 below, if the user is entering dataspace s at the root level of the client device (i.e., the user is not currently in a dataspace), the probability determination is based on the dataspace fraction signature and the relevant signatures for the selected dataspace (e.g., the keyword signature, the genre signature, or the microgenre signature). If the user is entering dataspace s from another dataspace, the probability determination is based on the dataspace fraction signature and the correlated activity matrix A.










Prob[x] = dfs(t, g)[s] * ks(t, g, s)[x], if s is visited at root level;
Prob[x] = Σ_{1≤i≤N} A(i, s)[x] * dfs(t, g)[i], otherwise  (Equation 6)
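
A minimal sketch of Equation 6 follows. The signature values reuse the Denver example, and the correlated-activity matrix entries are illustrative assumptions:

# Sketch of the access-mode-dependent probability of Equation 6.
def prob(x, s, dfs, ks, A=None, at_root=True):
    if at_root:
        # Entering dataspace s from the root level of the client device.
        return dfs[s] * ks[s].get(x, 0.0)
    # Entering dataspace s from another dataspace: use matrix A.
    # A(s, s) is zero by definition, so missing entries contribute nothing.
    return sum(A.get((i, s), {}).get(x, 0.0) * dfs[i] for i in dfs)

dfs = {"phone-space": 0.5, "TV-space": 0.33, "web-space": 0.17}
ks = {"phone-space": {"Sally": 0.50}}
A = {("TV-space", "phone-space"): {"Sally": 0.2},    # assumed values
     ("web-space", "phone-space"): {"Sally": 0.1}}

print(prob("Sally", "phone-space", dfs, ks))                    # 0.25
print(prob("Sally", "phone-space", dfs, ks, A, at_root=False))  # ~0.083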







For the sake of simplicity, the learning system's ability to capture correlated activities was described in terms of a primary and secondary dataspace only. However, the invention is not limited to correlations between only two dataspaces. The learning system can also capture user activities and preferences when the user enters a first dataspace, enters and performs an action in a second dataspace, and enters and performs yet further actions in a third dataspace. In fact, using the principles and techniques described above, the learning system can create N! correlation signatures.


Signature Clustering


As described above, the users' activity and preferences are stored as vectors. Capturing the users' activity and preferences as vectors has many benefits. Among these is the fact that because signatures are vectors, one can easily determine when two signatures are nearly identical. For example, let x̃ and ỹ be two signatures in the same vector space. Lower case letters in bold will generally denote normalized probability vectors of appropriate dimension. A tilde over a vector will generally denote an un-normalized vector. Greek symbols and lower case letters without bold will generally denote scalars. If











vector_angle(x̃, ỹ) = (x̃ᵀ ỹ) / (‖x̃‖ ‖ỹ‖) ≥ (1 − ε)  (Equation 7)







where ε is a small fraction in the vicinity of 0.01, then the two signatures are nearly identical. Equation 7 states that if the cosine of the angle between the two signatures is close enough to one (i.e., the angle between them is small enough), then they are nearly identical, up to a scaling factor. The scaling factor recognizes that two vectors may have different magnitudes but still be overlapping. For example, a first user has a genre vector of {(sports 20); (world news 50)} and a second user has a genre vector of {(sports 60); (world news 150)}, where the first value within the vector elements is the genre and the second value is minutes watched per day. Although the magnitudes of these two vectors are different, the genres and the ratio of minutes of sports to world news are identical. Thus, the learning system identifies these two signatures as nearly identical. The learning system can exploit this aspect in a variety of ways, as described in greater detail below.
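
As a concrete check, the following sketch applies the vector_angle test of Equation 7 to the two genre vectors from the example above:

# Sketch of the near-identity test of Equation 7.
import math

def vector_angle(x, y):
    # Cosine of the angle between two un-normalized signature vectors.
    keys = set(x) | set(y)
    dot = sum(x.get(k, 0.0) * y.get(k, 0.0) for k in keys)
    norm = (math.sqrt(sum(v * v for v in x.values()))
            * math.sqrt(sum(v * v for v in y.values())))
    return dot / norm

user1 = {"sports": 20, "world news": 50}
user2 = {"sports": 60, "world news": 150}   # a scalar multiple of user1
eps = 0.01
print(vector_angle(user1, user2) >= 1 - eps)  # True: nearly identical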


In one illustrative implementation, the system clusters signatures that are similar into subsets. The subsets can then be used for many purposes, such as promotions, targeted advertisements, etc. For example, if several users of a particular cluster have watched a certain television program, this television program will be recommended to other users of the cluster who have yet to view the program. Similarly, outside of the television context, if users of a particular cluster have purchased a given item, ads for this item are presented to the other users of the cluster who have not purchased the item.


The notion of viewing signatures as vectors can be exploited to determine the similarity between the signatures by using Equation 7. Each cluster represents nearly identical signatures. Initially, the procedure starts with singleton clusters, and recursively collapses them until no more merging is possible. An example of pseudo-code that generates clusters of signatures is provided below:
















PROCEDURE
  Inputs:
    1. N signatures s1, s2, . . . , sN
    2. Tolerance threshold ε, 0 ≤ ε ≤ 1.0
  Outputs:
    1. Sets Ψ1, Ψ2, . . . , Ψc containing the signature clusters
  BEGIN
    1. Initially define singleton sets Ωj := {sj}; 1 ≤ j ≤ N
    2. merged := FALSE
    3. for 1 ≤ i ≤ N − 1 do
      a. if set Ωi = Ø, continue
      b. for i + 1 ≤ j ≤ N do
        i. if set Ωj = Ø, continue
        ii. if for every x ∈ Ωi and every y ∈ Ωj,
            vector_angle(x, y) ≥ (1 − ε), then
          A. Ωi := Ωi ∪ Ωj
          B. Ωj := Ø
          C. merged := TRUE
        end_if
      end_for
    end_for
    4. if merged = TRUE, go to step 2.
    5. c := 0
    6. for 1 ≤ i ≤ N do
      a. if Ωi ≠ Ø then
        i. c := c + 1
        ii. Ψc := Ωi
      end_if
    end_for
  END
END_PROCEDURE
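
The following is a runnable Python rendering of the procedure above, under the assumption that signatures are dense lists of floats; the vector_angle test is that of Equation 7:

# Sketch of the signature clustering procedure.
import math

def vector_angle(x, y):
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return dot / (nx * ny)

def cluster_signatures(signatures, eps=0.01):
    clusters = [[s] for s in signatures]   # step 1: singleton clusters
    merged = True
    while merged:                          # steps 2-4: repeat until stable
        merged = False
        for i in range(len(clusters)):
            if not clusters[i]:
                continue
            for j in range(i + 1, len(clusters)):
                if not clusters[j]:
                    continue
                if all(vector_angle(x, y) >= 1 - eps
                       for x in clusters[i] for y in clusters[j]):
                    clusters[i] += clusters[j]   # step 3.b.ii.A: merge
                    clusters[j] = []             # step 3.b.ii.B: empty set
                    merged = True
    return [c for c in clusters if c]      # steps 5-6: collect nonempty sets

sigs = [[1.0, 0.0], [2.0, 0.0], [0.0, 1.0]]
print(cluster_signatures(sigs))
# [[[1.0, 0.0], [2.0, 0.0]], [[0.0, 1.0]]]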









Signature Decomposition


In addition, the learning system can decompose one or more signatures to approximate the number of family members in a signature, the gender of each family member, and the ages of the family members. The system uses a nonlinear optimization model to approximate these values. In this example, the learning system uses genre signatures; similar models apply to channel and microgenre signatures.


The technique starts with historical, human behavioral, statistical data on television viewing habits obtained from generally available data (such as the user viewing preferences available from the DVB Project, see www.dvb.org). In particular, Δ is the set of all genres that are available to the viewing public. Upper case Greek letters in bold font generally denote sets. Thus, a collection of probability distributions exists, namely

f_g^y(t) = probability that genre g would be watched by a person of gender y and age t; y = {male, female}; 0 ≤ t ≤ ∞; g ∈ Δ  (Equation 8)


The learning system also provides a signature s that collectively describes the viewing habits of the household. The illustrative maximum likelihood estimation problem formulated below defines the most likely set of household members that may produce the signature s. For the purposes of this example, all vectors, normalized or unnormalized, are lower case bold letters.


The inputs to the optimization problem are the probability distribution functions f_g^y(t) and a family signature s that collectively represents the activities of a household. The outputs are n, the number of members in the household, where 1 ≤ n; the age and gender of each family member i, where 1 ≤ i ≤ n; and a set of signatures s1, s2, . . . , sn, where signature si corresponds to family member i. Further, let N = |Δ|, the cardinality of the genre set, and let Φ = the set of all nonnegative basis matrices B for the vector space R+^N (i.e., B = [b1, b2, . . . , bN], where each bi is a nonnegative N-vector, the bi, 1 ≤ i ≤ N, are linearly independent, and for any vector s ∈ R+^N,

s = Σ_{i=1}^{N} α_i b_i, with α_i ≥ 0).


The decision variables are as follows: basis matrix B∈Φ, variables x1, x2, . . . , xN, which represent the ages of family members with signatures corresponding to the basis vectors in B, and variables y1, y2, . . . , yN, which represent the gender of the corresponding family members.


For the purpose of optimization, it is necessary to define an objective function to maximize. Towards this end, the system uses an intermediate function, as follows:











s = Σ_{i=1}^{N} α_i b_i, α_i ≥ 0, for any vector s ∈ R+^N and any basis matrix B ∈ Φ  (Equation 9)

h(v, x, y) ≜ Π_{1≤k≤N} f_k^y(x)^{v(k)}, where v(k) is the kth component of vector v  (Equation 10)
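
A sketch of the likelihood function h of Equation 10, with an assumed stand-in table for the distributions f_g^y of Equation 8 (a real system would use age-dependent distributions):

# Sketch of h(v, x, y) from Equation 10.
def f(genre_index, gender, age):
    # Assumed stand-in for f_g^y(t); age dependence is omitted here.
    table = {(0, "male"): 0.6, (1, "male"): 0.2,
             (0, "female"): 0.3, (1, "female"): 0.7}
    return table[(genre_index, gender)]

def h(v, age, gender):
    # Product over components: f_k^y(x) raised to the power v(k).
    result = 1.0
    for k, vk in enumerate(v):
        result *= f(k, gender, age) ** vk
    return result

print(h([2.0, 1.0], age=30, gender="male"))  # 0.6**2 * 0.2**1 = 0.072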







Function h(v, x, y) evaluates the likelihood probability of a person with age x and gender y having a signature v. Note that the system is taking the product of all the components of vector v. Thus, the maximum likelihood estimation becomes










Maximize_{B ∈ Φ} Π_{1≤j≤N} h(α_j b_j, x_j, y_j)  (Equation 11)







subject to the following constraints:










s = Σ_{i=1…N} α_i b_i; α_i ≥ 0  (Equation 12)

1 ≤ x_j; 1 ≤ j ≤ N  (Equation 13)

y_j ∈ {0, 1}; 1 ≤ j ≤ N  (Equation 14)







This optimization problem can be shown to be NP-hard (see M. R. Garey and D. S. Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness, W. H. Freeman and Company, New York, 1979, herein incorporated by reference), since any technique needs to search over the space of all bases in R+^N, and the y variables are integer variables. This problem has some similarities to another non-standard class of optimization problems known in the literature as semi-definite programs. An approximate solution to this problem can be achieved using an illustrative technique described below.


The estimation technique uses a first approximation by converting these discrete variables to continuous variables with bounds, which makes the transformed problem amenable to differentiable optimization. The technique also uses a second approximation by identifying a subset of the set of all bases Φ, namely the set of bases in R+^N that are permutations of the coordinate basis matrix, and restricting the search to this set. Given any basis from this set, the inner iteration involves a steepest-ascent technique in the variables (α_j, x_j, z_j) to obtain a local maximum (where z_j is a continuous approximation of y_j). The iterations are terminated when no improvement in the objective function occurs. After termination, the gender variables that are fractional are rounded/truncated using a rounding heuristic, described below. Given a fixed basis matrix, the transformed maximum likelihood estimation problem becomes a continuous maximum likelihood estimation problem and is given by the following equations:









Maximize Π_{1≤j≤N} h(α_j b_j, x_j, z_j)  (Equation 15)







subject to the following constraints:









s = Σ_{1≤j≤N} α_j b_j  (Equation 16)

1 ≤ x_j; 1 ≤ j ≤ N  (Equation 17)

0 ≤ z_j ≤ 1; 1 ≤ j ≤ N  (Equation 18)

α_j ≥ 0; 1 ≤ j ≤ N  (Equation 19)







An example of pseudo code for solving the above continuous maximum likelihood estimation problem is given below. The pseudo code consists of an outer iteration and an inner iteration. In the outer iteration, the code iterates through basis matrices; in the inner iteration, it employs the steepest-ascent optimization technique to obtain the optimal solution for a given basis matrix.


The steepest-ascent optimization technique has three steps. First, the optimization technique obtains the gradient of the objective function at the current iteration. This is done in step 2.c.ii, set forth below, using difference approximations. The technique of using difference approximations, as well as other numerical and matrix techniques, can be found in D. G. Luenberger, Linear and Nonlinear Programming, second edition, Addison-Wesley Publishing Company, Reading, Mass., 1989, herein incorporated by reference. Second, the optimization technique projects the gradient onto the null space of B to obtain the ascent direction d (step 2.c.iii). Third, the optimization technique obtains the optimal step length along d (step 2.c.iv). In the field of optimization, this is called a step-length computation and involves a one-dimensional optimization. The inner iterations proceed until no more improvement in the objective function is possible. After this, the basis matrix is changed and the inner iterations are reinitiated. Finally, rounding heuristics (such as those in G. L. Nemhauser and L. A. Wolsey, Integer and Combinatorial Optimization, John Wiley & Sons, New York, 1988, herein incorporated by reference) are employed to round off the fractional variables.
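For step 2.c.iii, the orthogonal projection onto the null space of the constraint matrix can be written in closed form. A minimal numpy sketch follows; the stacked Jacobian [B | 0 | 0] reflects that the equality constraint of Equation 16 involves only the α block of the variables (α, x, z), and the full-row-rank assumption on B (a pseudoinverse is used for safety) is an assumption of this sketch, not part of the procedure below.

import numpy as np

def project_to_null_space(g, A):
    """Orthogonal projection of the gradient g onto the null space of A:
    d = (I - A^T (A A^T)^+ A) g, so that A @ d = 0 and a step along d
    keeps the linear constraint A @ vars = s satisfied."""
    P = np.eye(A.shape[1]) - A.T @ np.linalg.pinv(A @ A.T) @ A
    return P @ g

# The constraint of Equation 16 involves only the alpha block of the
# variables (alpha, x, z), so the constraint Jacobian is [B | 0 | 0].
N = 3
B = np.eye(N)                                  # coordinate basis, as in step 1.a
A = np.hstack([B, np.zeros((N, 2 * N))])
g = np.arange(3 * N, dtype=float)              # stand-in gradient from step 2.c.ii
d = project_to_null_space(g, A)
print(np.allclose(A @ d, 0.0))                 # -> True: the direction is feasible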


In the pseudo code set forth below, I is an identity matrix of order N, and Pi is the ith permutation matrix in the sequence of permutations of the index set {1, 2, 3, . . . , N}.














PROCEDURE
 Inputs:
  1. The historical probabilities f_k^y(x)
  2. The family stochastic signature s
 Outputs:
  1. Number of family members
  2. Sex of family members
  3. Age of family members
  4. Individual signatures of family members
 BEGIN
   1. Initialize:
    a. Current basis matrix B := I
    b. Iteration counter i := 0
    c. Permutation matrix P := I
    d. newOuterObj := 0; oldOuterObj := −∞
    e. αj := sj, 1 ≤ j ≤ N
    f. xj := 1, 1 ≤ j ≤ N
    g. zj := 0, 1 ≤ j ≤ N
    h. stopping tolerance for inner iterations ε := 0.001
    i. stopping tolerance for outer iteration β := 0.0001
   2. While ((newOuterObj − oldOuterObj)/|oldOuterObj| > β) Do  // outer iteration
    a. oldOuterObj := newOuterObj
    b. Initialize inner iteration; newInnerObj := 0; oldInnerObj := −∞
    c. while ((newInnerObj − oldInnerObj)/|oldInnerObj| > ε) Do
     i. oldInnerObj := newInnerObj
     ii. Compute the gradient vector g := [∂h/∂αj], [∂h/∂xj], [∂h/∂zj] using difference approximation.
     iii. Project the gradient vector g onto the null space of matrix B to obtain the direction vector d := g⊥B
     iv. Compute the optimal step length δ along the direction d.
     v. [αj, xj, zj] := [αj, xj, zj] + δd
     vi. newInnerObj := Π_{1≤j≤N} h(αj bj, xj, zj)
     endWhile
    d. i := i + 1; set Pi := next permutation matrix in the sequence
    e. B := Pi I Pi^T
    f. oldOuterObj := newOuterObj
    g. newOuterObj := newInnerObj
   endWhile
   3. Use rounding heuristics to set fractional zj variables to the nearest integer value to obtain variables yj.
   4. Compute n := number of αj that are greater than 0.
   5. Output the optimal solution:
    a. Output n as the number of family members
    b. For 1 ≤ j ≤ N Do
     i. if αj > 0 then
      a. output αj bj as the signature of person j
      b. output xj as the age of person j
      c. output yj as the sex of person j
     EndIf
    EndFor
 END
END PROCEDURE









Signature Aging


The learning system also provides for remembering past behavior and integrating it into the current signature. The fundamental reason for remembering the past is to infer a recurring pattern in the user behavior, and to use the inference to aid in future navigation. Exponential smoothing is a way of gradually forgetting the distant past, while giving more relevance to the immediate past. Thus, activities done yesterday are given more relevance than activities done two days ago, which in turn are given more importance than activities done three days ago, and so on (see V. E. Benes, Mathematical Theory of Connecting Networks and Telephone Traffic, Academic Press, New York, 1965, herein incorporated by reference, for additional information). This technique has the added advantage of reducing the amount of computer memory consumed by the signature.


The learning system uses the concept of exponential smoothing in the context of learning the user's activities. For example, a set of activities for today is captured in a signature s, whereas all of the past activities are remembered in a signature s* (s* remembers all of the past, since the recurrence relation, given below, convolutes all of the past into a single signature). At the end of the day (when the present becomes the past), the system updates s* by the recurrence relation

s* = αs + (1 − α)s*, 0 ≤ α ≤ 1  (Equation 20)


In Equation 20, α is called the smoothing parameter, and it controls how much of the past the system remembers and how fast the past decays (the larger the α, the faster the decay). Expanding the above recurrence relation into a recurrence equation illustrates the machinery of exponential smoothing. Where s*(n) denotes the past signature after n days and s(n) represents the activities done during the nth day, Equation 20 expands into










s*(n) = αs(n) + α(1 − α)s(n − 1) + α(1 − α)²s(n − 2) + … + α(1 − α)^(n−1)s(1) + (1 − α)^n αs(0)  (Equation 21)







Because α ≤ 1, Equation 21 clearly shows that less weight is given to the activities of the past. In some embodiments, all signatures for each macro class (channel, genre, microgenre) are smoothed using the exponential smoothing technique. The determination of when to decay a particular signature is based on the dataspace of the signature and the nature of activities performed in the dataspace. For example, in the television dataspace, a decay period of one day is a suitable period because television shows typically reoccur on a daily basis. The decay period for the telephone dataspace, by contrast, would be longer, so that it decays at a slower rate than the television dataspace. The decay parameter, or smoothing parameter α, can be selected to control the degree of decay of the past user behavior.
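A minimal sketch of the end-of-day update in Equation 20 follows; the smoothing parameter α = 0.3 and the two-category signature are illustrative values only.

def age_signature(past, today, alpha=0.3):
    """End-of-day update of Equation 20: s* := alpha*s + (1 - alpha)*s*.
    past is the smoothed historical signature s*; today is the signature s of
    the day's activities; a larger alpha forgets the past faster."""
    return [alpha * s + (1 - alpha) * s_star for s, s_star in zip(today, past)]

s_star = [1.0, 0.0]                   # all past activity in the first category
for _ in range(3):                    # three days of activity in the second
    s_star = age_signature(s_star, [0.0, 1.0])
print([round(v, 3) for v in s_star])  # -> [0.343, 0.657]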


The learning system also uses an adaptive decay technique to integrate the past signature information with the most recent signature information. This adaptive decay technique is based on a hybrid chronological-activity based smoothing and provides improved results over strict chronology-based aging when applied to signatures that take into account the user's geographic location. This technique enables the influence of past activities to decay over time, while still preserving the past information during the user's absence from a particular geographic location for a stretch of time. In general, the past signature will be decayed if (1) a new activity has occurred in the geographic location and (2) the elapsed time since the last signature decay event is greater than a threshold. In essence, the system freezes the signatures when no activity is happening in a given location, effectively stopping time for that location. When next an activity occurs in that location, the system smoothes the signatures based on elapsed time.


If a traditional smoothing technique were employed to decay the memory of the past once per day, for example, the signature values may decay to zero if the user were absent from the geographic location for an extended period. Thus, upon returning to that particular location, the user would effectively have to “retrain” the system by rebuilding the signatures corresponding to that location. The adaptive decay technique avoids this problem.


An illustration of an implementation of the adaptive decay technique follows. As mentioned above, signature decay occurs for all signatures in coordinates (t, g, s) (i.e., time t, geographic location g, and dataspace s) only when there is a new activity in (t, g, s). In addition, a minimum time must elapse before decaying takes place. To account for long elapsed times, the system uses the concept of epoch time. Epoch time is the absolute local time since a certain distant past. The concept of epoch time can be found in current-day operating systems (e.g., Linux and WinCE) that fix a reference point in the distant past and return the elapsed time since that reference point. For the example below, T is the epoch time when some activity x happens in (t, g, s). Note that the coordinate t is an integer denoting the discretized time-of-day or time slot, whereas T is an epoch time. For use in Equation 22 below, β(t, g, s) is the decay threshold for signatures, r(t, g, s) is the last time, in epoch units, that signatures in (t, g, s) were decayed, and e(t, g, s) is a vector capturing a newly performed user action (i.e., the current signature) with a duration/count metric (explained above) in position x and zeros in all other positions. This technique also uses the smoothing parameter α as described above. Equation 22, shown below, is one implementation of the adaptive decay technique.










ks(t, g, s) := { α e(t, g, s)[x] + (1 − α) ks(t, g, s),  if T > r(t, g, s) + β(t, g, s);
                 ks(t, g, s) + (α/(1 − α)) e(t, g, s)[x],  otherwise }  (Equation 22)







Under this implementation, the system decays the signature if the time interval since the last decay is greater than the decay interval; in this case, the system performs a convex combination of the past activity and present activity. If the last decay has occurred more recently than the decay interval, then the historic signature is combined with the current signature, with a multiplier α/(1−α) applied to the current signature. The technique of using this multiplier optimizes storage. Typically, when performing an exponential smoothing operation, the past is the period of time up to time r(t, g, s), and the present is the period of time from time r(t, g, s) to time T. Under a typical application, the new activity x would be stored in a temporary storage, ts(t, g, s), along with all additional subsequent activities, until the time r(t, g, s)+β(t, g, s). At that time, the smoothing formula would combine the past with the new activities according to Equation 23.

ks(t, g, s) = α ts(t, g, s) + (1 − α) ks(t, g, s)  (Equation 23)


The system avoids the need for temporary storage by combining each new activity with the past signature as each new activity occurs, using the multiplier described above to offset what would otherwise be a premature composition. This ensures true exponential smoothing. Although the above discussion involved only the keyword signatures, ks, the same principles and techniques apply to all other signatures described herein.
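The following Python sketch implements the two branches of Equation 22. The dictionary keyed on (t, g, s), the default decay threshold of one day in epoch seconds, and the α value are illustrative assumptions, not prescribed values.

def adaptive_decay_update(ks, last_decay, T, coord, x, amount, alpha=0.3, beta=86400):
    """Equation 22 for a signature stored at coord = (t, g, s). Decay happens
    only when a new activity arrives AND more than beta epoch units have
    passed since the last decay; otherwise the activity is folded in with the
    alpha/(1 - alpha) multiplier so the eventual smoothing remains exact."""
    sig = ks.setdefault(coord, {})
    if T > last_decay.get(coord, 0) + beta:
        # First branch: convex combination of the past signature and e[x].
        for key in sig:
            sig[key] *= (1 - alpha)
        sig[x] = sig.get(x, 0.0) + alpha * amount
        last_decay[coord] = T
    else:
        # Second branch: no decay yet; pre-scale the new activity so a later
        # decay event treats everything since last_decay as "the present".
        sig[x] = sig.get(x, 0.0) + (alpha / (1 - alpha)) * amount
    return sig

ks, last_decay = {}, {}
adaptive_decay_update(ks, last_decay, T=100000, coord=(9, "home", "tv"), x="news", amount=1.0)
print(ks[(9, "home", "tv")])   # -> {'news': 0.3}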


Use of Signatures to Personalize Content


As mentioned above, one illustrative use of the learning system is to enhance the user experience during a search procedure. In one illustrative implementation, the various individual, aggregate, and program signatures reside on a server system that contains a set of content items (e.g., television programs, television channels, movies, etc.). The server system uses the signatures to personalize search results provided to users of the system. In particular, the results obtained through a query are identified and reordered by promoting relevance values of individual search results based on the set of signatures. For example, in a system employing an incremental search method (as described in the above incorporated U.S. Patent Applications), the system begins searching for content item results as the user enters letters of a text query. The system identifies content items as candidates for presentation based on comparing the letters of the text query with descriptive terms associated with the content items. Each of these content items is associated with a base relevance value that measures the popularity of the item in the overall population. The system uses these base relevance values to rank which content items are likely sought by the user. Higher base relevance values indicate a higher overall popularity, thus, these items are assumed to be of more interest to the user than items with lower base relevance values.


However, as explained in greater detail below, the system modifies the base relevance values based on the set of user signatures. Thus, if the set of signatures indicates, given the particular time and day of the search, that it is likely the user is searching for a program with the genre of news, the system will promote the relevance values of programs with a genre of news that match the user's query text. Likewise, the system can use the channel and microgenre data associated with the content items in conjunction with the channel and microgenre signatures to promote the base relevance values. The final relevance weights of each item determine if the item is included in the result set and help determine its position in a list of results. Many different promotion techniques can be implemented; one example is the “ruthless promotion technique”, described below.


The ruthless promotion technique ensures that any particular search result item that has a nonzero probability in a user signature will have its relevance value boosted such that it will be higher than any other search result items having a zero probability value in the same user signature. For use in Equation 24 below, K is the number of search results retrieved with relevance numbers r_1, r_2, . . . , r_K, and M is the maximum value any relevance can have, based on the general popularity of the search result. Typically, search engines assign a relevance number to query results based on ranks with some maximum bound. These ranks can be a measure of the popularity or relevance of the items based on popular opinion. Search results are displayed in the shelf space, sorted in descending order of relevance based on these ranks. (Herein, the phrase "shelf space" refers to the portion of a display screen of a device that displays the search results in response to a user query. This portion can be organized as a column of text boxes in some implementations.) The values p_1^(1), p_2^(1), . . . , p_K^(1) are the channel signature probabilities (0 ≤ p_i^(1) ≤ 1) assigned by the learning system (typically, most of the p_i^(1) will be 0). The superscripts on the probabilities refer to the type of signature, e.g., channel, genre, or microgenre. The ruthless promotion technique computes new relevance numbers r̃_1, r̃_2, . . . , r̃_K as











r̃_i = { (M + 1) e^(p_i^(1)),  p_i^(1) > 0;
         r_i,                  p_i^(1) = 0 }  (Equation 24)







The search items are then reordered using the new relevance numbers. For example, suppose a user has watched the channels "CARTOON NETWORK" and "COMEDY CHANNEL" in the past, with signature probabilities 0.7 and 0.3, respectively. The generic relevance numbers for the channels, based on popular opinion, are 500, 300, 100, and 70 for "CBS", "CNBC", "COMEDY CHANNEL", and "CARTOON NETWORK", respectively, with a maximum bound of 1000. Table 1 and Table 2 show the displayed results and their corresponding relevance values when a query character "C" is typed. Table 1 shows the order of the query results without the benefit of the learning system, and Table 2 shows the order of the results using the ruthless promotion technique. As can be seen, user convenience is enhanced, because fewer scrolls are required to access the most likely watched channels.












TABLE 1

Channel            Relevance Number
CBS                500
CNBC               300
COMEDY CHANNEL     100
CARTOON NETWORK    70


TABLE 2

Channel            Relevance Number
CARTOON NETWORK    2015
COMEDY CHANNEL     1351
CBS                500
CNBC               300
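A short sketch of Equation 24 that reproduces the promoted values in Table 2; the truncation of the promoted relevance to an integer is an assumption made here to match the table.

import math

def ruthless_promote(results, signature_probs, M=1000):
    """Equation 24: any result with a nonzero signature probability p gets
    relevance (M + 1) * e**p, which always exceeds the generic relevance of
    results with zero probability (those are bounded above by M)."""
    promoted = {}
    for item, r in results.items():
        p = signature_probs.get(item, 0.0)
        promoted[item] = int((M + 1) * math.exp(p)) if p > 0 else r
    return sorted(promoted.items(), key=lambda kv: -kv[1])

results = {"CBS": 500, "CNBC": 300, "COMEDY CHANNEL": 100, "CARTOON NETWORK": 70}
signature_probs = {"CARTOON NETWORK": 0.7, "COMEDY CHANNEL": 0.3}
for item, r in ruthless_promote(results, signature_probs):
    print(item, r)   # CARTOON NETWORK 2015, COMEDY CHANNEL 1351, CBS 500, CNBC 300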











Other promotion techniques are within the scope of the invention, including, for example, techniques that do not necessarily ensure that a search result item having a nonzero probability in a user signature will have its relevance value boosted above that of every search result item having a zero probability value in the same user signature. In particular, because there are six signatures capturing the user activity at any time of day (channel, genre, and microgenre for a given time slot, plus their corresponding composite signatures), these signatures can be combined to compute new relevance weights. Equation 24 above shows the use of the channel signature for promotion. In the formula below, there is an inherent importance among these signatures, from more refined to more coarse. This variant of the ruthless promotion technique considers an aggregated promotion formula, as follows:











r̃_i = Σ_{1≤k≤6} (M + 1)^k e^(p_i^(k))  (Equation 25)







In Equation 25, the superscript k on the probabilities p_i^(k) refers, in order of increasing k, to the time-slot channel, microgenre, and genre signatures, followed by the composite channel, microgenre, and genre signatures. Since the terms are multiplied by increasing powers of (M + 1), a natural ordering of relevance importance is implied.


Signatures corresponding to overlapping periodicities are also combined to provide an aggregate signature for a particular time slot. The probabilities in the aggregate signature can be used with the promotion techniques above to identify and rank the results of search. In order to form an aggregate signature, the vectors from the overlapping signatures are added together and renormalized. For example, for a particular time slot, a user has a first signature with a periodicity of every Monday and a second signature with a periodicity of every weekday. The first signature is the genre vector {(0.2 news), (0.8 sports)}; the second signature is the genre vector {(0.1 comedy), (0.4 news), (0.5 sports)}. To form an aggregate signature, the system first arithmetically combines the two vectors to produce the new vector {(0.1 comedy), (0.6 news), (1.3 sports)}, and the system then normalizes the new vector by dividing each numerical element by the sum of the numerical elements of the vector, i.e., 2.0. Thus, the aggregate, normalized genre probability vector of the two overlapping signatures is {(0.05 comedy), (0.3 news), (0.65 sports)}.
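A sketch of this aggregation step, reproducing the worked example above:

def aggregate_signatures(*signatures):
    """Combine overlapping signatures (dicts of category -> probability) by
    adding the vectors and renormalizing so the result sums to 1."""
    combined = {}
    for sig in signatures:
        for genre, p in sig.items():
            combined[genre] = combined.get(genre, 0.0) + p
    total = sum(combined.values())
    return {genre: p / total for genre, p in combined.items()}

monday = {"news": 0.2, "sports": 0.8}                    # periodicity: every Monday
weekday = {"comedy": 0.1, "news": 0.4, "sports": 0.5}    # periodicity: every weekday
aggregate = aggregate_signatures(monday, weekday)
print({g: round(p, 2) for g, p in aggregate.items()})
# -> {'news': 0.3, 'sports': 0.65, 'comedy': 0.05}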


Seminormalization of Signatures


In one implementation, the learning system uses integer arithmetic for relevance promotion computations. In particular, all probabilities are represented as integers, appropriately scaled. One compelling motivation for using integer arithmetic is to make the learning system portable to disparate hardware and operating systems, some of which may lack floating-point arithmetic.


The learning system uses a seminormalization approach to weight more recent activities more heavily in the signatures, while deemphasizing, but still retaining, the information from more distant activities in the signature. Thus, when personalizing services or content provided to the user, the system is more heavily influenced by more recent activities. The basic idea of this seminormalization approach is to make the long-term memory coarse-grained by bucketing the small probabilities that result from less common user preferences and/or preferences captured in the more distant past, while still bounding the range of values by using a small relevance scale factor. This approach allows the system to capture both small and large probabilities without requiring a large dynamic range to define the probability bounds. Thus, a small scaling factor is used to distinguish between the relatively more probable activities captured in the signatures, while the relatively less probable activities are not lost due to truncation errors.


An illustrative technique for converting an unnormalized signature, x, into a seminormalized signature is provided below. In signature x, all elements x_i are nonnegative integers representing the activity intensity. Signature x is an N-dimensional vector, and x has at least one positive element. The value d is the normalizing sum of x, i.e., its 1-norm:

d ≜ Σ_{1≤i≤N} x_i

The vector p is the normalized probability vector corresponding to x; therefore,

p_i = x_i/d; 1 ≤ i ≤ N.






In order to seminormalize the signature x, the system uses a fine-grain memory threshold of probabilities, K, represented in log scale; i.e., K is a positive integer denoting a probability threshold 10^−K. All probabilities ≥ 10^−K will be scaled in fine grain, and all probabilities between 0 and 10^−K will be scaled in coarse grain with bucketing. The system also uses a positive integer, S, as the seminormalization range represented in log scale. After seminormalization, a probability value of 1 is scaled to 10^S, so the largest value in the seminormalized vector is 10^S. Although not required, S can be equal to K. For use in the equations below, let t = 10^−K, l = K + 2, and u = 10^S. Finally, y is the seminormalized vector corresponding to p. Thus, y = f(p, K, S), where f is the function implementing the seminormalization algorithm. The function f is not an invertible function.


Each element i of the seminormalized vector y is defined by Equation 26.










y_i = { 1,  0 ≤ p_i < 10^(−2K);
        v + 2,  10^(−2K+v) ≤ p_i < 10^(−2K+v+1), 0 ≤ v ≤ K − 1;
        (10^K l − u)/(10^K − 1) + [10^K (u − l)/(10^K − 1)] p_i,  10^(−K) ≤ p_i ≤ 1 }  (Equation 26)







The first two parts of Equation 26 define the coarse-grain bucketing, and the last part of the equation defines the fine-grain scaling. FIG. 7 shows a pictorial representation of Equation 26, with the X-axis shown in log scale. In FIG. 7, S = K, and there are K buckets of width 0.1. The buckets start with the bucket having a left boundary of 10^−2K and end with the bucket having a right boundary of 10^−K. There is a special underflow bucket for any probability < 10^−2K. Each p_i falling within a bucket is scaled to a fixed count. For probabilities larger than 10^−K, the p_i is scaled using a linear equation. The slope of the linear scaling equation in the plot is approximately 10^S, with the intercept at (K + 2).
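A direct transcription of Equation 26 into Python follows (kept in floating point with a final rounding for readability; the deployed system would use the scaled integer arithmetic described above). The sample input values are illustrative.

import math

def seminormalize(p, K=4, S=4):
    """Equation 26 applied elementwise to a normalized probability vector p.
    Probabilities below 10**(-2K) fall into the underflow bucket (value 1),
    those between 10**(-2K) and 10**(-K) are bucketed by decade (values
    2 .. K+1), and those at or above 10**(-K) are scaled linearly so that a
    probability of 1 maps to u = 10**S."""
    l, u = K + 2, 10 ** S
    y = []
    for pi in p:
        if pi < 10 ** (-2 * K):                     # underflow bucket
            y.append(1)
        elif pi < 10 ** (-K):                       # coarse-grain buckets
            v = int(math.floor(math.log10(pi))) + 2 * K
            y.append(v + 2)
        else:                                       # fine-grain linear scaling
            yi = ((10**K * l - u) / (10**K - 1)
                  + (10**K * (u - l) / (10**K - 1)) * pi)
            y.append(int(round(yi)))
    return y

print(seminormalize([1e-9, 1e-6, 0.2, 0.8]))        # -> [1, 4, 2004, 8001]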


An example of this approach as applied to an electronic phonebook application on a mobile phone is provided below. In this example, each outgoing call is counted and stored in a raw signature. The system scales the call counts by a large number so that truncation errors in the smoothing and aging process, due to integer arithmetic, are reduced. FIG. 8 illustrates a raw phonebook signature 800 with 6 entries. The row names in signature 800 indicate the person called, and the values are the scaled frequency, after aging and smoothing. Thus, the value 1 represents the long-term memory of a phone call made to John, perhaps many years ago, and not repeated again. Similarly, the entry corresponding to Jane has a signature value of 5. This value can be interpreted two ways: (1) Jane was called as long ago as John, but with a frequency five times greater than John; or (2) Jane was called more recently with the same frequency as John. The larger values represent short-term memories of calls made in the recent past. The normalized probabilities of these events are shown in a probability vector 810.


It is clear the dynamic range of probability vector 810 is quite large. Using the techniques described above, with K=S=4, the system generated a seminormalized vector 820. The system has collapsed the memory of John and Jane into the underflow bucket, thus making them indistinguishable. Some differentiation has been made for Debbie and Naren, although these entries also represent long-term memories, and therefore, the exact difference in frequency or recency of calls is not retained. However, the system captures the precise relative values of Simon and Marty. The values in the seminormalized vector are completely bounded and suitable for stable relevance promotion.


Activity Spike Detection


The learning system is also useful for detecting sudden bursts of activity (i.e., spike detection) at a particular time and day for a particular search item, e.g., a certain television channel. The system can use these spikes of activity to temporarily boost the base relevance values of particular items that have sudden popularity. Typically, spikes happen when an emergency or crisis has happened in the recent past. For example, if a particular news channel is presenting a late-breaking news story that is attracting a high number of viewers, the system will recognize the sudden popularity of the news program and boost its base relevance in recognition of the fact that other users of the system may also be interested in the news story.


In general, when the collective activity level associated with a particular content item is above a certain threshold attributable to statistical variations, the activity level is considered a spike. The learning system analyzes the current and past activity levels by collectively examining all of the signatures of the user population. If each user is considered an independent random variable whose probability of watching a program is encoded in a stochastic signature, then the collection of all these independent random variables provides a measure of the overall popularity of the content item. The system therefore employs these signatures to derive a joint probability distribution of the number of users watching a given program at a given time. Thus a new type of signature, herein a "program signature", r_k(i, t), is defined in Equation 27.

r_k(i, t) = Probability that a program i is being watched by k users at time t; 0 ≤ k ≤ N  (Equation 27)


An example of a technique for obtaining the program signature is provided below. In general, when the activity level associated with a particular content item exceeds a certain inherent randomness value predicted by the program signature, the system identifies such activity as a spike.


The system creates a set of program signatures, each of which is a statistical convolution of all individual signatures in the population that have watched the particular program. By convolving the individual signatures, the system creates an aggregate mean and standard deviation of the activity level associated with the given program. Thus, a program signature captures the fraction of all of the current users interacting with the system that are currently watching the given program. Because the number of users interacting with the system changes over time, the fraction of users watching a particular program changes over time as well. The system captures this information by creating program signatures for the various time slots.


These signatures estimate the mean and higher moments of the probability distribution of the number of people accessing this program in terms of fractional probabilities. The aggregate signature and related statistical measures define the normal level of activity for the particular search item. Thus, by continually monitoring the current level of activity for the particular search item at a given time, the system can detect if the current activity is above or below the normal level. If the activity exceeds a certain threshold, the system adjusts the reordering technique to temporarily boost the relevance of the particular search item to recognize that the item may be of particular interest.


An example of creating an aggregate signature is provided below. For the sake of simplicity, the example is restricted to one day, one time slot, one content item, i, and a single category of signature (e.g., channel, genre, or microgenre). This technique for finding the aggregate signature is applied to all time periods, all days, all search items, and all signatures. In the following example, N is the number of users using the system, qi(j) is the normalized signature value of user j for item i (i.e., the fraction of time user j watched program i) where 1≤j≤N, Ψ is the index set {1, 2, . . . , N}, Φm is the set of subsets of Ψ of size m where 0≤m≤N, and X is a random variable denoting the number of users currently watching program i.


The unnormalized probability that there are m users watching program i, herein rm, is determined by Equation 28.











r_m = Σ_{Θ ∈ Φ_m} Π_{1≤k≤m} q_i(j_k), where Θ = {j_1, j_2, . . . , j_m}  (Equation 28)







The normalization constant, G, is given by Equation 29.









G = Σ_{0≤m≤N} r_m  (Equation 29)







The probability density function of X, fX(m), the mean of X, μX, and the standard deviation of X, σX are now given by the following equations:












f_X(m) = (1/G) r_m; 0 ≤ m ≤ N  (Equation 30)

μ_X = Σ_{0≤m≤N} m f_X(m)  (Equation 31)

σ_X = √( Σ_{0≤m≤N} (m − μ_X)² f_X(m) )  (Equation 32)







The system monitors the number of users watching program i. Chebyshev's inequality dictates that, with 96% confidence, the random variable X cannot be more than 5σ above μ due to inherent randomness. Thus, whenever the number of users watching program i goes beyond μ_X + 5σ_X, the system identifies this as a spike of activity. The system can temporarily boost the base relevance of program i in queries for which program i is a candidate, in recognition of the fact that the user may be interested in the same program. The relevance can be boosted by a predetermined amount, or it may be boosted by an amount that is proportional to the increase in viewing of the program. In addition, the system can use a variety of multiples of σ_X (not only 5σ_X) to determine when a spike of activity is occurring.
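The following sketch illustrates the spike test. It models each user as an independent Bernoulli variable with watch probability q_i(j), as described above, and builds the distribution of X by convolving the users one at a time; note that this standard convolution includes the non-watching factors (1 − q), a simplification relative to the subset-sum formulation of Equations 28 and 29. The population size and probabilities are illustrative.

import math

def program_distribution(q):
    """PMF of X = number of users watching, built by convolving independent
    per-user watch probabilities q[j], one user at a time."""
    pmf = [1.0]
    for qj in q:
        nxt = [0.0] * (len(pmf) + 1)
        for m, pm in enumerate(pmf):
            nxt[m] += pm * (1 - qj)     # user j not watching
            nxt[m + 1] += pm * qj       # user j watching
        pmf = nxt
    return pmf

def is_spike(q, observed, k=5):
    """Flag a spike when the observed viewer count exceeds mu_X + k*sigma_X."""
    pmf = program_distribution(q)
    mu = sum(m * p for m, p in enumerate(pmf))
    sigma = math.sqrt(sum((m - mu) ** 2 * p for m, p in enumerate(pmf)))
    return observed > mu + k * sigma

q = [0.01] * 1000                  # 1000 users, each watching 1% of the time
print(is_spike(q, observed=12))    # -> False: within normal statistical variation
print(is_spike(q, observed=40))    # -> True: an activity spike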


The system can also infer the overall relevance of particular search items using the aggregate signatures. As described above, the system computes the mean of the statistical convolution of N signatures, N being the number of system users. Using this mean value, the system generically reorders the search results even in the absence of a signature for a particular user. Thus, the user benefits from the system's knowledge of the popular opinion of various search items, and these popular opinions are used to identify and order search results for presentation to the user. For example, if the aggregate signature has a large mean for the television program "The Apprentice", then any user who does not have a personal signature will have this item in the top shelf on an appropriate query (the query, for instance, can be "trump", which is a microgenre of the program "The Apprentice").



FIG. 9 illustrates an example of detecting an increased level of activity associated with a content item (i.e., an activity spike). A normal level of activity 905, as determined using the techniques described above is shown. Normal level of activity 905 is based on the aggregate signatures. As the system is being used, a current level of activity 910 is generated using continuously calculated aggregate signatures based on the current content items usage or activity. Upon detecting an increase in activity level 915, which is beyond a specified threshold, the system identifies the content item as having a spike of activity, and the system promotes the ranking of that content item, as described above.


The learning system also allows accessing rare search items using preprocessing. In some implementations described above, the search engines work by first gathering significant amounts of results matching the query and filtering out low relevance results before applying a promotion technique. This technique has several advantages, including increasing the speed of the system and reducing the network bandwidth required. However, a specific user may be interested in an item having low overall popularity that is filtered out of the results before a promotion technique is applied. In the absence of a signature, this rare item may never be presented in the search results (this rare item is sometimes referred to as the "long tail" in the probability distribution sense).


In order to capture the rare item in the ordered search results, some implementations of the system compute the relevance before filtering, using the promotion techniques described above or other promotion techniques. Thus, the rare item is ranked highly for the particular user, allowing him or her to access the item with ease. Here, signatures enable fine-grain customization and increase user satisfaction.


An inherent feature of the stochastic signature mechanism is the probabilistic nature of the signature entries, i.e., the signature entries are all normalized probabilities. This enables the system to export the signatures to other, potentially unrelated systems, with ease. For example, over some period of time, the television system interface described above learns that, in general, a given user prefers the Spirituality genre 50% of the time, Sports genre 40% of the time, and the Seinfeld show 10% of the time. In response, the system creates a set of signatures for the user that captures these preferences. The user can elect to share this signature information with other systems.


Therefore, when the user registers with a website that sells books, the user can elect to share his signature information with this website. Because the signature information is stored in terms of normalized probabilities, the signature can be easily imported into the website that is configured to utilize such probability information. In addition, the website need not have an identical set of genres as that of the television system in order to use the signature information. For example, the website may not have “Seinfeld” defined as a genre or category of books. In this case, the website can simply renormalize the signature by removing the irrelevant entries, i.e., Seinfeld, and determining new normalized probabilities for the remaining genres. Thus, the new normalized probabilities for the user would be 56% for Spirituality and 44% for Sports. Sharing signatures in this way obviates the need for relearning in the new system. Also, different subsets of signatures can be shared for different systems.
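A sketch of the renormalization step just described, reproducing the worked example:

def renormalize_for_export(signature, supported):
    """Drop signature entries the receiving system does not support, then
    renormalize the remaining probabilities so they again sum to 1."""
    kept = {k: v for k, v in signature.items() if k in supported}
    total = sum(kept.values())
    return {k: v / total for k, v in kept.items()}

tv_signature = {"Spirituality": 0.5, "Sports": 0.4, "Seinfeld": 0.1}
exported = renormalize_for_export(tv_signature, {"Spirituality", "Sports"})
print({k: round(v, 2) for k, v in exported.items()})
# -> {'Spirituality': 0.56, 'Sports': 0.44}, i.e., about 56% and 44%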


Signature Based Preference Service


As explained above, the learning system captures the user's preferences across multiple dataspaces. In addition, portions of the learning system can be incorporated into various user client devices, thereby enabling the system to capture the user's preferences across multiple devices. For example, the system can track the user's actions performed on a mobile telephone, a television system, a handheld computer device, and/or a personal computer. This enables the system to provide personalized services to the user across the multiple dataspaces and multiple devices. Thus, user preferences expressed on one device can be used to personalize the user interactions on a different device.


Likewise, the learning system can provide the learned user preferences captured in the various signatures to third-party service providers. The information provided to third-party service providers allows the service providers to personalize the services for the user outside of the learning system. In such an implementation, the learning system determines what preference information to provide to the service providers based on the nature of the services provided. The learning system can provide this information on a per transaction basis, or the system can periodically synchronize a set of signatures stored by the third-party service provider. Furthermore, the user can configure the learning system so as to control which third-party service receives user preference information.


By providing a centralized system that learns and stores the user's preferences, the learning system enables the user to avoid creating disconnected pockets of personalization associated with only one particular dataspace or device. Therefore, a user may immediately leverage the preference information contained in the user's signatures when interacting with a new service rather than having to wait for the new service to learn the user preferences. Thus, the learning system can provide personalization information to the third-party service provider to improve the user's experience with the service.


This comprehensive personalization across diverse user activities and devices is especially helpful to the user when the user interacts with the same service provider using different interface devices. Not only does the learning system capture the user's preferences from these diverse interactions, but the system also stores the details of the user's transactions for later retrieval by the user. For example, a user can book a flight through a travel website using a personal computer. The learning system captures the detailed information associated with the transaction, e.g., the date of travel, the time of the flight, and the departure and destination cities. At a later time, the user wishes to modify the travel reservations and elects to do so using a mobile telephone. Because the system monitors the user's interactions with various service providers, the system recognizes that the user has placed a telephone call to the travel service. In response, the learning system automatically presents the user's upcoming travel itineraries on the user's mobile telephone or sends the information to the travel service's customer service agent with the user's consent.


In the alternative, if the user is presented with an automated voice response system, the learning system can send the relevant itineraries to the travel service (e.g., via an SMS message dispatched to the telephone number called or DTMF tones at the beginning of the telephone call), which would provide the travel service with a background context of the telephone call to improve the automated voice response system's response to the user's voice commands. The power of comprehensive personalization across diverse user activities and devices becomes very evident in voice-based navigation applications. Comprehensive personalization can provide the necessary context to resolve the ambiguities in user input that plague these types of systems today.



FIG. 10 illustrates a part of the learning system for providing a context specific personal preference information service. In a preferred embodiment, a user device 1001a-c solicits a service, on behalf of the user, from a service provider 1002. This can include, for example, making a telephone call to modify a travel itinerary or accessing a search engine to find some information. The context-sensitive personal preference information service 1003 enables the external service provider 1002 to provide a targeted response to the user based on user's prior activity, data access history, and the learned user preferences.


Service provider 1002 can also serve as the source of information and relevance updates for user device 1001a-c. A network 1002 functions as the distribution framework and can be a combination of wired and wireless connections. The navigation devices can have a wide range of interface capabilities and include such devices as a personal or laptop computer 1001a, a hand-held device 1001b (e.g., phone, PDA, or a music/video playback device) with limited display size and an overloaded or small QWERTY keypad, and a television remote control system 1001c, wherein the remote control has an overloaded or small QWERTY keypad. The navigation devices provide user activity data to the learning system via personal preference information service 1003 to create the various signatures. As mentioned above, in alternate embodiments, the user device can create the various signatures, and the signatures can be kept on the device. This enables the device to locally filter and order content items received from service provider 1002 and/or content items that reside on the device itself.


As described above, the learning system captures the user's preferences from the user's interactions with various dataspaces. FIG. 11 illustrates the local tracking and strengthening of the personal preference signatures based on user activity and the content on a mobile device. For example, user interactions with a telephone book 1101, media applications 1102, email/calendar 1103, and web browser 1104 are tracked, as well as when and where each interaction takes place. In addition to the user's interaction with these applications, the content that is coupled with these applications, such as call logs 1101A, music files 1102A, email data/calendar appointments 1103A, and browser cookies 1104A, is also tracked to capture the user's preferences. Aggregated actions and various signatures 1105 are captured by the learning system as described above.


The aggregated data and signatures 1105 are used by a wide variety of services, ranging from a local data prefetching service, in order to improve search performance, to a commercial third-party service provider, in order to target the user for a specific product offering. The sets of signatures generated by the learning system form an onion-like layered structure; the inner layers are specific and capture the exact details of the user's actions, while the outer layers characterize the user's general preferences. For example, the inner layers capture (1) the time and the location where the user performed an action, (2) the nature of the action (e.g., tuning to a channel or the purchase of a book, DVD, or airline ticket), and (3) the details of the action (e.g., the channel and/or program the user tuned to, the title of the book the user ordered, or the departure and destination airports of an airline ticket purchase). This layered structure coincides with the various signatures created by the learning system. The inner layers correspond to the microgenre and keyword signatures, while the outer layers correspond to the genre signatures.


The service provider requesting the user's signature information can designate the degree of specificity of user preferences desired. For example, a video content search engine wishing to use the user's signatures to order the results of a query may request specific information on which channels or programs the user watched. A bookstore, on the other hand, may request broad user preferences of book tastes. The personal signature information sent in the latter case would not be the individual instances of purchased books, but rather the broad user preferences at a genre level.



FIG. 12 illustrates the information flow when a user device 1203 makes a request to a service provider 1201 (step 1). The request contains a unique identifier that identifies the user or the user device. The identity could be an anonymous yet unique identifier of the device. For example, a one-way hash function of the device hardware identifier may be used to uniquely identify the device; there would be no way to reverse map to the actual device that generated the response, given the one-way nature of the hash function. In this case, the personal preference service 1202 has only a set of unique device identifiers and the shared signatures for each user; there would be no identity information beyond the identifiers. In this way, the user's identity is maintained anonymous, yet responses matching the user's preferences can be delivered.


In addition to the substance of the request, the communication from user device 1203 to service provider 1201 contains information that describes the context of the request, as explained below. Service provider 1201 communicates the substance of the request and the additional context information to personal preference service 1202 (step 2). The context information includes the identifier of the user device currently being employed, the location of the user device, if available, the time of the request, and general description of the action the user is performing (e.g., the fact the user is currently using a telephone versus playing media). The additional context information enables personal preference service 1202 to provide context-sensitive personal preference information to service provider 1201. Descriptive tags are assigned to the various actions the user can perform using the system. The system associates these descriptive tags with the signatures that are generated by the corresponding actions. In this way, personal preference service 1202 sends relevant preference information based on the tags sent by user device 1203 to service provider 1201 (step 3).


The relevant personal preference information is used by the service provider 1201 to send a targeted response to the user device 1203 (step 4). Additionally, service provider 1201 sends feedback to personal preference service 1202 about the targeted response that was sent (step 5). This feedback is used by personal preference service 1202 to adjust the personal actions signature of the user.


By disaggregating personal preferences through a standalone entity, i.e., personal preference service 1202, multiple service providers that provide different services can all benefit from the aggregated personal preferences accrued across different service providers, different user actions, and different user devices. The end user gains immensely due to the key benefit of having targeted responses to many different types of requests. For example, a user who purchases books from Amazon.com gets the benefit of a targeted response when he goes to the Barnes & Noble site using the techniques described above.


As described above, personal preference service 1202 can also be a centralized aggregator of user actions across different devices. Thus, user actions performed on different devices, e.g., a mobile computing device, a home television with set-top box, and a personal computer, could all be aggregated to provide user preferences for identifying and ordering search results. For example, a user could initiate a remote recording for a favorite program using a mobile device, where the discovery of the desired program can be made easy by leveraging the user's viewing behavior on a television system. Thus, the available episodes of Seinfeld could be automatically displayed in the mobile device, for example, based on the fact that the user has viewed Seinfeld many times in the past on the television system.



FIG. 13 illustrates the information flow when a user device 1302 makes a request to a service provider 1301. In this scenario, the context sensitive personal preference information is sent along with the request (step 1) to generate a response (step 2). The personal preference data 1303 is locally resident on user device 1302. Additionally, personal preference data 1303 is updated (step 3) based on the response received from service provider 1301.


In another implementation of the learning system, a user device can serve as the personal preference provider in a peer-to-peer fashion for other user devices. For example, in a home entertainment network with more than one DVR (Digital Video Recorder), one DVR can serve as the personal preference provider for another DVR resident in the same home. When the user performs a search for content on a particular DVR, the other DVR in the home provides a personalized preference service to enable the user to find the desired content more quickly by leveraging the prior user viewing habits across the different DVRs.


In addition, a particular user can elect to share his or her signatures with another user. This can be accomplished in a peer-to-peer fashion as described above. In this case, the preferences learned for one user are used to personalize content results for another user. For example, the system will generate a set of signatures for a first user while that user selected various content from a book dataspace. These signatures encode the book reading preferences of the first user. A second user has a similar interest to the first user, and the second user wishes to select books related to similar topics as the first user. In this case, the first user can share his signature with the second user. The system then uses the first user's signatures to personalize the content results for the second user. In this way, the system enables the second user to benefit from the learned preferences of the first user without the second user having to train the system.



FIG. 14 illustrates different services 1401, for example, travel services (airline, car, and hotel), food services, entertainment services, and search engine services, that benefit from the context-sensitive personal preference service 1402. Although each service provider may have its own personalized services, when users first identify themselves, the services have no knowledge of the first-time customer. The techniques disclosed herein increase the likelihood of acquiring and retaining first-time customers by offering targeted services immediately upon first use of the service. The techniques disclosed also enhance the first-time user experience. In contrast, without these techniques, users would have to create an account with a service and build an action history with that service before receiving personalized services.


Using the techniques described above, a user, for example, can go to any travel site and the site, without knowing the user and without requiring him to create an account or log in, can still offer the user personalized services based on the history of prior travel actions the user took on other platforms or web sites. Additionally, for services where comprehensive personalization is not in place, these services can leverage the personal preference service discussed above.


Because the learning system and personal preference service operate across multiple dataspaces and multiple user devices, the user device configuration can vary greatly. FIG. 15 illustrates possible user device configurations for use with the learning system and the context-sensitive personal preference service. In one configuration, a user device 1509 can have multiple output capabilities, for example, a display 1501 and/or a voice output 1502. In addition, user device can have a processor 1503, volatile memory 1504, a text input interface 1505, and/or voice input 1506. Furthermore, user device 1509 can have remote connectivity 1507 to a server through a network and can have persistent storage 1508.


In another user device configuration, user device 1509 may not have local persistent storage 1508. In such a scenario, user device 1509 would have remote connectivity 1507 to submit the user's request to a server and retrieve responses from the server. In yet another configuration of user device 1509, the device may not have remote connectivity 1507. In such a case, the learning system, personalization database, and signatures are locally resident on local persistent storage 1508. Persistent storage 1508 can be a removable storage element, such as an SD, SmartMedia, or CompactFlash card. In a configuration of user device 1509 with remote connectivity 1507 and persistent storage 1508, user device 1509 can use remote connectivity 1507 for a personalization data update or for the case where the personalization database is distributed between local persistent storage 1508 and a server.


It will be appreciated that the scope of the present invention is not limited to the above-described embodiments, but rather is defined by the appended claims, and these claims will encompass modifications of and improvements to what has been described. For example, embodiments have been described in terms of a television content system. However, embodiments of the invention can be implemented on a mobile phone to assist the user in retrieving personal contact information for individuals.

Claims
  • 1. A method comprising: monitoring user interactions with a first plurality of media content items of a first third-party service provider application installed on a mobile device, wherein each media content item of the first plurality of media content items has an associated descriptive term to describe content of the respective media content item;determining a set of media content preferences of the user for the first plurality of media content items of the first application by: receiving a user selection of a media content item of the first plurality of media content items;determining an associated descriptive term of content of the media content item; andidentifying the associated descriptive term of the media content item as a media content preference of the set of media content preferences;determining a subset of the media content preferences of the user that are relevant to a second plurality of media content items of a second third-party service provider application, distinct from the first third-party service provider application; andselecting and presenting, from within the second third-party service provider application, a media content item of the second plurality of media content items of the second third-party service provider application based on the subset of the media content preferences.
  • 2. The method of claim 1, wherein the second application is an application provided by a third-party service provider with respect to a service provider of the mobile device.
  • 3. The method of claim 1, wherein presenting the content item further comprises presenting the content item of the second plurality of content items of the second application in response to receiving incremental input entered by the user for incrementally identifying desired content items of the second plurality of content items of the second application.
  • 4. The method of claim 1, wherein the selecting and presenting of the content item is further based on popularity values associated with each content item of the second plurality of content items, and wherein each popularity value indicates a relative measure of a likelihood that the respective content item is desired by the user.
  • 5. The method of claim 1, wherein a plurality of content items of the second application or the first plurality of content items of the mobile device are contained on at least one of a cable television system, a video-on-demand system, an IPTV system, and a personal video recorder.
  • 6. The method of claim 1, wherein the second application has not characterized the content preferences of the user.
  • 7. The method of claim 1, wherein monitoring the user interactions further comprises monitoring the user interactions with a plurality of applications installed on the mobile device.
  • 8. The method of claim 1, further comprising generating a request for the subset of content preferences from the second application.
  • 9. A non-transitory computer-readable medium having instructions encoded thereon, the instructions comprising:
    instructions for monitoring user interactions with a first plurality of media content items of a first third-party service provider application installed on a mobile device, wherein each media content item of the first plurality of media content items has an associated descriptive term to describe content of the respective media content item;
    instructions for determining a set of media content preferences of the user for the first plurality of media content items of the first application by:
      receiving a user selection of a media content item of the first plurality of media content items;
      determining an associated descriptive term of content of the media content item; and
      identifying the associated descriptive term of the media content item as a media content preference of the set of media content preferences;
    instructions for determining a subset of the media content preferences of the user that are relevant to a second plurality of media content items of a second third-party service provider application, distinct from the first third-party service provider application; and
    instructions for selecting and presenting, from within the second third-party service provider application, a media content item of the second plurality of media content items of the second third-party service provider application based on the subset of the media content preferences.
  • 10. The non-transitory computer-readable medium of claim 9, wherein the first application comprises at least one of a phone application, phone book application, address book application, calendar application, downloadable media content application, television application, travel application, web application, and book application.
  • 11. The non-transitory computer-readable medium of claim 9, wherein the instructions for presenting the content item further comprise instructions for presenting the content item of the second plurality of content items of the second application in response to receiving incremental input entered by the user for incrementally identifying desired content items of the second plurality of content items of the second application.
  • 12. The non-transitory computer-readable medium of claim 9, wherein the instructions for selecting and presenting of the content item further comprise instructions for selecting and presenting of the content item based on popularity values associated with each content item of the second plurality of content items, and wherein each popularity value indicates a relative measure of a likelihood that the respective content item is desired by the user.
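For readers who find code easier to follow than claim language, the sketch below walks through the steps recited in claim 1: learning descriptive terms from the user's selections in a first application, keeping only the terms relevant to a second application's catalog, and selecting an item there. All names (MediaItem, learn_preferences, present_from_second_app) are hypothetical, and the frequency-count scoring is a simplification chosen for illustration, not the patent's learning model.

# A minimal sketch of the claim 1 steps, assuming one descriptive term per item.
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class MediaItem:
    title: str
    descriptive_term: str  # per claim 1, each item carries a descriptive term

def learn_preferences(selections):
    """Count descriptive terms over the user's selections in the first app."""
    return Counter(item.descriptive_term for item in selections)

def present_from_second_app(preferences, second_app_items):
    """Keep only preferences relevant to the second app's catalog, then rank."""
    terms_in_second_app = {i.descriptive_term for i in second_app_items}
    relevant = {t: w for t, w in preferences.items() if t in terms_in_second_app}
    ranked = sorted(second_app_items,
                    key=lambda i: relevant.get(i.descriptive_term, 0),
                    reverse=True)
    return ranked[0] if ranked else None

# Usage: preferences learned in one application inform selection in another.
watched = [MediaItem("Song A", "jazz"), MediaItem("Song B", "jazz"),
           MediaItem("Song C", "news")]
catalog = [MediaItem("Clip 1", "news"), MediaItem("Clip 2", "jazz")]
prefs = learn_preferences(watched)
assert present_from_second_app(prefs, catalog).title == "Clip 2"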
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 13/959,289, entitled Methods and Systems For Selecting and Presenting Content On A First System Based On User Preferences Learned On A Second System, filed Aug. 5, 2013, now U.S. Pat. No. 8,825,576, which is a continuation of U.S. patent application Ser. No. 13/021,086, entitled Methods and Systems For Selecting and Presenting Content On A First System Based On User Preferences Learned On A Second System, filed Feb. 4, 2011, now U.S. Pat. No. 8,543,516, which is a continuation of U.S. patent application Ser. No. 12/882,451, entitled Methods and Systems For Selecting and Presenting Content On A First System Based On User Preferences Learned On A Second System, filed Sep. 15, 2010, now U.S. Pat. No. 7,885,904, which is a continuation of U.S. patent application Ser. No. 11/682,588, entitled Methods and Systems For Selecting and Presenting Content On A First System Based On User Preferences Learned On A Second System, filed Mar. 6, 2007, now U.S. Pat. No. 7,835,998, which claims the benefit of the following applications:

  • U.S. Provisional Application No. 60/779,547, entitled A Framework for Learning User Behavior With Stochastic Signatures, filed Mar. 6, 2006;
  • U.S. Provisional Application No. 60/784,027, entitled A System And Method For Service Solicitation Enhanced With Relevant Personal Context to Elicit Targeted Response, filed Mar. 20, 2006;
  • U.S. Provisional Application No. 60/796,614, entitled A Learning Model For Multiple Dataspaces With Applications To Mobile Environment, filed May 1, 2006; and
  • U.S. Provisional Application No. 60/834,966, entitled Seminormalization Of Signatures For Reducing Truncation Errors And Stabilizing Relevance Promotion, filed Aug. 2, 2006.

The contents of each of the above applications are incorporated by reference herein. This application is related to the following applications, filed on Mar. 6, 2007:

  • U.S. patent application Ser. No. 11/682,693, entitled Methods and Systems For Selecting and Presenting Content Based On Learned Periodicity Of User Content Selection, now U.S. Pat. No. 7,774,294;
  • U.S. patent application Ser. No. 11/682,700, entitled Methods and Systems For Selecting and Presenting Content Based On Dynamically Identifying Microgenres Associated With The Content, now U.S. Pat. No. 7,774,341;
  • U.S. patent application Ser. No. 11/682,689, entitled Methods and Systems For Selecting and Presenting Content Based On Activity Level Spikes Associated With The Content, now U.S. Pat. No. 7,657,526;
  • U.S. patent application Ser. No. 11/682,695, entitled Methods and Systems For Selecting and Presenting Content Based On User Preference Information Extracted From An Aggregate Preference Signature, now U.S. Pat. No. 7,739,280;
  • U.S. patent application Ser. No. 11/682,533, entitled Methods and Systems For Selecting and Presenting Content Based On A Comparison Of Preference Signatures From Multiple Users, now U.S. Pat. No. 8,380,726;
  • U.S. patent application Ser. No. 11/682,596, entitled Methods and Systems For Segmenting Relative User Preferences Into Fine-Grain and Coarse-Grain Collections, now U.S. Pat. No. 7,529,741; and
  • U.S. patent application Ser. No. 11/682,599, entitled Methods and Systems For Selecting and Presenting Content Based On Context Sensitive User Preferences, now U.S. Pat. No. 7,792,815.

US Referenced Citations (1394)
Number Name Date Kind
3440427 Kammer Apr 1969 A
3492577 Reiter et al. Jan 1970 A
3493674 Houghton Feb 1970 A
3729581 Anderson Apr 1973 A
3833757 Kirk, Jr. et al. Sep 1974 A
3891792 Kimura Jun 1975 A
3936868 Thorpe Feb 1976 A
3996583 Hutt et al. Dec 1976 A
4004085 Makino et al. Jan 1977 A
4016361 Pandey Apr 1977 A
4024401 Bernstein et al. May 1977 A
4026555 Kirschner et al. May 1977 A
4031548 Kato et al. Jun 1977 A
4052719 Hutt et al. Oct 1977 A
4058830 Guinet et al. Nov 1977 A
4070693 Shutterly Jan 1978 A
4079419 Siegle et al. Mar 1978 A
4081753 Miller Mar 1978 A
4081754 Jackson Mar 1978 A
4096524 Scott Jun 1978 A
4103524 Mitchell et al. Aug 1978 A
4134127 Campioni Jan 1979 A
4139860 Micic et al. Feb 1979 A
4150254 Schussler et al. Apr 1979 A
4156850 Beyers, Jr. May 1979 A
4161728 Insam Jul 1979 A
4162513 Beyers, Jr. et al. Jul 1979 A
4170782 Miller Oct 1979 A
4186413 Mortimer Jan 1980 A
4189781 Douglas Feb 1980 A
4203130 Doumit et al. May 1980 A
4205343 Barrett May 1980 A
4218698 Bart et al. Aug 1980 A
4228543 Jackson Oct 1980 A
4231031 Crowther et al. Oct 1980 A
4233628 Ciciora Nov 1980 A
4249211 Baba et al. Feb 1981 A
4249213 Imaide et al. Feb 1981 A
4261006 Weintraub et al. Apr 1981 A
4264924 Freeman Apr 1981 A
4264925 Freeman et al. Apr 1981 A
4270145 Farina May 1981 A
4271532 Wine Jun 1981 A
4276597 Dissly et al. Jun 1981 A
4280148 Saxena Jul 1981 A
4283787 Chambers Aug 1981 A
4288809 Yabe Sep 1981 A
4290142 Schnee et al. Sep 1981 A
4300040 Gould et al. Nov 1981 A
4305101 Yarbrough et al. Dec 1981 A
4329684 Monteath et al. May 1982 A
4331974 Cogswell et al. May 1982 A
4337480 Bourassin et al. Jun 1982 A
4337482 Coutta Jun 1982 A
4337483 Guillou Jun 1982 A
4344090 Belisomi et al. Aug 1982 A
4355415 George et al. Oct 1982 A
4367557 Stern et al. Jan 1983 A
4367559 Tults Jan 1983 A
4375651 Templin et al. Mar 1983 A
4381522 Lambert Apr 1983 A
4385210 Marquiss May 1983 A
4388645 Cox et al. Jun 1983 A
4390901 Keiser Jun 1983 A
4393376 Thomas Jul 1983 A
4403285 Kikuchi Sep 1983 A
4405946 Knight Sep 1983 A
4412244 Shanley, II Oct 1983 A
4413281 Thonnart Nov 1983 A
4420769 Novak Dec 1983 A
4425579 Merrell Jan 1984 A
4425581 Schweppe et al. Jan 1984 A
4429385 Cichelli et al. Jan 1984 A
4439784 Furukawa et al. Mar 1984 A
4449249 Price May 1984 A
4456925 Skerlos et al. Jun 1984 A
4466017 Banker Aug 1984 A
4477830 Lindman et al. Oct 1984 A
4488179 Kru/ ger et al. Dec 1984 A
4495654 Deiss Jan 1985 A
4496171 Cherry Jan 1985 A
4496804 Hung Jan 1985 A
4496976 Swanson et al. Jan 1985 A
4510623 Bonneau et al. Apr 1985 A
4520404 Von Kohorn May 1985 A
4523228 Banker Jun 1985 A
4527194 Sirazi Jul 1985 A
4531020 Wechselberger et al. Jul 1985 A
4533910 Sukonick et al. Aug 1985 A
4536791 Campbell et al. Aug 1985 A
4547804 Greenberg Oct 1985 A
4554584 Elam et al. Nov 1985 A
4555755 Kurosawa et al. Nov 1985 A
4555775 Pike Nov 1985 A
4566034 Harger et al. Jan 1986 A
4573072 Freeman Feb 1986 A
4587520 Astle May 1986 A
4595951 Filliman Jun 1986 A
4595952 Filliman Jun 1986 A
4598288 Yarbrough et al. Jul 1986 A
4602279 Freeman Jul 1986 A
4605964 Chard Aug 1986 A
4605973 Von Kohorn Aug 1986 A
4608859 Rockley Sep 1986 A
4611269 Suzuki et al. Sep 1986 A
4620229 Amano et al. Oct 1986 A
4622545 Atkinson Nov 1986 A
4635109 Comeau Jan 1987 A
4635121 Hoffman et al. Jan 1987 A
4641205 Beyers, Jr. Feb 1987 A
4677466 Lert, Jr. et al. Jun 1987 A
4677501 Saltzman et al. Jun 1987 A
4685131 Home Aug 1987 A
4689022 Peers et al. Aug 1987 A
4691351 Hayashi et al. Sep 1987 A
4694490 Harvey et al. Sep 1987 A
4701794 Froling et al. Oct 1987 A
4704725 Harvey et al. Nov 1987 A
4706121 Young Nov 1987 A
4712105 Kohler Dec 1987 A
4714919 Foster Dec 1987 A
4718107 Hayes Jan 1988 A
RE32632 Atkinson Mar 1988 E
4729027 Hakamada et al. Mar 1988 A
4729028 Micic et al. Mar 1988 A
4734769 Davis Mar 1988 A
4745549 Hashimoto May 1988 A
4746983 Hakamada May 1988 A
4748618 Brown et al. May 1988 A
4750036 Martinez Jun 1988 A
4750213 Novak Jun 1988 A
4751578 Reiter et al. Jun 1988 A
4754326 Kram et al. Jun 1988 A
4768228 Clupper et al. Aug 1988 A
4772882 Mical Sep 1988 A
4775935 Yourick Oct 1988 A
4785408 Britton et al. Nov 1988 A
4787063 Muguet Nov 1988 A
4812834 Wells Mar 1989 A
4814883 Perine et al. Mar 1989 A
4821102 Ichikawa et al. Apr 1989 A
4821211 Torres Apr 1989 A
4829558 Welsh May 1989 A
4847604 Doyle Jul 1989 A
4847698 Freeman Jul 1989 A
4847700 Freeman Jul 1989 A
4847744 Araki Jul 1989 A
4855813 Russell et al. Aug 1989 A
4857999 Welsh Aug 1989 A
4862268 Campbell et al. Aug 1989 A
4864429 Eigeldinger et al. Sep 1989 A
4873584 Hashimoto Oct 1989 A
4873623 Lane et al. Oct 1989 A
4876600 Pietzsch et al. Oct 1989 A
4882732 Kaminaga Nov 1989 A
4884223 Ingle et al. Nov 1989 A
4888796 Olivo, Jr. Dec 1989 A
4890168 Inoue et al. Dec 1989 A
4890320 Monslow et al. Dec 1989 A
4890321 Seth-Smith et al. Dec 1989 A
4894789 Yee Jan 1990 A
4899136 Beard et al. Feb 1990 A
4899139 Ishimochi et al. Feb 1990 A
4905094 Pocock et al. Feb 1990 A
4908707 Kinghorn Mar 1990 A
4908713 Levine Mar 1990 A
4908859 Bennett et al. Mar 1990 A
4914517 Duffield Apr 1990 A
4914732 Henderson et al. Apr 1990 A
4918531 Johnson Apr 1990 A
4930158 Vogel May 1990 A
4930160 Vogel May 1990 A
4931783 Atkinson Jun 1990 A
4935865 Rowe et al. Jun 1990 A
4937821 Boulton Jun 1990 A
4937863 Robert et al. Jun 1990 A
4939507 Beard et al. Jul 1990 A
4942391 Kikuta Jul 1990 A
4945563 Horton et al. Jul 1990 A
4954882 Kamemoto Sep 1990 A
4959719 Strubbe et al. Sep 1990 A
4959720 Duffield et al. Sep 1990 A
4963994 Levine Oct 1990 A
4965825 Harvey et al. Oct 1990 A
4977455 Young Dec 1990 A
4987486 Johnson et al. Jan 1991 A
4991011 Johnson et al. Feb 1991 A
4991012 Yoshino Feb 1991 A
4992782 Sakamoto et al. Feb 1991 A
4992940 Dworkin Feb 1991 A
4995078 Monslow et al. Feb 1991 A
4996642 Hey Feb 1991 A
4998171 Kim et al. Mar 1991 A
5003384 Durden et al. Mar 1991 A
5005084 Skinner Apr 1991 A
5008853 Bly et al. Apr 1991 A
5012409 Fletcher et al. Apr 1991 A
5014125 Pocock et al. May 1991 A
5023721 Moon-Hwan Jun 1991 A
5027400 Baji et al. Jun 1991 A
5031045 Kawasaki Jul 1991 A
5036314 Barillari et al. Jul 1991 A
5038211 Hallenbeck Aug 1991 A
5040067 Yamazaki Aug 1991 A
5045947 Beery Sep 1991 A
5046092 Walker et al. Sep 1991 A
5047867 Strubbe et al. Sep 1991 A
5058160 Banker et al. Oct 1991 A
5062060 Kolnick Oct 1991 A
5068733 Bennett Nov 1991 A
5068734 Beery Nov 1991 A
5072412 Henderson, Jr. et al. Dec 1991 A
5075771 Hashimoto Dec 1991 A
5083205 Arai Jan 1992 A
5083800 Lockton Jan 1992 A
5091785 Canfield et al. Feb 1992 A
5093921 Bevins, Jr. Mar 1992 A
5099319 Esch et al. Mar 1992 A
5103314 Keenan Apr 1992 A
5105184 Pirani et al. Apr 1992 A
5109279 Ando Apr 1992 A
5119188 McCalley et al. Jun 1992 A
5119577 Lilly Jun 1992 A
5121476 Yee Jun 1992 A
5123046 Levine Jun 1992 A
5126851 Yoshimura et al. Jun 1992 A
5128766 Choi Jul 1992 A
5134719 Mankovitz Jul 1992 A
5146335 Kim et al. Sep 1992 A
5148154 MacKay et al. Sep 1992 A
5148275 Blatter et al. Sep 1992 A
5151782 Ferraro Sep 1992 A
5151789 Young Sep 1992 A
5152012 Schwob Sep 1992 A
5155591 Wachob Oct 1992 A
5155806 Hoeber et al. Oct 1992 A
5157768 Hoeber et al. Oct 1992 A
5161019 Emanuel Nov 1992 A
5161023 Keenan Nov 1992 A
5162905 Itoh et al. Nov 1992 A
5170388 Endoh Dec 1992 A
5172111 Olivo, Jr. Dec 1992 A
5172413 Bradley et al. Dec 1992 A
5177604 Martinez Jan 1993 A
5179439 Hashimoto et al. Jan 1993 A
5179654 Richards et al. Jan 1993 A
5182646 Keenan Jan 1993 A
5187589 Kono et al. Feb 1993 A
5189630 Barstow et al. Feb 1993 A
5191423 Yoshida et al. Mar 1993 A
5194941 Grimaldi et al. Mar 1993 A
5195092 Wilson et al. Mar 1993 A
5195134 Inoue et al. Mar 1993 A
5200822 Bronfin et al. Apr 1993 A
5200823 Yoneda et al. Apr 1993 A
5204897 Wyman Apr 1993 A
5206722 Kwan Apr 1993 A
5210611 Yee et al. May 1993 A
5212553 Maruoka May 1993 A
5214622 Nemoto et al. May 1993 A
5216515 Steele et al. Jun 1993 A
5220420 Hoarty et al. Jun 1993 A
5223924 Strubbe Jun 1993 A
5227874 Von Kohorn Jul 1993 A
5231493 Apitz Jul 1993 A
5231494 Wachob Jul 1993 A
RE34340 Freeman Aug 1993 E
5233423 Jernigan et al. Aug 1993 A
5233654 Harvey et al. Aug 1993 A
5235415 Bonicel et al. Aug 1993 A
5236199 Thompson, Jr. Aug 1993 A
5237411 Fink et al. Aug 1993 A
5237417 Hayashi et al. Aug 1993 A
5237418 Kaneko Aug 1993 A
5239540 Rovira et al. Aug 1993 A
5241428 Goldwasser et al. Aug 1993 A
5245420 Harney et al. Sep 1993 A
5247347 Litteral et al. Sep 1993 A
5247364 Banker et al. Sep 1993 A
5247580 Kimura et al. Sep 1993 A
5251921 Daniels Oct 1993 A
5252860 McCarty et al. Oct 1993 A
5253066 Vogel Oct 1993 A
5253067 Chaney et al. Oct 1993 A
5260778 Kauffman et al. Nov 1993 A
5260788 Takano et al. Nov 1993 A
5260999 Wyman Nov 1993 A
5283561 Lumelsky et al. Feb 1994 A
5283639 Esch et al. Feb 1994 A
5283819 Glick et al. Feb 1994 A
5285265 Choi et al. Feb 1994 A
5285278 Holman Feb 1994 A
5285284 Takashima et al. Feb 1994 A
5293357 Hallenbeck Mar 1994 A
5296931 Na et al. Mar 1994 A
5297204 Levine Mar 1994 A
5299006 Kim et al. Mar 1994 A
5301028 Banker et al. Apr 1994 A
5307173 Yuen et al. Apr 1994 A
5311423 Clark May 1994 A
5313282 Hayashi May 1994 A
5315392 Ishikawa et al. May 1994 A
5317403 Keenan May 1994 A
5319445 Fitts Jun 1994 A
5323234 Kawasaki Jun 1994 A
5323240 Amano et al. Jun 1994 A
5325183 Rhee Jun 1994 A
5325423 Lewis Jun 1994 A
5335277 Harvey et al. Aug 1994 A
5343239 Lappington et al. Aug 1994 A
5345430 Moe Sep 1994 A
5347167 Singh Sep 1994 A
5347632 Filepp et al. Sep 1994 A
5351075 Herz et al. Sep 1994 A
5353121 Young et al. Oct 1994 A
5355162 Yazolino et al. Oct 1994 A
5357276 Banker et al. Oct 1994 A
5359367 Stockill Oct 1994 A
5359601 Wasilewski et al. Oct 1994 A
5365282 Levine Nov 1994 A
5367316 Ikezaki Nov 1994 A
5367330 Haave et al. Nov 1994 A
5371551 Logan et al. Dec 1994 A
5373288 Blahut Dec 1994 A
5374942 Gilligan et al. Dec 1994 A
5374951 Welsh Dec 1994 A
5377317 Bates et al. Dec 1994 A
5377319 Kitahara et al. Dec 1994 A
5382983 Kwoh et al. Jan 1995 A
5384910 Torres Jan 1995 A
5387945 Takeuchi Feb 1995 A
5389964 Oberle et al. Feb 1995 A
5390027 Henmi et al. Feb 1995 A
5398074 Duffield et al. Mar 1995 A
5404393 Remillard Apr 1995 A
5410326 Goldstein Apr 1995 A
5410343 Coddington et al. Apr 1995 A
5410344 Graves et al. Apr 1995 A
5410367 Zahavi et al. Apr 1995 A
5412720 Hoarty May 1995 A
5416508 Sakuma et al. May 1995 A
5422389 Trepka et al. Jun 1995 A
5424770 Schmelzer et al. Jun 1995 A
5425101 Woo et al. Jun 1995 A
5428406 Terasawa Jun 1995 A
5432561 Strubbe Jul 1995 A
5434625 Willis Jul 1995 A
5434626 Hayashi et al. Jul 1995 A
5434678 Abecassis Jul 1995 A
5436676 Pint et al. Jul 1995 A
5438355 Palmer Aug 1995 A
5438372 Tsumori et al. Aug 1995 A
5440678 Eisen et al. Aug 1995 A
5442389 Blahut et al. Aug 1995 A
5444499 Saitoh Aug 1995 A
5446919 Wilkins Aug 1995 A
5452012 Saitoh Sep 1995 A
5453146 Kemper Sep 1995 A
5453796 Duffield et al. Sep 1995 A
5457478 Frank Oct 1995 A
5459522 Pint Oct 1995 A
5461415 Wolf et al. Oct 1995 A
5465113 Gilboy Nov 1995 A
5465385 Ohga et al. Nov 1995 A
5469206 Strubbe et al. Nov 1995 A
5473442 Kim et al. Dec 1995 A
5477262 Banker et al. Dec 1995 A
5479266 Young et al. Dec 1995 A
5479268 Young et al. Dec 1995 A
5479302 Haines Dec 1995 A
5479497 Kovarik Dec 1995 A
5481296 Cragun et al. Jan 1996 A
5483278 Strubbe et al. Jan 1996 A
5485197 Hoarty Jan 1996 A
5485219 Woo Jan 1996 A
5485221 Banker et al. Jan 1996 A
5485518 Hunter et al. Jan 1996 A
5488409 Yuen et al. Jan 1996 A
5495295 Long Feb 1996 A
5502504 Marshall et al. Mar 1996 A
5515098 Caries May 1996 A
5515106 Chaney et al. May 1996 A
5515511 Nguyen et al. May 1996 A
5517254 Monta et al. May 1996 A
5517257 Dunn et al. May 1996 A
5521589 Mondrosch et al. May 1996 A
5523791 Berman Jun 1996 A
5523794 Mankovitz et al. Jun 1996 A
5523795 Ueda Jun 1996 A
5523796 Marshall et al. Jun 1996 A
5524195 Clanton, III et al. Jun 1996 A
5525795 MacGregor et al. Jun 1996 A
5526034 Hoarty et al. Jun 1996 A
5527257 Piramoon Jun 1996 A
5528304 Cherrick et al. Jun 1996 A
5532735 Blahut et al. Jul 1996 A
5532754 Young et al. Jul 1996 A
5534911 Levitan Jul 1996 A
5537141 Harper et al. Jul 1996 A
5539449 Blahut et al. Jul 1996 A
5539479 Bertram Jul 1996 A
5539822 Lett Jul 1996 A
5541662 Adams et al. Jul 1996 A
5541738 Mankovitz Jul 1996 A
5543933 Kang et al. Aug 1996 A
5544321 Theimer et al. Aug 1996 A
5546521 Martinez Aug 1996 A
5550576 Klosterman Aug 1996 A
5557338 Maze et al. Sep 1996 A
5557721 Fite et al. Sep 1996 A
5557724 Sampat et al. Sep 1996 A
5559548 Davis et al. Sep 1996 A
5559549 Hendricks et al. Sep 1996 A
5559550 Mankovitz Sep 1996 A
5559942 Gough et al. Sep 1996 A
5561471 Kim Oct 1996 A
5561709 Remillard Oct 1996 A
5563665 Chang Oct 1996 A
5568272 Levine Oct 1996 A
5570295 Isenberg et al. Oct 1996 A
5572442 Schulhof et al. Nov 1996 A
5574962 Fardeau et al. Nov 1996 A
5576755 Davis et al. Nov 1996 A
5576951 Lockwood Nov 1996 A
5579055 Hamilton et al. Nov 1996 A
5581479 McLaughlin et al. Dec 1996 A
5582364 Trulin et al. Dec 1996 A
5583560 Florin et al. Dec 1996 A
5583561 Baker et al. Dec 1996 A
5583563 Wanderscheid et al. Dec 1996 A
5583576 Perlman et al. Dec 1996 A
5583653 Timmermans et al. Dec 1996 A
5584025 Keithley et al. Dec 1996 A
5585838 Lawler et al. Dec 1996 A
5585858 Harper et al. Dec 1996 A
5585865 Amano et al. Dec 1996 A
5585866 Miller et al. Dec 1996 A
5589892 Knee et al. Dec 1996 A
5592551 Lett et al. Jan 1997 A
5594490 Dawson et al. Jan 1997 A
5594491 Hodge et al. Jan 1997 A
5594492 O'Callaghan et al. Jan 1997 A
5594509 Florin et al. Jan 1997 A
5594661 Bruner et al. Jan 1997 A
5596373 White et al. Jan 1997 A
5598523 Fujita Jan 1997 A
5600364 Hendricks et al. Feb 1997 A
5600365 Kondo et al. Feb 1997 A
5600366 Schulman Feb 1997 A
5600573 Hendricks et al. Feb 1997 A
5602582 Wanderscheid et al. Feb 1997 A
5602596 Claussen et al. Feb 1997 A
5602597 Bertram Feb 1997 A
5602598 Shintani Feb 1997 A
5602600 Queinnec Feb 1997 A
5604542 Dedrick Feb 1997 A
5606374 Bertram Feb 1997 A
5608448 Smoral et al. Mar 1997 A
5610653 Abecassis Mar 1997 A
5610664 Bobert Mar 1997 A
5617565 Augenbraun et al. Apr 1997 A
5619247 Russo Apr 1997 A
5619249 Billock et al. Apr 1997 A
5619274 Roop et al. Apr 1997 A
5621456 Florin et al. Apr 1997 A
5621579 Yuen Apr 1997 A
5623613 Rowe et al. Apr 1997 A
5625406 Newberry et al. Apr 1997 A
5625464 Compoint et al. Apr 1997 A
5627940 Rohra et al. May 1997 A
5629733 Youman et al. May 1997 A
5630119 Aristides et al. May 1997 A
5631995 Weissensteiner et al. May 1997 A
5632007 Freeman May 1997 A
5633683 Rosengren et al. May 1997 A
5634051 Thomson May 1997 A
5635978 Alten et al. Jun 1997 A
5635979 Kostreski et al. Jun 1997 A
5635989 Rothmuller Jun 1997 A
5636346 Saxe Jun 1997 A
5640501 Turpin Jun 1997 A
5640577 Scharmer Jun 1997 A
5642153 Chaney et al. Jun 1997 A
5648813 Tanigawa et al. Jul 1997 A
5648824 Dunn et al. Jul 1997 A
5650826 Eitz Jul 1997 A
5650831 Farwell Jul 1997 A
5652613 Lazarus et al. Jul 1997 A
5652615 Bryant et al. Jul 1997 A
5654748 Matthews, III Aug 1997 A
5654886 Zereski, Jr. et al. Aug 1997 A
5657072 Aristides et al. Aug 1997 A
5657091 Bertram Aug 1997 A
5657414 Lett et al. Aug 1997 A
5659350 Hendricks et al. Aug 1997 A
5659366 Kerman Aug 1997 A
5659367 Yuen Aug 1997 A
5661516 Caries Aug 1997 A
5661517 Budow et al. Aug 1997 A
5663757 Morales Sep 1997 A
5664111 Nahan et al. Sep 1997 A
5666293 Metz et al. Sep 1997 A
5666498 Amro Sep 1997 A
5666645 Thomas et al. Sep 1997 A
5671276 Eyer et al. Sep 1997 A
5671411 Watts et al. Sep 1997 A
5671607 Clemens et al. Sep 1997 A
5675390 Schindler et al. Oct 1997 A
5675752 Scott et al. Oct 1997 A
5677708 Matthews, III et al. Oct 1997 A
5677981 Kato et al. Oct 1997 A
5682195 Hendricks et al. Oct 1997 A
5682206 Wehmeyer et al. Oct 1997 A
5684525 Klosterman Nov 1997 A
5686954 Yoshinobu et al. Nov 1997 A
5687331 Volk et al. Nov 1997 A
5689648 Diaz et al. Nov 1997 A
5689666 Berquist et al. Nov 1997 A
5692214 Levine Nov 1997 A
5692335 Magnuson Dec 1997 A
5694163 Harrison Dec 1997 A
5694176 Bruette et al. Dec 1997 A
5694381 Sako Dec 1997 A
5696905 Reimer et al. Dec 1997 A
5699107 Lawler et al. Dec 1997 A
5699125 Rzeszewski et al. Dec 1997 A
5699528 Hogan Dec 1997 A
5703604 McCutchen Dec 1997 A
5708478 Tognazzini Jan 1998 A
5710601 Marshall et al. Jan 1998 A
5710815 Ming et al. Jan 1998 A
5710884 Dedrick Jan 1998 A
5715314 Payne et al. Feb 1998 A
5715399 Bezos Feb 1998 A
5717452 Janin et al. Feb 1998 A
5717923 Dedrick Feb 1998 A
5721829 Dunn et al. Feb 1998 A
5722041 Freadman Feb 1998 A
5724091 Freeman et al. Mar 1998 A
5724103 Batchelor Mar 1998 A
5724521 Dedrick Mar 1998 A
5724525 Beyers, II et al. Mar 1998 A
5724567 Rose et al. Mar 1998 A
5727060 Young Mar 1998 A
5727163 Bezos Mar 1998 A
5731844 Rauch et al. Mar 1998 A
5732216 Logan et al. Mar 1998 A
5734444 Yoshinobu Mar 1998 A
5734720 Salganicoff Mar 1998 A
5734853 Hendricks et al. Mar 1998 A
5734893 Li et al. Mar 1998 A
5737028 Bertram et al. Apr 1998 A
5737029 Ohkura et al. Apr 1998 A
5737030 Hong et al. Apr 1998 A
5737552 Lavallee et al. Apr 1998 A
5740231 Cohn et al. Apr 1998 A
5740549 Reilly et al. Apr 1998 A
5745710 Clanton, III et al. Apr 1998 A
5749043 Worthy May 1998 A
5749081 Whiteis May 1998 A
5751282 Girard et al. May 1998 A
5752159 Faust et al. May 1998 A
5752160 Dunn May 1998 A
5754258 Hanaya et al. May 1998 A
5754771 Epperson et al. May 1998 A
5754939 Herz et al. May 1998 A
5757417 Aras et al. May 1998 A
5758257 Herz et al. May 1998 A
5758258 Shoff et al. May 1998 A
5758259 Lawler May 1998 A
5760821 Ellis et al. Jun 1998 A
5761372 Yoshinobu et al. Jun 1998 A
5761601 Nemirofsky et al. Jun 1998 A
5761606 Wolzien Jun 1998 A
5761607 Gudesen et al. Jun 1998 A
5768528 Stumm Jun 1998 A
5771354 Crawford Jun 1998 A
5774170 Hite et al. Jun 1998 A
5774357 Hoffberg et al. Jun 1998 A
5774534 Mayer Jun 1998 A
5774664 Hidary et al. Jun 1998 A
5774887 Wolff et al. Jun 1998 A
5778181 Hidary et al. Jul 1998 A
5778182 Cathey et al. Jul 1998 A
5781226 Sheehan Jul 1998 A
5781228 Sposato Jul 1998 A
5781245 Van Der Weij et al. Jul 1998 A
5781246 Alten et al. Jul 1998 A
5781734 Ohno et al. Jul 1998 A
5784258 Quinn Jul 1998 A
5790198 Roop et al. Aug 1998 A
5790201 Antos Aug 1998 A
5790202 Kummer et al. Aug 1998 A
5790426 Robinson Aug 1998 A
5790753 Krishnamoorthy et al. Aug 1998 A
5790835 Case et al. Aug 1998 A
5790935 Payton Aug 1998 A
5793364 Bolanos et al. Aug 1998 A
5793409 Tetsumura Aug 1998 A
5793438 Bedard Aug 1998 A
5793964 Rogers et al. Aug 1998 A
5793972 Shane et al. Aug 1998 A
5796952 Davis et al. Aug 1998 A
5797011 Kroll et al. Aug 1998 A
5798785 Hendricks et al. Aug 1998 A
5801747 Bedard Sep 1998 A
5801785 Crump et al. Sep 1998 A
5801787 Schein et al. Sep 1998 A
5802284 Karlton et al. Sep 1998 A
5805154 Brown Sep 1998 A
5805155 Allibhoy et al. Sep 1998 A
5805167 van Cruyningen Sep 1998 A
5805235 Bedard Sep 1998 A
5805763 Lawler et al. Sep 1998 A
5805804 Laursen et al. Sep 1998 A
5808608 Young et al. Sep 1998 A
5808694 Usui et al. Sep 1998 A
5809204 Young et al. Sep 1998 A
5809242 Shaw et al. Sep 1998 A
5812123 Rowe et al. Sep 1998 A
5812124 Eick et al. Sep 1998 A
5812205 Milnes et al. Sep 1998 A
5812937 Takahisa et al. Sep 1998 A
5815145 Matthews, III Sep 1998 A
5815671 Morrison Sep 1998 A
5818438 Howe et al. Oct 1998 A
5818439 Nagasaka et al. Oct 1998 A
5818441 Throckmorton et al. Oct 1998 A
5818511 Farry et al. Oct 1998 A
5818541 Matsuura et al. Oct 1998 A
5818935 Maa Oct 1998 A
5819019 Nelson Oct 1998 A
5819156 Belmont Oct 1998 A
5819284 Farber et al. Oct 1998 A
5822123 Davis et al. Oct 1998 A
5825407 Cowe et al. Oct 1998 A
5828402 Collings Oct 1998 A
5828419 Bruette et al. Oct 1998 A
5828420 Marshall et al. Oct 1998 A
5828839 Moncreiff Oct 1998 A
5828945 Klosterman Oct 1998 A
5830068 Brenner et al. Nov 1998 A
5832223 Hara et al. Nov 1998 A
5833468 Guy et al. Nov 1998 A
5835717 Karlton et al. Nov 1998 A
5838314 Neel et al. Nov 1998 A
5838383 Chimoto et al. Nov 1998 A
5838419 Holland Nov 1998 A
5842010 Jain et al. Nov 1998 A
5842199 Miller et al. Nov 1998 A
5844620 Coleman et al. Dec 1998 A
5848352 Dougherty et al. Dec 1998 A
5848396 Gerace Dec 1998 A
5848397 Marsh et al. Dec 1998 A
5850218 LaJoie et al. Dec 1998 A
5851149 Xidos et al. Dec 1998 A
5852437 Wugofski et al. Dec 1998 A
5861881 Freeman et al. Jan 1999 A
5861906 Dunn et al. Jan 1999 A
5862292 Kubota et al. Jan 1999 A
5864823 Levitan Jan 1999 A
5867226 Wehmeyer et al. Feb 1999 A
5867227 Yamaguchi Feb 1999 A
5867228 Miki et al. Feb 1999 A
5870543 Ronning Feb 1999 A
5872588 Aras et al. Feb 1999 A
5873660 Walsh et al. Feb 1999 A
5874985 Matthews, III Feb 1999 A
5875108 Hoffberg et al. Feb 1999 A
5877906 Nagasawa et al. Mar 1999 A
5880768 Lemmons et al. Mar 1999 A
5883621 Iwamura Mar 1999 A
5883677 Hofmann Mar 1999 A
5886691 Furuya et al. Mar 1999 A
5886731 Ebisawa Mar 1999 A
5889950 Kuzma Mar 1999 A
5892498 Marshall et al. Apr 1999 A
5892535 Allen et al. Apr 1999 A
5892536 Logan et al. Apr 1999 A
5892767 Bell et al. Apr 1999 A
5895474 Maarek et al. Apr 1999 A
5899920 DeSatnick et al. May 1999 A
5900867 Schindler et al. May 1999 A
5900905 Shoff et al. May 1999 A
5903314 Niijima et al. May 1999 A
5903545 Sabourin et al. May 1999 A
5903816 Broadwin et al. May 1999 A
5905497 Vaughan et al. May 1999 A
5907322 Kelly et al. May 1999 A
5907323 Lawler et al. May 1999 A
5907366 Farmer et al. May 1999 A
5912664 Eick et al. Jun 1999 A
5914712 Sartain et al. Jun 1999 A
5914746 Matthews, III et al. Jun 1999 A
5915243 Smolen Jun 1999 A
5917481 Rzeszewski et al. Jun 1999 A
5917830 Chen et al. Jun 1999 A
5918014 Robinson Jun 1999 A
5920700 Gordon et al. Jul 1999 A
5923848 Goodhand et al. Jul 1999 A
5929849 Kikinis Jul 1999 A
5929850 Broadwin et al. Jul 1999 A
5929932 Otsuki et al. Jul 1999 A
5930493 Ottesen et al. Jul 1999 A
5931905 Hashimoto et al. Aug 1999 A
5936614 An et al. Aug 1999 A
5936679 Kasahara et al. Aug 1999 A
5937160 Davis et al. Aug 1999 A
5937397 Callaghan Aug 1999 A
5940073 Klosterman et al. Aug 1999 A
5940572 Balaban et al. Aug 1999 A
5940614 Allen et al. Aug 1999 A
5945988 Williams et al. Aug 1999 A
5946386 Rogers et al. Aug 1999 A
5946678 Aalbersberq Aug 1999 A
5947867 Gierer et al. Sep 1999 A
5949954 Young et al. Sep 1999 A
5951642 Onoe et al. Sep 1999 A
5953005 Liu Sep 1999 A
5955988 Blonstein et al. Sep 1999 A
5959592 Petruzzelli Sep 1999 A
5959688 Schein et al. Sep 1999 A
5960411 Hartman et al. Sep 1999 A
5963264 Jackson Oct 1999 A
5963645 Kigawa et al. Oct 1999 A
5969748 Casement et al. Oct 1999 A
5970486 Yoshida et al. Oct 1999 A
5973683 Cragun et al. Oct 1999 A
5974222 Yuen et al. Oct 1999 A
5977964 Williams et al. Nov 1999 A
5978044 Choi Nov 1999 A
5986650 Ellis et al. Nov 1999 A
5987213 Mankovitz et al. Nov 1999 A
5987509 Portuesi Nov 1999 A
5987621 Duso et al. Nov 1999 A
5988078 Levine Nov 1999 A
5990890 Etheredge Nov 1999 A
5990927 Hendricks et al. Nov 1999 A
5991498 Young Nov 1999 A
5991799 Yen et al. Nov 1999 A
5995155 Schindler et al. Nov 1999 A
5997964 Klima, Jr. Dec 1999 A
5999912 Wodarz et al. Dec 1999 A
6002393 Hite et al. Dec 1999 A
6002394 Schein et al. Dec 1999 A
6002444 Marshall et al. Dec 1999 A
6005561 Hawkins et al. Dec 1999 A
6005562 Shiga et al. Dec 1999 A
6005563 White et al. Dec 1999 A
6005565 Legall et al. Dec 1999 A
6005566 Jones et al. Dec 1999 A
6005597 Barrett et al. Dec 1999 A
6005631 Anderson et al. Dec 1999 A
6006218 Breese et al. Dec 1999 A
6006257 Slezak Dec 1999 A
6008802 Iki et al. Dec 1999 A
6008803 Rowe et al. Dec 1999 A
6011546 Bertram Jan 2000 A
6014137 Burns Jan 2000 A
6014184 Knee et al. Jan 2000 A
6014502 Moraes Jan 2000 A
6014638 Burge et al. Jan 2000 A
6016141 Knudson et al. Jan 2000 A
6018372 Etheredge Jan 2000 A
6018768 Ullman et al. Jan 2000 A
6020880 Naimpally Feb 2000 A
6020883 Herz et al. Feb 2000 A
6020929 Marshall et al. Feb 2000 A
6023267 Chapuis et al. Feb 2000 A
6025837 Matthews, III et al. Feb 2000 A
6025886 Koda Feb 2000 A
6028599 Yuen et al. Feb 2000 A
6028600 Rosin et al. Feb 2000 A
6029045 Picco et al. Feb 2000 A
6029176 Cannon Feb 2000 A
6029195 Herz Feb 2000 A
6031806 Tomita Feb 2000 A
6035091 Kazo Mar 2000 A
6035304 MacHida et al. Mar 2000 A
6037933 Blonstein et al. Mar 2000 A
6038367 Abecassis Mar 2000 A
6047317 Bisdikian et al. Apr 2000 A
6049824 Simonin Apr 2000 A
6052145 MacRae et al. Apr 2000 A
6057872 Candelore May 2000 A
6057890 Virden et al. May 2000 A
6061060 Berry et al. May 2000 A
6061082 Park May 2000 A
6061097 Satterfield May 2000 A
6064376 Berezowski et al. May 2000 A
6064980 Jacobi et al. May 2000 A
6067303 Aaker et al. May 2000 A
6067561 Dillon May 2000 A
6072460 Marshall et al. Jun 2000 A
6072982 Haddad Jun 2000 A
6075526 Rothmuller Jun 2000 A
6075551 Berezowski et al. Jun 2000 A
6075575 Schein et al. Jun 2000 A
6078348 Klosterman et al. Jun 2000 A
6081291 Ludwig, Jr. Jun 2000 A
6081750 Hoffberg et al. Jun 2000 A
6081830 Schindler Jun 2000 A
6088722 Herz et al. Jul 2000 A
6088945 Sanderfoot Jul 2000 A
6091883 Artigalas et al. Jul 2000 A
6091884 Yuen et al. Jul 2000 A
RE36801 Logan et al. Aug 2000 E
6098065 Skillen et al. Aug 2000 A
6104705 Ismail et al. Aug 2000 A
6108042 Adams et al. Aug 2000 A
6111614 Mugura et al. Aug 2000 A
6112186 Bergh et al. Aug 2000 A
6115057 Kwoh et al. Sep 2000 A
6118492 Milnes et al. Sep 2000 A
6119098 Guyot et al. Sep 2000 A
6119101 Peckover Sep 2000 A
6122011 Dias et al. Sep 2000 A
6124854 Sartain et al. Sep 2000 A
6125230 Yaginuma Sep 2000 A
6130726 Darbee et al. Oct 2000 A
6133909 Schein et al. Oct 2000 A
6133910 Stinebruner Oct 2000 A
6139177 Venkatraman et al. Oct 2000 A
6141003 Chor et al. Oct 2000 A
6141488 Knudson et al. Oct 2000 A
6147714 Terasawa et al. Nov 2000 A
6147715 Yuen et al. Nov 2000 A
6151059 Schein et al. Nov 2000 A
6151643 Cheng et al. Nov 2000 A
6154203 Yuen et al. Nov 2000 A
6154752 Ryan Nov 2000 A
6154771 Rangan et al. Nov 2000 A
6155001 Marin Dec 2000 A
6157411 Williams et al. Dec 2000 A
6157413 Hanafee et al. Dec 2000 A
6160545 Eyer et al. Dec 2000 A
6160546 Thompson et al. Dec 2000 A
6160570 Sitnik Dec 2000 A
6160989 Hendricks et al. Dec 2000 A
6163316 Killian Dec 2000 A
6163345 Noguchi et al. Dec 2000 A
6166778 Yamamoto et al. Dec 2000 A
6167188 Young et al. Dec 2000 A
6169542 Hooks et al. Jan 2001 B1
6172674 Etheredge Jan 2001 B1
6172677 Stautner et al. Jan 2001 B1
6173271 Goodman et al. Jan 2001 B1
6175362 Harms et al. Jan 2001 B1
6177931 Alexander et al. Jan 2001 B1
6178446 Gerszberg et al. Jan 2001 B1
6181335 Hendricks et al. Jan 2001 B1
6184877 Dodson et al. Feb 2001 B1
6185360 Inoue et al. Feb 2001 B1
6186287 Heidenreich et al. Feb 2001 B1
6186443 Shaffer Feb 2001 B1
6191780 Martin et al. Feb 2001 B1
6195501 Perry et al. Feb 2001 B1
6201536 Hendricks et al. Mar 2001 B1
6202023 Hancock et al. Mar 2001 B1
6202058 Rose et al. Mar 2001 B1
6202212 Sturgeon et al. Mar 2001 B1
6208335 Gordon et al. Mar 2001 B1
6208799 Marsh et al. Mar 2001 B1
6209129 Carr et al. Mar 2001 B1
6209130 Rector, Jr. et al. Mar 2001 B1
6212553 Lee et al. Apr 2001 B1
6216264 Maze et al. Apr 2001 B1
6219839 Sampsell Apr 2001 B1
6226447 Sasaki et al. May 2001 B1
6233389 Barton et al. May 2001 B1
6237145 Narasimhan et al. May 2001 B1
6237146 Richards et al. May 2001 B1
6239794 Yuen et al. May 2001 B1
6240555 Shoff et al. May 2001 B1
6253203 O'Flaherty et al. Jun 2001 B1
6256071 Hiroi Jul 2001 B1
6256785 Klappert et al. Jul 2001 B1
6257268 Hope et al. Jul 2001 B1
6262721 Tsukidate et al. Jul 2001 B1
6262772 Shen et al. Jul 2001 B1
6263501 Schein et al. Jul 2001 B1
6263507 Ahmad et al. Jul 2001 B1
6268849 Boyer et al. Jul 2001 B1
6275268 Ellis et al. Aug 2001 B1
6275648 Knudson et al. Aug 2001 B1
6279157 Takasu Aug 2001 B1
6282713 Kitsukawa et al. Aug 2001 B1
6285713 Nakaya et al. Sep 2001 B1
6286140 Ivanyi Sep 2001 B1
6289346 Milewski et al. Sep 2001 B1
6298482 Seidman et al. Oct 2001 B1
6311877 Yang Nov 2001 B1
6312336 Handelman et al. Nov 2001 B1
6317885 Fries Nov 2001 B1
6320588 Palmer et al. Nov 2001 B1
6323911 Schein et al. Nov 2001 B1
6323931 Fujita et al. Nov 2001 B1
6324338 Wood et al. Nov 2001 B1
6326982 Wu et al. Dec 2001 B1
6327418 Barton Dec 2001 B1
6331877 Bennington et al. Dec 2001 B1
6334022 Ohba et al. Dec 2001 B1
6335722 Tani et al. Jan 2002 B1
6335963 Bosco Jan 2002 B1
6341195 Mankovitz et al. Jan 2002 B1
6341374 Schein et al. Jan 2002 B2
6342926 Hanafee et al. Jan 2002 B1
6357042 Srinivasan et al. Mar 2002 B2
6357043 Ellis et al. Mar 2002 B1
6359636 Schindler et al. Mar 2002 B1
6363525 Dougherty et al. Mar 2002 B1
6366890 Usrey Apr 2002 B1
6370526 Agrawal et al. Apr 2002 B1
6373528 Bennington et al. Apr 2002 B1
6381582 Walker et al. Apr 2002 B1
6388714 Schein et al. May 2002 B1
6389593 Yamagishi May 2002 B1
6392710 Gonsalves et al. May 2002 B1
6396546 Alten et al. May 2002 B1
6400407 Zigmond et al. Jun 2002 B1
6405371 Oosterhout et al. Jun 2002 B1
6408437 Hendricks et al. Jun 2002 B1
6411308 Blonstein et al. Jun 2002 B1
6411696 Iverson et al. Jun 2002 B1
6412110 Schein et al. Jun 2002 B1
6418556 Bennington et al. Jul 2002 B1
6421067 Kamen et al. Jul 2002 B1
6426779 Noguchi et al. Jul 2002 B1
6437836 Huang et al. Aug 2002 B1
6438579 Hosken Aug 2002 B1
6438752 McClard Aug 2002 B1
6441832 Tao et al. Aug 2002 B1
6442332 Knudson et al. Aug 2002 B1
6446261 Rosser Sep 2002 B1
6453471 Klosterman Sep 2002 B1
RE37881 Haines Oct 2002 E
6463585 Hendricks et al. Oct 2002 B1
6469753 Klosterman et al. Oct 2002 B1
6470497 Ellis et al. Oct 2002 B1
6473559 Knudson et al. Oct 2002 B1
6477579 Kunkel et al. Nov 2002 B1
6477705 Yuen et al. Nov 2002 B1
6480667 O'Connor Nov 2002 B1
6486892 Stern Nov 2002 B1
6486920 Arai et al. Nov 2002 B2
6487362 Yuen et al. Nov 2002 B1
6487541 Aggarwal et al. Nov 2002 B1
6493876 DeFreese et al. Dec 2002 B1
6498895 Young et al. Dec 2002 B2
6499138 Swix et al. Dec 2002 B1
6505348 Knowles et al. Jan 2003 B1
6507953 Horlander et al. Jan 2003 B1
6515680 Hendricks et al. Feb 2003 B1
6516323 Kamba Feb 2003 B1
6530082 Del Sesto et al. Mar 2003 B1
6539548 Hendricks et al. Mar 2003 B1
6542169 Marshall et al. Apr 2003 B1
6545722 Schultheiss et al. Apr 2003 B1
6546556 Kataoka et al. Apr 2003 B1
6564005 Berstis May 2003 B1
6564170 Halabieh May 2003 B2
6564213 Ortega May 2003 B1
6564378 Satterfield et al. May 2003 B1
6564379 Knudson et al. May 2003 B1
6567892 Horst et al. May 2003 B1
6567982 Howe et al. May 2003 B1
6571390 Dunn et al. May 2003 B1
6574424 Dimitri et al. Jun 2003 B1
6588013 Lumley et al. Jul 2003 B1
6600364 Liang et al. Jul 2003 B1
6600503 Stautner et al. Jul 2003 B2
6601074 Liebenow Jul 2003 B1
6606128 Hanafee et al. Aug 2003 B2
6611842 Brown Aug 2003 B1
6611958 Shintani et al. Aug 2003 B1
6614987 Ismail et al. Sep 2003 B1
6622304 Carhart Sep 2003 B1
6622306 Kamada Sep 2003 B1
6631523 Matthews, III et al. Oct 2003 B1
6637029 Maissel et al. Oct 2003 B1
6640337 Lu Oct 2003 B1
6651251 Shoff et al. Nov 2003 B1
6660503 Kierulff Dec 2003 B2
6661468 Alten et al. Dec 2003 B2
6665869 Ellis et al. Dec 2003 B1
6670971 Oral et al. Dec 2003 B1
6675386 Hendricks et al. Jan 2004 B1
6678706 Fishel Jan 2004 B1
6681396 Bates et al. Jan 2004 B1
6687906 Yuen et al. Feb 2004 B1
6698020 Zigmond et al. Feb 2004 B1
6704931 Schaffer et al. Mar 2004 B1
6714917 Eldering et al. Mar 2004 B1
6718324 Edlund et al. Apr 2004 B2
6718551 Swix et al. Apr 2004 B1
6721954 Nickum Apr 2004 B1
6727914 Gutta Apr 2004 B1
6728967 Bennington et al. Apr 2004 B2
6732369 Schein et al. May 2004 B1
6738978 Hendricks et al. May 2004 B1
6742183 Reynolds et al. May 2004 B1
6744967 Kaminski et al. Jun 2004 B2
6751800 Fukuda et al. Jun 2004 B1
6754904 Cooper et al. Jun 2004 B1
6756987 Goyins et al. Jun 2004 B2
6756997 Ward, III et al. Jun 2004 B1
6757906 Look et al. Jun 2004 B1
6760537 Mankovitz Jul 2004 B2
6760538 Bumgardner et al. Jul 2004 B1
6766100 Komar et al. Jul 2004 B1
6771317 Ellis et al. Aug 2004 B2
6771886 Mendelsohn Aug 2004 B1
6788882 Geer et al. Sep 2004 B1
6792618 Bendinelli et al. Sep 2004 B1
6799326 Boylan, III et al. Sep 2004 B2
6799327 Reynolds et al. Sep 2004 B1
6820278 Ellis Nov 2004 B1
6828993 Hendricks et al. Dec 2004 B1
6837791 McNutt et al. Jan 2005 B1
6847387 Roth Jan 2005 B2
6850693 Young et al. Feb 2005 B2
6857131 Yagawa et al. Feb 2005 B1
6865746 Herrington et al. Mar 2005 B1
6868551 Lawler et al. Mar 2005 B1
6898762 Ellis et al. May 2005 B2
6920278 Yano et al. Jul 2005 B1
6920281 Agnibotri et al. Jul 2005 B1
6925035 Ueki Aug 2005 B2
6934964 Schaffer et al. Aug 2005 B1
6938208 Reichardt Aug 2005 B2
6947922 Glance Sep 2005 B1
6973621 Sie et al. Dec 2005 B2
6973665 Dudkiewicz et al. Dec 2005 B2
6973669 Daniels Dec 2005 B2
6981040 Konig et al. Dec 2005 B1
6983478 Grauch et al. Jan 2006 B1
6985188 Hurst, Jr. Jan 2006 B1
7003792 Yuen Feb 2006 B1
7007294 Kurapati Feb 2006 B1
7017118 Carroll Mar 2006 B1
7017179 Asamoto et al. Mar 2006 B1
7024424 Platt et al. Apr 2006 B1
7027716 Boyle et al. Apr 2006 B1
7028326 Westlake et al. Apr 2006 B1
7029935 Negley et al. Apr 2006 B2
7039935 Knudson et al. May 2006 B2
7047550 Yasukawa et al. May 2006 B1
7058635 Shah-Nazaroff et al. Jun 2006 B1
7065709 Ellis et al. Jun 2006 B2
7069576 Knudson et al. Jun 2006 B1
7073187 Hendricks et al. Jul 2006 B1
7088910 Potrebic et al. Aug 2006 B2
7096486 Ukai et al. Aug 2006 B1
7100185 Bennington et al. Aug 2006 B2
7117518 Takahashi et al. Oct 2006 B1
7143430 Fingerman et al. Nov 2006 B1
7165098 Boyer et al. Jan 2007 B1
7181128 Wada et al. Feb 2007 B1
7185355 Ellis et al. Feb 2007 B1
7187847 Young et al. Mar 2007 B2
7200859 Perlman et al. Apr 2007 B1
7209640 Young et al. Apr 2007 B2
7209915 Taboada et al. Apr 2007 B1
7218839 Plourde, Jr. et al. May 2007 B2
7229012 Enright et al. Jun 2007 B1
7229354 McNutt et al. Jun 2007 B2
7243139 Ullman et al. Jul 2007 B2
7266833 Ward, III et al. Sep 2007 B2
7287267 Knudson et al. Oct 2007 B2
7293276 Phillips et al. Nov 2007 B2
7328450 Macrae et al. Feb 2008 B2
7356246 Kobb Apr 2008 B1
7369749 Ichioka et al. May 2008 B2
7369750 Cheng et al. May 2008 B2
7370342 Ismail et al. May 2008 B2
7392532 White et al. Jun 2008 B2
7398541 Bennington et al. Jul 2008 B2
7403935 Horvitz et al. Jul 2008 B2
7412441 Scott, III et al. Aug 2008 B2
7424510 Gross et al. Sep 2008 B2
7437751 Daniels Oct 2008 B2
7440677 Strasser Oct 2008 B2
7454515 Lamkin et al. Nov 2008 B2
7454772 Fellenstein et al. Nov 2008 B2
7467398 Fellenstein et al. Dec 2008 B2
7477832 Young et al. Jan 2009 B2
7480929 Klosterman et al. Jan 2009 B2
7487528 Satterfield et al. Feb 2009 B2
7487529 Orlick Feb 2009 B1
7493641 Klosterman et al. Feb 2009 B2
7503055 Reynolds et al. Mar 2009 B2
7506350 Johnson Mar 2009 B2
7519268 Juen et al. Apr 2009 B2
7540010 Hanaya et al. May 2009 B2
7574382 Hubert Aug 2009 B1
7577336 Srinivasan et al. Aug 2009 B2
7590993 Hendricks et al. Sep 2009 B1
7599753 Taylor et al. Oct 2009 B2
7603685 Knudson et al. Oct 2009 B2
7634786 Knee et al. Dec 2009 B2
7644054 Garg et al. Jan 2010 B2
7657563 Kim et al. Feb 2010 B2
7664746 Majumder Feb 2010 B2
7665109 Matthews, III et al. Feb 2010 B2
7685620 Fellenstein et al. Mar 2010 B2
7689995 Francis et al. Mar 2010 B1
7693827 Zamir et al. Apr 2010 B2
7707617 Birleson Apr 2010 B2
7725467 Yamamoto et al. May 2010 B2
7770196 Hendricks Aug 2010 B1
7774335 Scofield et al. Aug 2010 B1
7778158 Vogel et al. Aug 2010 B2
7779437 Barton Aug 2010 B2
7793326 McCoskey et al. Sep 2010 B2
7801888 Rao et al. Sep 2010 B2
7823055 Sull et al. Oct 2010 B2
7840577 Ortega et al. Nov 2010 B2
7859571 Brown et al. Dec 2010 B1
7882520 Beach et al. Feb 2011 B2
7895218 Venkataraman et al. Feb 2011 B2
7925141 Geer et al. Apr 2011 B2
7996864 Yuen et al. Aug 2011 B2
8051450 Robarts et al. Nov 2011 B2
8065702 Goldberg et al. Nov 2011 B2
8078751 Janik et al. Dec 2011 B2
8087050 Ellis et al. Dec 2011 B2
8230343 Logan et al. Jul 2012 B2
8265458 Helmstetter Sep 2012 B2
8275764 Jeon et al. Sep 2012 B2
8363679 Sorenson et al. Jan 2013 B2
8478750 Rao et al. Jul 2013 B2
8515954 Gibbs et al. Aug 2013 B2
8613020 Knudson et al. Dec 2013 B2
8635649 Ward, III et al. Jan 2014 B2
8707366 Wong et al. Apr 2014 B2
RE44966 Flinn et al. Jun 2014 E
9075882 Ward et al. Jul 2015 B1
9256685 Zamir et al. Feb 2016 B2
20010001160 Shoff et al. May 2001 A1
20010013009 Greening et al. Aug 2001 A1
20010013122 Hirata Aug 2001 A1
20010013123 Freeman et al. Aug 2001 A1
20010025375 Ahmad et al. Sep 2001 A1
20010027555 Franken et al. Oct 2001 A1
20010027562 Schein et al. Oct 2001 A1
20010028782 Ohno et al. Oct 2001 A1
20010029610 Corvin et al. Oct 2001 A1
20010034237 Garahi Oct 2001 A1
20010042246 Yuen et al. Nov 2001 A1
20010043795 Wood et al. Nov 2001 A1
20010047298 Moore et al. Nov 2001 A1
20010049820 Barton Dec 2001 A1
20010054181 Corvin Dec 2001 A1
20020009283 Ichioka et al. Jan 2002 A1
20020019882 Soejima et al. Feb 2002 A1
20020026496 Boyer et al. Feb 2002 A1
20020035573 Black et al. Mar 2002 A1
20020042913 Ellis et al. Apr 2002 A1
20020042914 Walker et al. Apr 2002 A1
20020042918 Townsend et al. Apr 2002 A1
20020048448 Daniels Apr 2002 A1
20020049973 Alten et al. Apr 2002 A1
20020053078 Holtz et al. May 2002 A1
20020056098 White May 2002 A1
20020057893 Wood et al. May 2002 A1
20020059599 Schein et al. May 2002 A1
20020059602 MacRae et al. May 2002 A1
20020073424 Ward et al. Jun 2002 A1
20020078450 Bennington et al. Jun 2002 A1
20020083439 Eldering Jun 2002 A1
20020090203 Mankovitz Jul 2002 A1
20020092017 Klosterman et al. Jul 2002 A1
20020095676 Knee et al. Jul 2002 A1
20020107853 Hofmann et al. Aug 2002 A1
20020110353 Potrebic et al. Aug 2002 A1
20020112249 Hendricks et al. Aug 2002 A1
20020120925 Logan Aug 2002 A1
20020120933 Knudson et al. Aug 2002 A1
20020124249 Shintani et al. Sep 2002 A1
20020129368 Schlack et al. Sep 2002 A1
20020138840 Schein et al. Sep 2002 A1
20020144279 Zhou Oct 2002 A1
20020147976 Yuen et al. Oct 2002 A1
20020147977 Hammett et al. Oct 2002 A1
20020154888 Allen et al. Oct 2002 A1
20020166119 Cristofalo Nov 2002 A1
20020174230 Gudorf et al. Nov 2002 A1
20020174424 Chang et al. Nov 2002 A1
20020174430 Ellis et al. Nov 2002 A1
20020174433 Baumgartner et al. Nov 2002 A1
20020191954 Beach et al. Dec 2002 A1
20020194596 Srivastava Dec 2002 A1
20020198762 Donato Dec 2002 A1
20020199185 Kaminski et al. Dec 2002 A1
20030005432 Ellis et al. Jan 2003 A1
20030005445 Schein et al. Jan 2003 A1
20030009766 Marolda Jan 2003 A1
20030067554 Klarfeld et al. Apr 2003 A1
20030088873 McCoy et al. May 2003 A1
20030093792 Labeeb et al. May 2003 A1
20030098891 Molander May 2003 A1
20030103088 Dresti et al. Jun 2003 A1
20030110056 Berghofer et al. Jun 2003 A1
20030110494 Bennington et al. Jun 2003 A1
20030110495 Bennington et al. Jun 2003 A1
20030110499 Knudson et al. Jun 2003 A1
20030110500 Rodriguez Jun 2003 A1
20030115599 Bennington et al. Jun 2003 A1
20030115602 Knee et al. Jun 2003 A1
20030118323 Ismail et al. Jun 2003 A1
20030126607 Phillips et al. Jul 2003 A1
20030135490 Barrett et al. Jul 2003 A1
20030145323 Hendricks et al. Jul 2003 A1
20030149988 Ellis et al. Aug 2003 A1
20030163813 Klosterman et al. Aug 2003 A1
20030164858 Klosterman et al. Sep 2003 A1
20030188310 Klosterman et al. Oct 2003 A1
20030188311 Yuen et al. Oct 2003 A1
20030192050 Fellenstein et al. Oct 2003 A1
20030196201 Schein et al. Oct 2003 A1
20030196203 Ellis et al. Oct 2003 A1
20030198462 Bumgardner et al. Oct 2003 A1
20030204847 Ellis et al. Oct 2003 A1
20030206719 Kilonoardner et al. Nov 2003 A1
20030208756 Macrae et al. Nov 2003 A1
20030208758 Schein et al. Nov 2003 A1
20030208759 Gordon et al. Nov 2003 A1
20030210898 Juen et al. Nov 2003 A1
20030212996 Wolzien Nov 2003 A1
20030226144 Thurston et al. Dec 2003 A1
20040003097 Willis Jan 2004 A1
20040003405 Boston et al. Jan 2004 A1
20040003407 Hanafee et al. Jan 2004 A1
20040015397 Barry et al. Jan 2004 A1
20040019907 Li et al. Jan 2004 A1
20040049787 Maissel et al. Mar 2004 A1
20040049794 Shao et al. Mar 2004 A1
20040060063 Russ et al. Mar 2004 A1
20040070594 Burke Apr 2004 A1
20040073918 Ferman et al. Apr 2004 A1
20040073923 Wasserman Apr 2004 A1
20040073924 Pendakur Apr 2004 A1
20040078809 Drazin Apr 2004 A1
20040078814 Allen Apr 2004 A1
20040078815 Lemmons et al. Apr 2004 A1
20040088729 Petrovic et al. May 2004 A1
20040098744 Gutta May 2004 A1
20040103092 Tuzhilin et al. May 2004 A1
20040103434 Ellis May 2004 A1
20040111742 Hendricks et al. Jun 2004 A1
20040111745 Schein et al. Jun 2004 A1
20040128686 Boyer et al. Jul 2004 A1
20040133910 Gordon et al. Jul 2004 A1
20040139465 Matthews et al. Jul 2004 A1
20040156614 Bumgardner et al. Aug 2004 A1
20040160862 Ueki Aug 2004 A1
20040168189 Reynolds et al. Aug 2004 A1
20040181814 Ellis et al. Sep 2004 A1
20040187164 Kandasamy et al. Sep 2004 A1
20040194131 Ellis et al. Sep 2004 A1
20040194138 Boylan et al. Sep 2004 A1
20040194141 Sanders Sep 2004 A1
20040210932 Mori et al. Oct 2004 A1
20040210935 Schein et al. Oct 2004 A1
20040221310 Herrington et al. Nov 2004 A1
20040255321 Matz Dec 2004 A1
20040264920 Helmstetter Dec 2004 A1
20050010949 Ward et al. Jan 2005 A1
20050015804 LaJoie et al. Jan 2005 A1
20050015815 Shoff et al. Jan 2005 A1
20050028218 Blake Feb 2005 A1
20050097622 Zigmond et al. May 2005 A1
20050125240 Speiser et al. Jun 2005 A9
20050129049 Srinivasan et al. Jun 2005 A1
20050132264 Joshi et al. Jun 2005 A1
20050138659 Boccon-Gibod et al. Jun 2005 A1
20050138660 Boyer et al. Jun 2005 A1
20050154640 Kolluri et al. Jul 2005 A1
20050155056 Knee et al. Jul 2005 A1
20050157217 Hendricks Jul 2005 A1
20050183123 Lee et al. Aug 2005 A1
20050188402 de Andrade et al. Aug 2005 A1
20050193015 Logston et al. Sep 2005 A1
20050198668 Yuen et al. Sep 2005 A1
20050204382 Ellis Sep 2005 A1
20050204388 Knudson et al. Sep 2005 A1
20050216936 Knudson et al. Sep 2005 A1
20050229214 Young et al. Oct 2005 A1
20050229215 Schein et al. Oct 2005 A1
20050234880 Zeng et al. Oct 2005 A1
20050235320 Maze et al. Oct 2005 A1
20050235323 Ellis et al. Oct 2005 A1
20050240962 Cooper et al. Oct 2005 A1
20050240968 Knudson et al. Oct 2005 A1
20050244138 O'Connor et al. Nov 2005 A1
20050251827 Ellis et al. Nov 2005 A1
20050267994 Wong Dec 2005 A1
20050273377 Ouimet et al. Dec 2005 A1
20050273819 Knudson et al. Dec 2005 A1
20050278741 Robarts et al. Dec 2005 A1
20050283796 Flickinger Dec 2005 A1
20050283800 Ellis et al. Dec 2005 A1
20060037044 Daniels Feb 2006 A1
20060041548 Parsons et al. Feb 2006 A1
20060083484 Wada et al. Apr 2006 A1
20060101490 Leurs May 2006 A1
20060123448 Ma et al. Jun 2006 A1
20060140584 Ellis et al. Jun 2006 A1
20060150216 Herz et al. Jul 2006 A1
20060155764 Tao Jul 2006 A1
20060156329 Treese Jul 2006 A1
20060161952 Herz et al. Jul 2006 A1
20060184558 Martin et al. Aug 2006 A1
20060204142 West et al. Sep 2006 A1
20060212900 Ismail et al. Sep 2006 A1
20060248555 Eldering Nov 2006 A1
20060277271 Morse Dec 2006 A1
20070005526 Whitney et al. Jan 2007 A1
20070005653 Marsh Jan 2007 A1
20070016926 Ward et al. Jan 2007 A1
20070028266 Trajkovic et al. Feb 2007 A1
20070033224 Allen et al. Feb 2007 A1
20070033613 Ward et al. Feb 2007 A1
20070038672 Plastina et al. Feb 2007 A1
20070050352 Kim Mar 2007 A1
20070094067 Kumar et al. Apr 2007 A1
20070118498 Song et al. May 2007 A1
20070136751 Garbow et al. Jun 2007 A1
20070157242 Cordray et al. Jul 2007 A1
20070162934 Roop et al. Jul 2007 A1
20070186240 Ward et al. Aug 2007 A1
20070204308 Nicholas et al. Aug 2007 A1
20070208718 Javid et al. Sep 2007 A1
20070214480 Kamen Sep 2007 A1
20070234393 Walker et al. Oct 2007 A1
20070244902 Seide et al. Oct 2007 A1
20070255693 Ramaswamy et al. Nov 2007 A1
20070271582 Ellis et al. Nov 2007 A1
20070288961 Guldi et al. Dec 2007 A1
20080004989 Yi Jan 2008 A1
20080066111 Ellis et al. Mar 2008 A1
20080077575 Tateno et al. Mar 2008 A1
20080080774 Jacobs et al. Apr 2008 A1
20080092155 Ferrone et al. Apr 2008 A1
20080115169 Ellis et al. May 2008 A1
20080126303 Park et al. May 2008 A1
20080127265 Ward et al. May 2008 A1
20080127266 Ward et al. May 2008 A1
20080178216 Bennington et al. Jul 2008 A1
20080178221 Schein et al. Jul 2008 A1
20080178222 Bennington et al. Jul 2008 A1
20080178223 Kwoh et al. Jul 2008 A1
20080184286 Kwoh et al. Jul 2008 A1
20080184305 Schein et al. Jul 2008 A1
20080184308 Herrington et al. Jul 2008 A1
20080184312 Schein et al. Jul 2008 A1
20080184315 Ellis et al. Jul 2008 A1
20080189744 Schein et al. Aug 2008 A1
20080222106 Rao et al. Sep 2008 A1
20080235725 Hendricks Sep 2008 A1
20080270561 Tang et al. Oct 2008 A1
20080281689 Blinnikka et al. Nov 2008 A1
20080288980 Schein et al. Nov 2008 A1
20090025033 Stautner et al. Jan 2009 A1
20090049481 Fellenstein et al. Feb 2009 A1
20090055390 Maeda et al. Feb 2009 A1
20090070817 Ellis et al. Mar 2009 A1
20090119723 Tinsman May 2009 A1
20090150219 Headings et al. Jun 2009 A1
20090193458 Finseth et al. Jul 2009 A1
20090234878 Herz et al. Sep 2009 A1
20100115541 Schein et al. May 2010 A1
20100146543 Knee et al. Jun 2010 A1
20100175078 Knudson et al. Jul 2010 A1
20100247065 Cooper et al. Sep 2010 A1
20100275230 Yuen et al. Oct 2010 A1
20100299692 Rao et al. Nov 2010 A1
20100319013 Knudson et al. Dec 2010 A1
20110013885 Wong et al. Jan 2011 A1
20110035771 Ward, III et al. Feb 2011 A1
20110131601 Alten et al. Jun 2011 A1
20110138417 Klappert Jun 2011 A1
20110167451 Yuen et al. Jul 2011 A1
20110185387 Schein et al. Jul 2011 A1
20110209170 Schein et al. Aug 2011 A1
20110276995 Alten et al. Nov 2011 A1
20120079539 Schein et al. Mar 2012 A1
20120095834 Doig et al. Apr 2012 A1
20120102523 Herz et al. Apr 2012 A1
20120185901 Macrae et al. Jul 2012 A1
20120272270 Boyer et al. Oct 2012 A1
20130031582 Tinsman et al. Jan 2013 A1
20140149434 Aravamudan et al. May 2014 A1
20170262437 Raichelgauz et al. Sep 2017 A1
Foreign Referenced Citations (543)
Number Date Country
199856198 Jul 1998 AU
731010 Mar 2001 AU
733993 May 2001 AU
749209 Jun 2002 AU
760568 May 2003 AU
765648 Sep 2003 AU
2008201306 Apr 2008 AU
1030505 May 1978 CA
1187197 May 1985 CA
1188811 Jun 1985 CA
1196082 Oct 1985 CA
1200911 Feb 1986 CA
1203625 Apr 1986 CA
2151458 Jun 1994 CA
2164608 Dec 1994 CA
2285645 Jul 1998 CA
2297039 Jan 1999 CA
2312326 Jun 1999 CA
2322217 Sep 1999 CA
2324278 Nov 1999 CA
2513282 Nov 1999 CA
1200221 Nov 1998 CN
1226030 Aug 1999 CN
1555191 Dec 2004 CN
1567986 Jan 2005 CN
29 18 846 Nov 1980 DE
3246225 Jun 1984 DE
3337204 Apr 1985 DE
36 21 263 Jan 1988 DE
3640436 Jun 1988 DE
3702220 Aug 1988 DE
3909334 Sep 1990 DE
41 43 074 Jul 1992 DE
4201031 Jul 1993 DE
4217246 Dec 1993 DE
4240187 Jun 1994 DE
4407701 Sep 1995 DE
4440419 May 1996 DE
19 531 121 Feb 1997 DE
19 740 079 Mar 1999 DE
19 931 046 Jan 2001 DE
42 90 947 Nov 2006 DE
0 072 153 Feb 1983 EP
0 148 733 Jul 1985 EP
0 222 025 May 1987 EP
0 229 526 Jul 1987 EP
0 239 884 Oct 1987 EP
0 276425 Aug 1988 EP
0337336 Oct 1989 EP
0339675 Nov 1989 EP
0 363 847 Apr 1990 EP
0 393 555 Oct 1990 EP
0396062 Nov 1990 EP
0 401 015 Dec 1990 EP
0401930 Dec 1990 EP
0408892 Jan 1991 EP
0420123 Apr 1991 EP
0424648 May 1991 EP
0444496 Sep 1991 EP
0447968 Sep 1991 EP
0 463 451 Jan 1992 EP
0 477 754 Apr 1992 EP
0477756 Apr 1992 EP
0 489 387 Jun 1992 EP
0488379 Jun 1992 EP
0 492 853 Jul 1992 EP
497 235 Aug 1992 EP
0532322 Mar 1993 EP
0536901 Apr 1993 EP
0550911 Jul 1993 EP
0 560 593 Sep 1993 EP
0 572 090 Dec 1993 EP
0 575 956 Dec 1993 EP
0617563 Sep 1994 EP
0 620 689 Oct 1994 EP
0624039 Nov 1994 EP
0 644 689 Mar 1995 EP
0 650 114 Apr 1995 EP
0 658 048 Jun 1995 EP
0 669 760 Aug 1995 EP
0 673 164 Sep 1995 EP
0 682 452 Nov 1995 EP
0 723369 Jul 1996 EP
0721253 Jul 1996 EP
0725539 Aug 1996 EP
0 742669 Nov 1996 EP
0752767 Jan 1997 EP
0753964 Jan 1997 EP
0762751 Mar 1997 EP
0762756 Mar 1997 EP
0772360 May 1997 EP
0774866 May 1997 EP
0775417 May 1997 EP
0784405 Jul 1997 EP
0 789 488 Aug 1997 EP
0797355 Sep 1997 EP
0 804 028 Oct 1997 EP
0 805 590 Nov 1997 EP
0 806 111 Nov 1997 EP
0805594 Nov 1997 EP
0822718 Feb 1998 EP
0827340 Mar 1998 EP
0 836 320 Apr 1998 EP
0 836 321 Apr 1998 EP
0 837599 Apr 1998 EP
0834798 Apr 1998 EP
0843468 May 1998 EP
0848554 Jun 1998 EP
0849948 Jun 1998 EP
0 852361 Jul 1998 EP
0 854645 Jul 1998 EP
0851681 Jul 1998 EP
0852442 Jul 1998 EP
0854654 Jul 1998 EP
0880856 Dec 1998 EP
0 892 554 Jan 1999 EP
0905985 Mar 1999 EP
0 921 682 Jun 1999 EP
0924927 Jun 1999 EP
0935393 Aug 1999 EP
0 940 983 Sep 1999 EP
0 945003 Sep 1999 EP
0940985 Sep 1999 EP
0944253 Sep 1999 EP
0963119 Dec 1999 EP
0988876 Mar 2000 EP
1014715 Jun 2000 EP
1 058 999 Dec 2000 EP
1 059 749 Dec 2000 EP
1067792 Jan 2001 EP
1 093 305 Apr 2001 EP
1095504 May 2001 EP
1135929 Sep 2001 EP
0 856 847 Nov 2001 EP
1213919 Jun 2002 EP
1036466 Mar 2003 EP
0936811 May 2003 EP
1763234 Mar 2007 EP
2662895 Dec 1991 FR
1 554 411 Oct 1979 GB
2034995 Jun 1980 GB
2126002 Mar 1984 GB
2185670 Jul 1987 GB
2217144 Oct 1989 GB
2 227 622 Aug 1990 GB
2 229 595 Sep 1990 GB
2256546 Dec 1992 GB
2264409 Aug 1993 GB
2 275 585 Aug 1994 GB
2305049 Mar 1997 GB
2309134 Jul 1997 GB
2325537 Nov 1998 GB
2 346 251 Aug 2000 GB
2377578 Jan 2003 GB
1035285 Mar 2005 HK
58137334 Aug 1983 JP
58137344 Aug 1983 JP
58196738 Nov 1983 JP
58210776 Dec 1983 JP
59141878 Aug 1984 JP
61050470 Mar 1986 JP
61074476 Apr 1986 JP
62-060370 Mar 1987 JP
62060372 Mar 1987 JP
62060384 Mar 1987 JP
06392177 Apr 1988 JP
63234679 Sep 1988 JP
01307944 Dec 1989 JP
02048879 Feb 1990 JP
02-119307 May 1990 JP
06-141250 May 1990 JP
2189753 Jul 1990 JP
10-234007 Sep 1990 JP
03-022770 Jan 1991 JP
03063990 Mar 1991 JP
03-167975 Jul 1991 JP
3178278 Aug 1991 JP
03-214919 Sep 1991 JP
03-243076 Oct 1991 JP
09-009244 Jan 1992 JP
04-44475 Feb 1992 JP
04079053 Mar 1992 JP
04-162889 Jun 1992 JP
04-180480 Jun 1992 JP
04227380 Aug 1992 JP
04250760 Sep 1992 JP
04-335395 Nov 1992 JP
4340258 Nov 1992 JP
05-103281 Apr 1993 JP
05-122692 May 1993 JP
05-183826 Jul 1993 JP
05284437 Oct 1993 JP
05-339100 Dec 1993 JP
06021907 Jan 1994 JP
06038165 Feb 1994 JP
06-90408 Mar 1994 JP
60-61935 Mar 1994 JP
06111413 Apr 1994 JP
06-124309 May 1994 JP
06-133235 May 1994 JP
06504165 May 1994 JP
06-164973 Jun 1994 JP
06243539 Sep 1994 JP
06-295312 Oct 1994 JP
06303541 Oct 1994 JP
0723356 Jan 1995 JP
07020254 Jan 1995 JP
07-050259 Feb 1995 JP
07-076592 Mar 1995 JP
07-135621 May 1995 JP
07123326 May 1995 JP
07-162776 Jun 1995 JP
07147657 Jun 1995 JP
07160732 Jun 1995 JP
07193762 Jul 1995 JP
7-262200 Oct 1995 JP
7-284033 Oct 1995 JP
07-288759 Oct 1995 JP
07-321748 Dec 1995 JP
08-32528 Feb 1996 JP
08-056352 Feb 1996 JP
0832538 Feb 1996 JP
08-137334 May 1996 JP
08125497 May 1996 JP
08130517 May 1996 JP
8-506469 Jul 1996 JP
08506941 Jul 1996 JP
08-196738 Aug 1996 JP
08-234709 Sep 1996 JP
08251122 Sep 1996 JP
08275077 Oct 1996 JP
08289281 Nov 1996 JP
08-331546 Dec 1996 JP
09-37168 Feb 1997 JP
09037151 Feb 1997 JP
09037171 Feb 1997 JP
09037172 Feb 1997 JP
9-65321 Mar 1997 JP
09-070020 Mar 1997 JP
09083888 Mar 1997 JP
09-102827 Apr 1997 JP
09-114781 May 1997 JP
09 162818 Jun 1997 JP
09-162821 Jun 1997 JP
09-247565 Sep 1997 JP
09-244475 Sep 1997 JP
09-261609 Oct 1997 JP
09-270965 Oct 1997 JP
09289630 Nov 1997 JP
09322213 Dec 1997 JP
10-042235 Feb 1998 JP
10-501936 Feb 1998 JP
10042218 Feb 1998 JP
10-093933 Apr 1998 JP
10-143340 May 1998 JP
10-143349 May 1998 JP
10-228500 Aug 1998 JP
10228687 Aug 1998 JP
10257400 Sep 1998 JP
10-289205 Oct 1998 JP
2838892 Oct 1998 JP
10-512420 Nov 1998 JP
11008810 Jan 1999 JP
11-136615 May 1999 JP
11-136658 May 1999 JP
11177962 Jul 1999 JP
11261917 Sep 1999 JP
11-313280 Nov 1999 JP
11308561 Nov 1999 JP
2000-013708 Jan 2000 JP
2000-138886 May 2000 JP
2000-224533 Aug 2000 JP
2000-235546 Aug 2000 JP
2000216845 Aug 2000 JP
2000-261750 Sep 2000 JP
2000-287179 Oct 2000 JP
2000-306314 Nov 2000 JP
2000-312333 Nov 2000 JP
2000-339931 Dec 2000 JP
2001-022282 Jan 2001 JP
2001-086423 Mar 2001 JP
2001-088372 Apr 2001 JP
2001-165669 Jun 2001 JP
2001-167522 Jun 2001 JP
2001-213595 Aug 2001 JP
2001-257950 Sep 2001 JP
2001-513595 Sep 2001 JP
2002506328 Feb 2002 JP
2002-279969 Sep 2002 JP
2003-018668 Jan 2003 JP
2003-189200 Jul 2003 JP
2003-199004 Jul 2003 JP
2004-007592 Jan 2004 JP
2004-023326 Jan 2004 JP
2006-186513 Jul 2006 JP
2006-340396 Dec 2006 JP
4062577 Mar 2008 JP
2010-119149 May 2010 JP
5053378 Oct 2012 JP
O247388 Oct 1994 TW
WO-8601359 Feb 1986 WO
WO-8601962 Mar 1986 WO
WO-8703766 Jun 1987 WO
WO-8804057 Jun 1988 WO
WO-8804507 Jun 1988 WO
WO-8902682 Mar 1989 WO
WO-8903085 Apr 1989 WO
WO-8912370 Dec 1989 WO
WO-9000847 Jan 1990 WO
WO-9001243 Feb 1990 WO
WO-9015507 Dec 1990 WO
WO-9100670 Jan 1991 WO
WO-9105436 Apr 1991 WO
WO-9106367 May 1991 WO
WO-9106912 May 1991 WO
WO-9118476 Nov 1991 WO
WO-9204801 Mar 1992 WO
WO-9222983 Dec 1992 WO
WO-9304473 Mar 1993 WO
WO-9305452 Mar 1993 WO
WO-9311638 Jun 1993 WO
WO-9311639 Jun 1993 WO
WO-9311640 Jun 1993 WO
WO-9323957 Nov 1993 WO
WO-9413107 Jun 1994 WO
WO-9414281 Jun 1994 WO
WO-9414282 Jun 1994 WO
WO-9414283 Jun 1994 WO
WO-9414284 Jun 1994 WO
WO-9416441 Jul 1994 WO
WO-9421085 Sep 1994 WO
WO-9423383 Oct 1994 WO
WO-9429811 Dec 1994 WO
WO-9501056 Jan 1995 WO
WO-9501057 Jan 1995 WO
WO-9501058 Jan 1995 WO
WO-9501059 Jan 1995 WO
WO-9502945 Jan 1995 WO
WO-9504431 Feb 1995 WO
WO-9506389 Mar 1995 WO
WO-9507003 Mar 1995 WO
WO-9510910 Apr 1995 WO
WO-9515658 Jun 1995 WO
WO-9516568 Jun 1995 WO
WO-9515649 Jun 1995 WO
WO-9515657 Jun 1995 WO
WO-9519092 Jul 1995 WO
WO-9526095 Sep 1995 WO
WO-9526608 Oct 1995 WO
WO-9528055 Oct 1995 WO
WO-9528799 Oct 1995 WO
WO-9530961 Nov 1995 WO
WO-9532585 Nov 1995 WO
WO-9532587 Nov 1995 WO
WO-9531069 Nov 1995 WO
WO-9532583 Nov 1995 WO
WO-1995030302 Nov 1995 WO
WO-9607270 Mar 1996 WO
WO-9608109 Mar 1996 WO
WO-9608923 Mar 1996 WO
WO-9609721 Mar 1996 WO
WO-9608113 Mar 1996 WO
WO-9613932 May 1996 WO
WO-9613935 May 1996 WO
WO-9617467 Jun 1996 WO
WO-9617473 Jun 1996 WO
WO-9621990 Jul 1996 WO
WO-9626605 Aug 1996 WO
WO-9627270 Sep 1996 WO
WO-9627982 Sep 1996 WO
WO-9627989 Sep 1996 WO
WO-9631980 Oct 1996 WO
WO-9634467 Oct 1996 WO
WO-9634486 Oct 1996 WO
WO-9634491 Oct 1996 WO
WO-9636172 Nov 1996 WO
WO-9637075 Nov 1996 WO
WO-9637996 Nov 1996 WO
WO-9638799 Dec 1996 WO
WO-9641477 Dec 1996 WO
WO-9641478 Dec 1996 WO
WO-9638962 Dec 1996 WO
WO-9641470 Dec 1996 WO
WO-9641471 Dec 1996 WO
WO-9702702 Jan 1997 WO
WO-9704595 Feb 1997 WO
WO-9707656 Mar 1997 WO
WO-9712486 Apr 1997 WO
WO-9713368 Apr 1997 WO
WO-9717774 May 1997 WO
WO-9718675 May 1997 WO
WO-9719555 May 1997 WO
WO-9726612 Jul 1997 WO
WO-9729458 Aug 1997 WO
WO-9731480 Aug 1997 WO
WO-9734413 Sep 1997 WO
WO-9734414 Sep 1997 WO
WO-9740623 Oct 1997 WO
WO-9741673 Nov 1997 WO
WO-9742763 Nov 1997 WO
WO-9746943 Dec 1997 WO
WO-9747124 Dec 1997 WO
WO-9748230 Dec 1997 WO
WO-9749237 Dec 1997 WO
WO-9749241 Dec 1997 WO
WO-9749242 Dec 1997 WO
WO-9750251 Dec 1997 WO
WO-9745786 Dec 1997 WO
WO-9748228 Dec 1997 WO
WO-1997047135 Dec 1997 WO
WO-199800975 Jan 1998 WO
WO-199800976 Jan 1998 WO
WO-9806219 Feb 1998 WO
WO-9810589 Mar 1998 WO
WO-9814009 Apr 1998 WO
WO-9816062 Apr 1998 WO
WO-9817063 Apr 1998 WO
WO-9817064 Apr 1998 WO
WO-9820675 May 1998 WO
WO-9821664 May 1998 WO
WO-9821877 May 1998 WO
WO-9826584 Jun 1998 WO
WO-9827723 Jun 1998 WO
WO-9826569 Jun 1998 WO
WO-9828906 Jul 1998 WO
WO-9831148 Jul 1998 WO
WO-9837695 Aug 1998 WO
WO-9839893 Sep 1998 WO
WO-9841020 Sep 1998 WO
WO-9843183 Oct 1998 WO
WO-9843406 Oct 1998 WO
WO-9847279 Oct 1998 WO
WO-9847290 Oct 1998 WO
WO-9848566 Oct 1998 WO
WO-9847283 Oct 1998 WO
WO-9856172 Dec 1998 WO
WO-9856173 Dec 1998 WO
WO-9856712 Dec 1998 WO
WO-9901984 Jan 1999 WO
WO-9903267 Jan 1999 WO
WO-9904561 Jan 1999 WO
WO-9907142 Feb 1999 WO
WO-9914947 Mar 1999 WO
WO-9918722 Apr 1999 WO
WO-199918721 Apr 1999 WO
WO-9922502 May 1999 WO
WO-9929109 Jun 1999 WO
WO-9930491 Jun 1999 WO
WO-9931480 Jun 1999 WO
WO-9933265 Jul 1999 WO
WO-9938092 Jul 1999 WO
WO-9935827 Jul 1999 WO
WO-9937045 Jul 1999 WO
WO-9939280 Aug 1999 WO
WO-9945700 Sep 1999 WO
WO-9945701 Sep 1999 WO
WO-9945702 Sep 1999 WO
WO-9952279 Oct 1999 WO
WO-9952285 Oct 1999 WO
WO-9956466 Nov 1999 WO
WO-9956473 Nov 1999 WO
WO-9957837 Nov 1999 WO
WO-9957839 Nov 1999 WO
WO-9960493 Nov 1999 WO
WO-9960783 Nov 1999 WO
WO-9960789 Nov 1999 WO
WO-9960790 Nov 1999 WO
WO-9966725 Dec 1999 WO
WO-9965237 Dec 1999 WO
WO-0004706 Jan 2000 WO
WO-0004708 Jan 2000 WO
WO-0004709 Jan 2000 WO
WO-0002380 Jan 2000 WO
WO-0007368 Feb 2000 WO
WO-0008850 Feb 2000 WO
WO-0008851 Feb 2000 WO
WO-0008852 Feb 2000 WO
WO-0005889 Feb 2000 WO
WO-0011865 Mar 2000 WO
WO-0013415 Mar 2000 WO
WO-0016548 Mar 2000 WO
WO-200014951 Mar 2000 WO
WO-0011869 Mar 2000 WO
WO-0013416 Mar 2000 WO
WO-0016336 Mar 2000 WO
WO-0028734 May 2000 WO
WO-0028739 May 2000 WO
WO-0027122 May 2000 WO
WO-0033560 Jun 2000 WO
WO-0033573 Jun 2000 WO
WO-0033578 Jun 2000 WO
WO-0033160 Jun 2000 WO
WO-0033224 Jun 2000 WO
WO-2000033233 Jun 2000 WO
WO-00040014 Jul 2000 WO
WO-0040025 Jul 2000 WO
WO-0049801 Aug 2000 WO
WO-0051310 Aug 2000 WO
WO-0057645 Sep 2000 WO
WO-0058833 Oct 2000 WO
WO-0058967 Oct 2000 WO
WO-0059214 Oct 2000 WO
WO-0059220 Oct 2000 WO
WO-0059223 Oct 2000 WO
WO-0062298 Oct 2000 WO
WO-0062299 Oct 2000 WO
WO-0062533 Oct 2000 WO
WO-0067475 Nov 2000 WO
WO-200070505 Nov 2000 WO
WO-2000079798 Dec 2000 WO
WO-0101677 Jan 2001 WO
WO-0106784 Jan 2001 WO
WO-0110126 Feb 2001 WO
WO-0110128 Feb 2001 WO
WO-0111865 Feb 2001 WO
WO-0115438 Mar 2001 WO
WO-0122729 Mar 2001 WO
WO-0119086 Mar 2001 WO
WO-0135662 May 2001 WO
WO-0146843 Jun 2001 WO
WO-0147238 Jun 2001 WO
WO-0147249 Jun 2001 WO
WO-0147257 Jun 2001 WO
WO-0147273 Jun 2001 WO
WO-0147279 Jun 2001 WO
WO-0146869 Jun 2001 WO
WO-0150743 Jul 2001 WO
WO-0158158 Aug 2001 WO
WO-0175649 Oct 2001 WO
WO-0176239 Oct 2001 WO
WO-0176248 Oct 2001 WO
WO-0176704 Oct 2001 WO
WO-0189213 Nov 2001 WO
WO-0182600 Nov 2001 WO
WO-0231731 Apr 2002 WO
WO-02078317 Oct 2002 WO
WO-0284992 Oct 2002 WO
WO-0305712 Jan 2003 WO
WO-2004004341 Jan 2004 WO
WO-04066180 Aug 2004 WO
WO-06079977 Aug 2006 WO
PCT/US2007/063417 Mar 2007 WO
WO-2008042280 Apr 2008 WO
Non-Patent Literature Citations (322)
US 5,047,897 A, 09/1991, Strubbe et al. (withdrawn)
U.S. Appl. No. 11/682,533, filed Mar. 6, 2007.
U.S. Appl. No. 13/768,450, filed Feb. 15, 2013.
U.S. Appl. No. 11/682,588, filed Mar. 6, 2007.
U.S. Appl. No. 12/882,451, filed Sep. 15, 2010.
U.S. Appl. No. 13/021,086, filed Feb. 4, 2011.
U.S. Appl. No. 13/959,289, filed Aug. 5, 2013.
U.S. Appl. No. 11/682,596, filed Mar. 6, 2007.
U.S. Appl. No. 12/435,899, filed May 5, 2009.
U.S. Appl. No. 13/296,490, filed Nov. 15, 2011.
U.S. Appl. No. 13/296,477, filed Nov. 15, 2011.
U.S. Appl. No. 13/296,486, filed Nov. 15, 2011.
U.S. Appl. No. 11/682,599, filed Mar. 6, 2007.
U.S. Appl. No. 12/873,622, filed Sep. 1, 2010.
U.S. Appl. No. 13/788,265, filed Mar. 7, 2013.
U.S. Appl. No. 11/682,689, filed Mar. 6, 2007.
U.S. Appl. No. 12/692,896, filed Jan. 25, 2010.
U.S. Appl. No. 13/788,274, filed Mar. 7, 2013.
U.S. Appl. No. 11/682,693, filed Mar. 6, 2007.
U.S. Appl. No. 12/843,577, filed Jul. 26, 2010.
U.S. Appl. No. 13/035,162, filed Feb. 25, 2011.
U.S. Appl. No. 14/077,538, filed Nov. 12, 2013.
U.S. Appl. No. 11/682,695, filed Mar. 6, 2007.
U.S. Appl. No. 12/795,303, filed Jun. 7, 2010.
U.S. Appl. No. 11/682,700, filed Mar. 6, 2007.
U.S. Appl. No. 12/844,366, filed Jul. 27, 2010.
U.S. Appl. No. 13/442,436, filed Apr. 9, 2012.
U.S. Appl. No. 13/887,514, filed May 6, 2013.
U.S. Appl. No. 60/834,966, filed Aug. 2, 2006.
U.S. Appl. No. 60/796,614, filed May 1, 2006.
U.S. Appl. No. 60/784,027, filed Mar. 20, 2006.
U.S. Appl. No. 60/779,547, filed Mar. 6, 2006.
U.S. Appl. No. 09/330,792, filed Jun. 11, 1999, Knudson et al.
U.S. Appl. No. 09/332,244, filed Jun. 11, 1999, Ellis.
U.S. Appl. No. 09/356,268, filed Jul. 16, 1999, Rudnick et al.
“A New Approach to Addressability,” CableData Brochure, 9 pages, undated.
“Generative Models for Cold-Start Recommendations,” Schein et al, SIGIR 2001, http://www.cis.upenn.edu/˜popescul/Publications/schein01generative.pdf, last accessed Oct. 24, 2006, 9 pgs.
“Methods and Metrics for Cold-Start Recommendations,” Schein et al, SIGIR'02, Aug. 11-15, 2002, Tampere, Finland.
“OpenTV(R) and Interactive Channel Form Strategic Alliance to Deliver Interactive Programming to Satellite Television Subscribers”, from the Internet at http://www.opentv.com/news/interactivechannelfinal.htm, printed on Jun. 8, 1999.
“Probabilistic Models for Unified Collaborative and Content-Based Recommendation in Sparse-Data Environments”, Popescul et al, Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence (UAI-2001), to appear, Morgan Kaufmann, San Francisco, 2001.
“Prodigy Launches Interactive TV Listing”, Apr. 22, 1994, Public Broadcasting Report.
“SWAMI: A Framework for Collaborative Filtering Algorithm Development and Evaluation”, Fisher et al., http://www.cs.berkeley.edu/˜richie/swami/sigir00-final/report.pdf, last accessed on Oct. 24, 2006, 3 pgs.
“Social Information Filtering: Algorithms for automating ‘Word of Mouth’”, Shardanand et al., http://www.cs.ubc.ca/˜conati/532b/papers/chi-95-paper.pdf, last accessed Oct. 24, 2006, 8 pgs.
“StarSight Interactive Television Program Guide III” Jim Leftwich and Steve Schein, Functional/Interactional Architecture Specification Document, Orbit Interaction, Palo Alto, California, Published before Apr. 19, 1995.
“StarSight Interactive Television Program Guide IV” Jim Leftwich and Steve Schein, Functional/Interactional Architecture Specification Document, Orbit Interaction, Palo Alto, California, Published before Apr. 19, 1995.
“StarSight Interactive Television Program Guide” Jim Leftwich, Willy Lai & Steve Schein Published before Apr. 19, 1995.
“TV Guide Online Set for Fall”, Entertainment Marketing Letter, Aug. 1994.
“Utilizing Popularity Characteristics for Product Recommendation”, Hyung Jun Ahn, International Journal of Electronic Commerce/Winter Jul. 2006, vol. 11, No. 2, pp. 59-80.
2720R Satellite Receiver User's Guide, General Instrument, 1991, pp. 58-61.
A Financial Times Survey: Viewdata (Advertisement), Financial Times, May 20, 1979, 1 page.
ACM Multimedia 93 Proceedings, A Digital On-Demand Video Service Supporting Content-Based Queries, Little et al., pp. 427-436, Jul. 1993.
Addressable Converters: A New Development at CableData, Via Cable, vol. 1, No. 12, Dec. 1981, 11 pages.
Advanced Analog Systems—Addressable Terminals, General Instrument Corp. of Horsham, Pennsylvania (URL: http://www.gi.com/BUSAREA/ANALOG/TERMINALWATCH/watch.html), printed from the Internet on Mar. 4, 1999.
Advertisement for “TV Decisions,” Cable Vision, Aug. 4, 1986, 3 pages.
Alexander “Visualizing cleared-off desktops,” Computerworld, May 6, 1991, 1 page.
Anderson et al., UNIX Communications and the Internet (3d ed. 1995).
Antonoff, “Interactive Television,” Popular Science, Nov. 1992, pp. 92-128.
Antonoff, “Stay Tuned for Smart TV,” Popular Science, Nov. 1990, pp. 62-65.
Armstrong, “Channel-Surfing's next wave: Henry Yuen's interactive TV guide takes on TCI and Viacom,” BusinessWeek, Jul. 31, 1995, 3 pages.
Arnold, “Britain to get wired city—via telephone,” Electronics, Mar. 4, 1976, at 76, 3 pages.
Bach, U. et al., “Multimedia TV Set, Part 2 and Conclusion,” Radio-Fernsehen Elektronik (RFE), Oct. 1996, pp. 38-40. (English language translation attached.).
Bach, et al., “Multimedia TV Set, Part 1” Radio-Fernsehen Elektronik (RFE), Sep. 1996, pp. 28, 30, 31, 12 pages (English language translation attached).
Baer, “Innovative Add-On TV Products,” IEEE Transactions on Consumer Electronics, vol. CE-25, Nov. 1979, pp. 765-771.
Beddow, “The Virtual Channels Subscriber Interface,” Communications Technology, Apr. 30, 1992.
Bell Atlantic Buys Cable TV Company for $22bn, Financial Times (London), Oct. 14, 1993, p. 65.
Bensch, “VPV Videotext Programs Videorecorder,” IEEE Paper, Jun. 1988, pp. 788-792.
Berniker, “TV Guide going online”, Broadcasting & Cable, pp. 49-52, Jun. 13, 1994.
Bertuch, “New Realities for PCs: Multimedia between aspiration and commerce,” (translation), Exhibit NK 12 of TechniSat's nullity action against EP'111, Issue 10, pp. 40-46 (1991).
Bestler, Caitlin “Flexible Data Structures and Interface Rituals for Rapid Development of OSD Applications,” Proceedings from the Eleven Technical Sessions, 42nd Annual Convention and Exposition and Exploration of the NCTA, San Francisco, CA Jun. 6-9, 1993, pp. 223-236. Jun. 6, 1993.
Blahut et al., “Interactive Television,” Proceedings of the IEEE, pp. 1071-1085, Jul. 1995.
Boyd-Merritt, “Television wires two-way video,” Electronic Engineering Times, Apr. 25, 1994, 3 pages.
Brochure, “Weststar and Videotoken Network Present the CableComputer,” Revised Aug. 15, 1985 (Plaintiff's 334).
Brochure, Time Inc., “Now, Through the Advances of the Computer Age, You Can Get the Information You Want, When You Want It. Instantly and Conveniently, on Your Home TV Screen,” Time Teletext, Time Video Information Services, Inc., 9 pages, undated (V 79167-79175).
Brochure, VTN “Videotoken Network, New Dimension Television,” Dec. 1985 (Plaintiff's Exhibit 313).
Brugliera, “Digital On-Screen Display—A New Technology for the Consumer Interface,” Symposium Record Cable TV Sessions of the 18th International Television Symposium & Technical Exhibition—Montreux, Switzerland, Jun. 10-15, 1993, pp. 571-586.
CNN Tech: Sonicblue revives ReplayTV, articles.cnn.com, Sep. 10, 2001, retrieved from the internet: http://articles.cnn.com/2001-09-10/tech/replay.tv.idg_1_replaytv-sonicblue-digital-video?_s=PM:TECH, 2 pages.
Cable Computer User's Guide, Rev. 1, Dec. 1985 (Plaintiff's Exhibit 289).
Cable Television Equipment, Jerrold Communications Publication, dated 1992 and 1993, pp. 8-2.1 to 8-6 and 8-14.1 to 8-14.3.
CableData, Roseville Consumer Presentation, Mar. 1985 12 pages.
Cameron et al., Learning GNU Emacs (2d ed. 1996).
Carne, E.B., “The Wired Household,” IEEE Spectrum, vol. 16, No. 10, Oct. 1979, pp. 61-66.
Cascading Style Sheets, level 1, W3C Recommendation (Dec. 17, 1996), available at http://www.w3.org/TR/REC-CSS1-961217#anchor-pseudo-classes.
Case 4:11-cv-06591-PJH, Complaint for Declaratory Relief (plaintiff), dated Dec. 21, 2011, 16 pages.
Chan, “Learning Considerations in User Interface Design: The Room Model,” Publication of the Software Portability Laboratory, University of Waterloo, Ontario, Canada, Jul. 1984, 52 pages.
Chang et al., “An Open-Systems Approach to Video on Demand,” IEEE Communications Magazine, May 1994, pp. 68-80.
Chen et al., “Real Time video and Audio in the World Wide Web,” Proc. 4th World Wide Web Conference, 1995, 15 pages.
Cherrick et al., “An Individually Addressable TV Receiver With Interactive Channel Guide Display, VCR, and Cable Box Control”, IEEE Transactions on Consumer Electronics, vol. 4:3 (Aug. 1994), pp. 317-28.
Christodoulakis, Steven and Graham, Stephen “Browsing Within Time-Driven Multimedia Documents,” publication of the Institute for Computer Research, University of Waterloo, Waterloo, Ontario, Canada Jul. 1988 pp. 219-227.
Communication of a Notice of Opposition, European Patent Application No. 08103167.6, Jan. 11, 2012, 24 pages.
Computer Network: Current Status and Outlook on Leading Science and Technology, Bureau of Science & Technology (Japan), vol. 1, Dec. 1986, 326 pages.
Contents of the website of StarSight Telecast, Inc. (http://www.StarSight.com) as of Apr. 21, 2004.
U.S. Appl. No. 60/179,548, filed Feb. 1, 2000.
Cox, J. et al, “Extended Services in a Digital Compression System,” Proceedings from Eleven Technical Sessions: 42nd Annual Convention and Exposition of the National Cable Television Association, Jun. 1993, pp. 185-191.
Creation/Modification of the Audio Signal Processor Setup for a PC Audio Editor, IBM Technical Disclosure Bulletin, vol. 30, No. 10, Mar. 1988, pp. 367-376.
D2B-Home Bus für Audio und Video, Selektor, Apr. 1990, pp. 10, 12.
DAVIC Digital Audio-Visual Council, DAVIC 1.5 Specification, Baseline Document 1, Revised 4.0, Applications for Home Storage and Internet Based Systems, Published by Digital Audio-Visual Council 1995-1999.
DIRECTV Digital Satellite Receiver—Operating Instructions, Sony Electronics Inc. (2001).
DIRECTV Plus2 System, Thompson Consumer Electronics, Inc. (1999), 2 pages.
DIRECTV Receiver—Owner's Manual, Samsung, DIRECTV, Inc. (2002).
DIRECTV Receiver with TiVo Digital Satellite Receiver/Recorder SAT-T60—Installation Guide, Sony Electronics Inc. (2000).
DIRECTV Receiver with TiVo Installation Guide, Philips, TiVo Inc. (2000).
DIRECTV Receiver with TiVo Viewer's Guide, TiVo Inc., Sony Corp. (1999, 2000).
Daily, Mack, “Addressable Decoder with Downloadable Operation,” Proceedings from the Eleven Technical Sessions, 42nd Annual Convention and Exposition of the NCTA, Jun. 6-9, 1993, pp. 82-89.
Damouny, “Teletext Decoders-Keeping Up With the Latest Advances,” IEEE Transactions on Consumer Electronics, vol. CE-30, No. 3, Aug. 1984, pp. 429-435.
Das, D. and ter Horst, H., Recommender Systems for TV, Technical Report WS-98-08—Papers from the AAAI Workshop, Madison, WI (1998), 2 pages.
Davis, TV Guide on Screen, “Violence on Television”, House of Representatives, Committee on Energy and Commerce, Subcommittee on Telecommunications and Finance, pp. 93-163, Jun. 25, 1993.
Day, “The Great PC/TV Debate,” OEM Magazine, Jul. 1, 1996, 6 pages.
Dec., Presenting JAVA, “Understanding the Potential of Java and the Web”, pp. 1-208, © 1995 by Sams.net Publishing.
Declaration Under 37 C.F.R. § 1.132 of Richard E. Glassberg, signed Oct. 20, 2006, filed Oct. 24, 2006, from U.S. Appl. No. 10/346,266, 5 pages.
DiRosa, S. “BIGSURF Netguide”, Jul. 1995, vol. 3.1 (Sections 18, 21, and 28, renumbered as pp. 1-27).
Dial M for Movie, Funkschau Nov. 1994 Perspektiven, Video on Demand, vol. Nov. 1994, pp. 78-79. (English language translation attached).
Dialing the printed page, ITT in Europe Profile, 11/Spring 1977, 2 pages.
Digital TV—at a price, New Scientist, Sep. 15, 1983, vol. 99. No. 1375, p. 770.
Digital Video Broadcasting (DVB); DVB specification for data broadcasting, European Telecommunication Standards Institute, Draft EN 301 192 V1.2.1 (Jan. 1999).
Dinwiddie et al., “Combined-User Interface for Computers, Television, Video Recorders, and Telephone, etc.,” IBM Technical Disclosure Bulletin, vol. 33(36), pp. 116-118 (1990).
DishPro Satellite System—User's Guide, Dish Network (Sep. 1, 2001).
Does NBC Get It, Aug. 14, 1995, retrieved from the internet at http://www.open4success.org/db/bin19/019687.html, retrieved on Dec. 11, 2013, 1 page.
Dr. Dobb's, “Implementing a Web Shopping Cart,” from the internet at https://www.drdobbs.com/article/print?articleId=184409959&siteSect . . . , Sep. 1, 1996, printed from the internet on Sep. 13, 2012, 15 pages.
Draft Grounds of Invalidity for EP (UK) 0 880 856 (Trial B), No. HC11 C 04556, between Starsight Telecast and United Video Properties (Claimants) and Virgin Media, Virgin Media Payments, and TiVo (Defendants), 7 pgs., Oct. 2013.
Duck Tales, (1987) [TV Series 1987-1990], Internet Movie Database (IMDB) [Retrieved on Apr. 7, 2007], 5 pages.
Eckhoff, “TV Listing Star on the Computer”, Central Penn Business Journal/High Beam Research, pp. 1-4, Mar. 15, 1996.
Edwardson, “CEEFAX: A Proposed New Broadcasting Service,” Journal of the SMPTE, Jan. 1974, vol. 83 No. 1, pp. 14-19.
Ehrmantraut et al., “The Personal Electronic Program Guide—Towards the Pre-Selection of Individual TV Programs,” CIKM 96, Rockville, MD., Dec. 31, 1996, pp. 243-250.
Eitz et al., “Videotext Programmiert Videoheimgeräte,” Rundfunktech Mitteilungen, Jahrg. 30, H.5, 1986, S. 223 bis 229 (English translation attached).
Eitz, Gerhard, “Zukünftige Informations- und Datenangebote beim digitalen Fernsehen-EPG und ‘Lesezeichen’,” RTM Rundfunktechnische Mitteilungen, Jun. 1997, vol. 41, pp. 67-72.
Electronic Program Guide via Internet, Research Disclosure, Kenneth Mason Publications, Hampshire, GB vol. 385(2) (May 1996) p. 276, ISSN:0374-4353.
Email from Iain Lea to Kent Landfield, comp.sources.misc, vol. 29, Issue 19 (Mar. 27, 1992, 03:28:12 GMT), available at https://groups.google.com/group/comp.sources.misc/msg/2e79d4c058a8a4fe?dmode=source&output=gplain&noredirect&pli=1.
Enhanced Content Specification, ATVEF, from the internet at http://www.atvef.com/library/spec.html, printed Aug. 22, 2001, the document bears a Copyright date of 1998, 1999, 2000, 41 pages.
Ernst & Young, “On track: A primer on media asset identification,” May 2011, retrieved from the internet May 29, 2014, URL: http://www.ey.com/Publication/vwLUAssets/Media_asset_identification_primer/$FILE/Media_Entertainment.pdf.
European Search Report dated Nov. 19, 2002 from European Application No. 98944611.7, 3 pages.
European Search Report dated Oct. 24, 2006 from European Application No. 06076553, 7 pages.
European Telecommunication Standard, “Electronic Programme Guide (EPG); Protocol for a TV Guide using electronic data transmission,” 89 pages, sections 1-11.12.7 and annex A-P, bearing a date of May 1997.
European Telecommunications Standards: Digital Broadcasting Systems for Television Sound and Data Services; Specification for Service Information (SI) in Digital Video Broadcasting (DVB) Systems, European Telecommunications Standards Institute, Dec. 1994, 64 pages.
Extended European Search Report for EP10183222 dated Jun. 20, 2011.
Fuller, C., Streaming gijutsu no genzai Web video system no gaiyou [Current Streaming Technology, Outline of Web Video System], UNIX Magazine, Japan, ASCII K.K., Mar. 1, 2000, vol. 15, No. 3, p. 65-72.
Facsimile Transmission, NHK Research Monthly Report, Dec. 1987 (unknown author).
Fall 2001 TiVo Service Update with Dual Tuner!, TiVo Inc. (2001).
Fry et al., “Delivering QoS Controlled Continuous Media on the World Wide Web,” Proceedings of the 4th International IFIP Workshop on QoS, Paris, Mar. 6-8, 1996, 12 pages.
GameSpot's Downloads for Allied General, accessed from the internet at http://web.archive.org/web/19970205060703/http://www.gamespot.com/strategy/allie . . . , copyright 1997, printed on Sep. 19, 2013, 1 page.
Gateway Destination: The PC for the Office and the Family Room, PC Magazine, First Looks section, pp. 39-41, Jun. 11, 1996, 3 pages.
Gavron, Jacquelyn, Moran, Joseph, How to Use Microsoft Windows NT 4 Workstation, 1996, entire document, 5 pages.
Getting Started Installation Guide, Using StarSight 1 Manual, and Remote Control Quick Reference Guide, copyright 1994, 93 pages.
Growing US interest in the impact of viewdata, Computing Weekly, Jul. 20, 1978, 1 page.
Gutta, et al., “TV Content Recommender System”, Proceedings of the Seventeenth National Conference on Artificial Intelligence and Twelfth Conference on Innovative Applications of Artificial Intelligence, (Jul. 30, 2000), 2 pages.
Hallenbeck, P., Developing an interactive television system that works, R&D Magazine, vol. 39:7, Jun. 1997, p. 54.
Hartwig et al. “Broadcasting and Processing of Program Guides for Digital TV,” SMPTE Journal, pp. 727-732, Oct. 1997.
Hedger, “Telesoftware: Home Computing Via Broadcast Teletext,” IEEE Transactions on Consumer Electronics, vol. CE-25, No. 3, Jul. 1979, pp. 279-287.
Hendrix, “A Natural Language Interface Facility”, Artificial Intelligence Center, Stanford Research Institute, SIGART Newsletter, No. 61, Feb. 1977, 2 pages.
Hill, et al., “Recommending and Evaluating Choices in a Virtual Community of Use” CHI '95 Mosaic of Creativity, pp. 194-201 (1995).
Hiroshi Ishii et al, “Clearface: Translucent Multiuser Interface for TeamWorkStation,” ECSCW, Sep. 1991, pp. 6-10.
Hiroshi Ishii et al, “Toward an Open Shared Workspace: Computer and Video Fusion Approach of Team Workstation,” Communications of the ACM, Dec. 1991, vol. 34 No. 12, pp. 37-50.
Hirotada Ueda et al, “Impact: An Interactive Natural-Motion-Picture Dedicated Multi-Media Authoring System,” Communications of the ACM, Mar. 1991, pp. 343-350.
Hitachi Consumer Electronics Co., Ltd., Certification of market introduction in 1993 of Hitachi Projection TV Model 55EX7K, Dec. 17, 2012, 1 page.
Hitachi Projection Color TV Operating Guide, for Models 55EX7K, 50EX6K, 50ES1B/K, and 46EX3B/4K, 38 pages, undated.
Hitachi Service Manual, No. 0021, Projection Color Television, Models 55EX7K, 50EX6K, 50ES1B/K, 46EX3B/4K, and 46EX3BS/4KS, Aug. 1993, 1 page.
Hoarty, “Multimedia on Cable Television Systems,” Symposium Record Cable TV Sessions, 18th International Television Symposium and Technical Exhibition, Montreux, Switzerland, Jun. 10, 1993, pp. 555-567.
Hobbes' Internet Timeline 10.2, by Robert H'obbes' Zakon, from the internet at http://www.zakon.org/robert/internet/timeline/, printed from the internet on Sep. 13, 2012, 29 pages.
Hofmann, Neumann, Oberlies & Schadwinkel, “Videotext Programmiert Videorecorder,” Rundfunktechnischen Mitteilungen, (Broadcast Engineering Reports), vol. 26 No. 6, pp. 254-257, Nov.-Dec. 1982.
Holland, “NAPLPS standard defines graphics and text communications,” EDN, Jan. 10, 1985, pp. 179-192.
IPG Attitude and Usage Study, prepared by Lieberman Research Worldwide for Gemstar—TV Guide International, Oct. 2002.
ITC Inv. No. 337-TA-845: Commission Opinion dated Dec. 11, 2013, 27 pages.
ITC Investigation of Certain Products Containing Interactive Program Guide and Parental Control Technology, Investigation No. 337-TA-845, “Final Initial Determination,” Jun. 7, 2013, 375 pages.
Imike, S., Interactive Video Management and Production, Educational Technology Publications, May 1991, http://www.amazon.com/Interactive-Video-Management-Production-Steven/dp/0877782334/ref=sr_1_1?ie=UTF8&qid=1416426739&sr=8-1&keywords=interactive+video+management+and+production&pebp=1416426742553, 2 pages.
Instruction Manual, “Using StarSight 2,” StarSight Telecast, Inc., 1994, 27 pages.
Instructional Manual, “Sonic the Hedgehog,” Sega of America, 1992, 11 pages.
Interactive Computer Conference Server, IBM Technical Bulletin, vol. 34, No. 7A, Dec. 1991, pp. 375-377.
Interface Device for Conventional TVs to Improve Functionality, IBM Technical Disclosure Bulletin, vol. 36, No. 7, Jul. 1993, pp. 53-54.
International Search Report for PCT/US95/11173 dated Dec. 14, 1995.
International Search Report for PCT/US99/04163 dated Jun. 23, 1999.
International Search Report for PCT/US99/08842 dated Jul. 7, 1999.
Internet User Forecast by Country, Computer Industry Almanac—Press Release, from the internet at http://www.c-i-a.com/internetusersexec.html, printed from the internet on Sep. 13, 2012, 3 pages.
Irven, “Multi-Media Information Services: A Laboratory Study,” IEEE Communications Magazine, vol. 26, No. 6, Jun. 1988, pp. 27-33 and 36-44.
JVC Service Manual, 27″ Color Monitor/Receiver, Model AV-2771S (U.S.), Jul. 1991, 89 pages.
James Sorce, David Fay, Brian Raila and Robert Virzi, Designing a Broadband Residential Entertainment Service: A Case Study, GTE Laboratories Incorporated, undated, pp. 141-148.
James, “Oracle—Broadcasting the Written Word,” Wireless World, Jul. 1973, vol. 79 No. 1453, pp. 314-316.
Judice, “Move Over Cable, Here Comes Video Via Voice Lines,” Network World, Sep. 1986, p. 26.
Kai et al., Development of a Simulation System for Integrated Services Television, Report from Information Processing Society of Japan, Japan, Sep. 13, 1996, vol. 96, No. 90 p. 13-20.
Karstad., “Microprocessor Control for Color-TV Receivers,” IEEE Transactions on Consumer Electronics, vol. CE-26, May 1980, pp. 149-155.
Kojima, Akira et al., “Implementation Measures to Expand Metadata Application Services”, http://www.ntt.co.jp/tr/0306/files/ntr200306051.pdf, (Jun. 2003), 6 pages.
Komarinski, “Anonymous FTP,” p. 1, Linux Journal, May 1, 1995, 5 pages.
Kornhaas, “Von der Textprogrammierung uber TOP zum Archivsystem,” Radio Fernsehen Elektronik, vol. 40, No. 8, Aug. 30, 1991, pp. 465-468, XP 000240875 Veb Verlag Technik. Berlin, DE ISSN: 1436-1574.
Large, “Throw away the books—Viewdata's coming,” Guardian, Jan. 10, 1978, 1 page.
Large, “Viewdata, the invention that brings boundless advice and information to the home, also sets a test for the Post Office,” Financial Guardian, Jun. 20, 1978, 3 pages.
Lee, Hee-Kyung et al., “Personalized Contents Guide and Browsing based on User Preference”, http://vega.icu.ac.kr/˜mccb-lab/publications/Paper/PersonalizedTV(2002).pdf, (2002), 10 pages.
Letter from StarSight Telecast, Inc. to a StarSight IPG subscriber (with subscriber name, address and account number redacted) notifying the subscriber of termination of the StarSight IPG, 2003.
Listing of computer code for Video HTU Program (Plaintiff's Exhibit 299).
Listing of computer code for operating system within the Cable Computer in 1985 (Plaintiff's Exhibit 298).
Lists> What's on Tonite! TV Listings (fwd), Internet article (On line), Jan. 28, 1995, XP 002378869 Retrieved from the Internet: URL: www.scout.wisc.edu/Projects/PastProjects/NH/95-01-31/0018.html> [Retrieved on Apr. 28, 2006]. The whole document, 4 pages.
Lloyd, “Impact of technology,” Financial Times, Jul. 1978, 2 pages.
Lowenstein, R.L. and Aller, H.E., “The Inevitable March of Videotex,” Technology Review, vol. 88, Oct. 1985, p. 22-29.
Lynch, Keith, timeline of net related terms and concepts, Mar. 22, 2007, 8 pages.
M/A-COM, Inc., “Videocipher II Satellite Descrambler Owner's Manual,” dated prior to Feb. 1986, pp. 1-17.
MSI Datacasting Systems, TV Communications Journal, Jan. 1973, 2 pages.
Make Room for POP, Popular Science, Jun. 1993, p. 4.
Mannes, “List-Mania, On-Screen, interactive TV guides that can program your VCR are just around the corner,” Video Review, May 1992, pp. 34-36.
Mannes, “Smart Screens: Development of Personal Navigation Systems for TV Viewers,” Video Magazine, Dec. 1993, 6 pages.
Mar. 19, 1985 letter from G. Knapp of CableData to R. Hansen of Weststar Communications, Inc. (Plaintiffs Exhibit 325).
Markowitz, A. “Companies Jump on Interactive Bandwagon,” Discount Store News, Dec. 6, 1993, pp. 4 and 131.
McKenzie, G.A., “Oracle—An Information Broadcasting Service Using Data Transmission in the Vertical Interval,” Journal of the SMPTE, Jan. 1974, vol. 83 No. 1, pp. 6-10.
Merrell, R.G., “Tac Timer,” 1986 NCTA Technical Papers, pp. 203-206.
Miller, Matthew D., “A Scenario for the Deployment of Interactive Multimedia Cable Television Systems in the United States in the 1990's”, Proceedings of the IEEE, vol. 82, pp. 585-589, Apr. 1994.
Minutes of Oral Proceeding in EP Application No. 04 075 205.7 dated Dec. 21, 2009.
Minutes of Oral Proceedings in EP Appeal No. T 1288/04 Held on May 3, 2004 for EP Application No. EP00200971.0, Applicant E-Guide, Inc.
Money, “Teletext and Viewdata,” Butterworth & Co. Ltd., London, 1979, 159 pages.
Motohashi, Iizuka, Kuwana, Building Internet TV Guide Service 1 and 2, the 53rd National Conference Proceedings, Japan, Information Processing Society of Japan, Sep. 6, 1996, 5 pages (English translation).
Neumann, Andreas, “WDR Online Aufbau und Perspektiven Automatisierter Online Dienste im WDR,” RTM Rundfunktechnische Mitteilungen, vol. 41, Jun. 1997, pp. 56-66.
Nikkei Click, You can do it now with your existing computer, Nikkei Business Publications, Inc., 188 (No US Translation).
Oberlies, et al., “VPS-Anzeige und Überwachungsgerät”, Rundfunktechnische Mitteilungen, vol. 30, No. 1, Jan. 1986-Feb. 1986, Norderstedt (DE).
Open TV Launches OpenStreamer TM Technology for Broadcasters to Deliver First Ever Real-Time Digital Interactive Television, from the internet at http://www.opentv.com/news/openstreamer_press_final.htm, printed on Jun. 28, 1999, the document bears a copyright date of 1999, 2 pages.
Open TV für interaktives Fernsehen, Trend und Technik, 9-95 RFE, retrieved from the internet Sep. 2, 2006, 4 pages (English language translation attached).
Owen, “How dial-a-fact is coming closer to home,” The Times, Sep. 30, 1977, 2 pages.
Owen, “Why the Post Office is so excited by its plans for a TV screen information service,” The Times, Sep. 26, 1976, 2 pages.
PTV Recorder Setup Guide, Philips Electronics, TiVo Inc. (2000).
Panasonic TX-33A1G Operating Instructions (undated).
Partial European Search Report dated Feb. 22, 2010 from corresponding European Application No. EP 03 01 3370.
Patent Abstracts of Japan vol. 017, No. 494, Sep. 7, 1993 and JP 05 122692 A (Pioneer Electron Corp), May 18, 1993.
Patent Abstracts of Japan vol. 098, No. 001, Jan. 30, 1998 and JP 09 247565 A (Sony Corp), Sep. 19, 1997.
Pazzani et al., “Learning and Revising User Profiles: The Identification of Interesting Web Sites,” 27 Machine Learning, pp. 313-331 (1997).
Peddicord, “New on TV: You Bet Your Horse,” The Sun, Baltimore Maryland, Dec. 15, 1994, 1 page.
Pfister, Larry T., “Teletext: Its Time Has Come,” Prepared for the IGC Videotext / Teletext Conference, Andover, Massachusetts, Dec. 14, 1982, pp. 1-11.
Philips TV Set, model No. 25 PT 910A, User Manual; 40 pages (undated).
Poole, “Demand for Viewdata grows,” Sunday Times, Feb. 10, 1977, 3 pages.
Postel, J., Reynolds, J., Request for Comments: 959 File Transfer Protocol, Oct. 1985, 70 pages.
Prevue Guide Brochure, Spring 1984, 1 page.
Prevue Guide Brochure, Spring 1994, 22 pages.
Prevue Networks and OpenTV(R) Agree to Work Together on Deploying Interactive Program Guides Worldwide, from the internet at http://www.opentv.com/news/prevuefinal.htm, printed on Jun. 28, 1999, 2 pages.
Prevue Networks, Inc. Promotional Materials (undated).
Probe XL Brochure, Auto Tote Systems Inc., (Newark, Delaware) (undated) 59 pages.
Product Comparison—Group messaging software: Having the last word, InfoWorld, Nov. 6, 1995.
Qayyum, “Using IVDS and VBI for Interactive Television,” IEEE, Jun. 10, 1996, 11 pages.
RCA Satellite Receiver User's Guide, Thomson Multimedia Inc. (2001).
Ramachandran, “Space-Time Memory: a parallel programming abstraction for interactive multimedia applications,” SIGPLAN Notices, vol. 34:8 (Aug. 1999), pp. 183-192.
Raskutti et al., “A Feature-based Approach to Recommending Selections based on Past Preferences” 7 User Modeling and User-Adapted Interaction, pp. 179-218 (1997).
Rath et al., “Set-Top Box Control Software: A Key Component in Digital Video,” Philips Journal of Research, vol. 50, No. 1/2 1996, pp. 185-189.
Rayers, D.J., “Telesoftware by Teletext,” 1984 IEEE Conference Papers, vol. 240, p. 323.
Revolution on the Screen, 2nd Ed., Wilhelm Goldmann Verlag, 1979 (English translation).
Rewind, replay and unwind with new high-tech TV devices, by Lawrence J. Magid, LA Times. This document was printed from the Internet on Jun. 6, 1999 and bears a date of May 19, 1999.
Robertson, “Reaching Through Technology,” CHI '91 Conference Proceedings, Apr. 27-May 2, 1991, 15 pages.
Rogers, “Telcos vs. Cable TV: The Global View With Markets Converging and Regulatory Barriers Falling, Service Carriers Are Ready to Rumble,” Data Communications, Sep. 21, 1995, vol. 24, No. 13, pp. 75-76, 78, 80, XP000526196.
Roizen, Joseph “Teletext in the USA,” Society of Motion Picture and Television Engineers Journal, Jul. 1981, pp. 602-610.
Rosch, “New data and information system set for commercial market trial,” Telephony, Mar. 20, 1978, pp. 98-102.
Roseville City Council Presentation, Mar. 13, 1985 (Defendant's Exhibit 226).
Ryan, “Interactive TV Takes a Corporate Twist,” Electronic Engineering Times, Jul. 10, 1995, 3 pages.
Sato, T. et al., WWW jou no eizou browsing kikou no teian to Jitsugen [A Proposal for a Video Browsing Mechanism on World Wide Web and its Implementation], Japan Society for Software Science and Technology, collection of 14th convention articles, Japan, Japan Society for Software Science and Technology, Sep. 30, 1997, p. 193-196.
SONICblue Incorporated: ReplayTV 4000 User Guide 12.17, Chapter Five: Networking, Sep. 10, 2001, retrieved from the internet: http://www.digitalnetworksna.com/support/replayTV/downloads/ReplayTV4000UserGuide.12.17.pdf.
STORit, Report on the IBC'99 Demonstration, Deliverable #8 AC312/phi/prl/dslp/008b1, Oct. 1999.
Saito, Takeshi, et al., “Home Network Architecture Considering Digital Home Appliance,” Technical Committee meeting of the Institute of Electronics, Information and Communication Engineers (IEICE), Japan, Nov. 6, 1997, vol. 97, No. 368, p. 57-64.
Sandringham, “Dress rehearsal for the PRESTEL show,” New Scientist, Jun. 1, 1978, 9 pages.
Savage, “Internet's ‘What's on Tonite!’ Tells You Just That and More,” The News, InfoWatch, May 29, 1995, 1 page.
Schauer, Tom, No subject, (tschauer@moscow.com) Thu, Sep. 28, 1995 16:46:48-700, XP-002378870 [Retrieved from the Internet Apr. 28, 2006].
Schlender, B.R., “Couch Potatoes! Now It's Smart TV,” Fortune, Nov. 20, 1989, pp. 111-116.
Schmuckler, Eric “A marriage that's made in cyberspace (television networks pursue links with online information services),” May 16, 1994 MEDIAWEEK, vol. 4, No. 20, 5 pages.
Sealfon, Peggy, “High Tech TV,” Photographic, Dec. 1984, 2 pages.
Selected pages from the “BBC Online—Schedules” web page. This web page is located at http://www.bbc.co.uk/schedules/ (as printed from the Internet on Oct. 19, 1999 and being dated as early as May 24, 1997), 6 pages.
Sharpless et al., “An advanced home terminal for interactive data communication,” Conf. Rec. Int. Conf. Commun. ICC '77, IEEE, Jun. 12-15, 1977, 6 pages.
Soin et al., “Analogue-Digital ASICs”, Peter Peregrinus Limited, 1991, p. 239.
Split Personality, Popular Science, Jul. 1993, p. 52.
StarSight CB 1500 Customer Letter, 1994, 27 pages.
StarSight Operating Guide and Quick Reference, 19 sheets (undated).
StarSight Telecast, StarSight introduces TVGuide-like programmer for homes, 1994, 1 page.
Start Here, Sony, TiVo and DIRECTV (undated).
Statement in an Examination Report dated Aug. 2, 1999 for a counterpart foreign application filed in New Zealand in which the foreign Examiner alleges that he has used “the Internet to access television listings for BBC World television as far back as mid 1996 . . . ”, 2 pages.
Stickland, D.C., “It's a common noun,” The Economist, Jun. 5, 1978, 1 page.
Stokes, “The viewdata age: Power to the People,” Computing Weekly, Jan. 1979, 2 pages.
Sunada, et al, “Teletext Color Television Receiver Model C-29M950, C26M940,” NEC Home Electronics, NEC Giho, 1987, 16 pages.
Super-TVs, Popular Science, Jul. 1985, p. 64.
SuperGuide on Screen Satellite Program Guide, User's Guide, Owner's Manual, and sales literature, 74 sheets (undated).
Supplementary European Search Report for Application No. EP 98 93 5889, completed on Sep. 28, 2001, 3 pages.
Sussman, “GTE Tunes in to Home TV Shopping,” PC Week, vol. 5(26), Jun. 28, 1988, 2 pages.
Symposium Record Cable Sessions, “Digital On-Screen Display of a New Technology for the Consumer Interface,” Publication Date May 1993.
TV Guide movie database Internet web pages printed on Aug. 12, 1999. 9 pages.
TV Guide on Screen prior Use Transcript of Proceedings—“Violence on Television,” House of Representatives, Committee on Energy and Commerce, Subcommittee on Telecommunications and Finance, Jun. 25, 1993, 36 pages.
TV Listings Functional Spec., Time Video Information Services, Inc., 11 pages, undated.
Tech Notes: Product Updates from M/A-COM Cable Home Group, “Videocipher Owners Manual Update,” Issue No. 6, Feb. 1986, 19 pages.
Technical White Paper, “Open TV™ Operating Environment,” (© 1998 OpenTV Inc.), pp. 1-12.
Technological Examination & Basic Investigative Research Report on Image Databases, Japan Mechanical Engineering Organization Int'l Society for the Advancement of Image Software, Japan, Mar. 1988, 127 pages.
Technology Overview for TV Guide on Screen Information Sheets, 8 Sheets (undated).
Technology: Turn on, tune in and print out—An experimental interactive television service is set to alter our viewing habits, Financial Times (London), Oct. 14, 1993, p. 11.
Teletext presents the alternative view, Financial Times, Oct. 24, 1977, 2 pages.
The Columbia House Video Club: Download Software, accessed from the internet at http://web.archive.org/web/19961223163101/http://www.columbiahouse.com/repl/vc . . . , copyright 1996, printed on Sep. 19, 2013, p. 1.
The New Media and Broadcast Policy: An Investigation & Research Conference Report on Broadcasting Diversification, Radio Regulatory Bureau, Japan Ministry of Posts & Telecommunications, Mar. 1982, 114 pages.
The television program guide website of Gist Communications, Inc. of New York, New York. This website is located at www.gist.com (as printed from the Internet on Aug. 14, 1997), 272 pages.
The television program guide website of TV Guide Entertainment Network. This website is located at www.tvguide.com (as printed from the Internet on Aug. 13-18, 1997), 139 pages.
Thomas, “Electronic Program Guide Applications—The Basics of System Design,” NCTA Technical Papers, 1994, pp. 15-20.
Three men on a Viewdata bike, The Economist, Mar. 25, 1978, pp. 1-2.
Today's Stop: What's on Tonite, Oct. 3, 1995, retrieved from the internet at http://internettourbus.com/arch/1995/TB100395.TXT, 3 pages.
Tol, et al., “Requirements and Scenarios for the Bi-directional Transport of Metadata”, TV Anytime Meeting, Version 1.0, Document TV150 (Aug. 2, 2002), 7 pages.
Transcript of the Deposition of John Roop, Jun. 2001, p. 608.
Transcript of the Deposition of John Roop, Oct. 1996, pp. 186-187.
Transcript of the testimony of Brian Klosterman, May 1997, pp. 1700-1981.
Transcript of the testimony of Michael Faber and Larry Wangberg, May 1996, pp. 554-743.
Trial testimony of Michael Axford, Prevue Interactive, Inc. and United Video Satellite Group, Inc. v. StarSight Telecast, Inc., May 9, 1998, pp. 186-187, 295-315, and 352-357.
U.S. Appl. No. 10/453,388, Office Action dated Sep. 8, 2006.
UVSG Offers System-Specific Web Site Development for OPS, press release of United Video Satellite Group, Apr. 12, 1996, 2 pages.
UVSG Teams With Microsoft on Internet Information Server, press release of United Video Satellite Group, Feb. 22, 1996, 2 pages.
Uniden, UST-4800 Super Integrated Receiver/Descrambler, Preliminary Reference Manual, 80 pages, Nov. 12, 1991.
Uniden, UST-4800, Integrated Receiver/Descrambler, Installation Guide, 60 pages, © 1990, Uniden America Corporation.
Uniden, UST-4800, Integrated Receiver/Descrambler, Operating Guide, 24 pages, © 1990, Uniden America Corporation.
User's Guide RCA Color TV with TV Plus + Guide, Thomson Consumer Electronics (1997).
Various publications of Insight Telecast, 1992 and 1993, 10 pages.
Veith, R.H., “Television's Teletext,” Elsevier Science Publishing Co., Inc., 1983, pp. 13-20, 41-51.
Video Plus, Billboard, vol. 98, No. 4, Jan. 25, 1986, p. 25.
VideoGuide User's Manual, 14 sheets (undated).
VideoGuide, “VideoGuide User's Manual,” pp. 1-28 (p. 11 is the most relevant).
Videocipher Stipulation, May 1996, 5 pages.
Viewdata and its potential impact in the USA: Final Report/Volume One, The UK Experience, Link and Butler Cox & Partners Limited, Oct. 1978, 129 pages.
Viewdata moves in US but GEC may lose out, Computing Weekly, Jan. 25, 1978, 1 page.
Vision/1 from Tecmar, IBM transforms PS/1 into a TV, Info World, vol. 14(9), Mar. 2, 1992, p. 34.
Web TV and Its Consumer Electronics Licensees debut First Internet Television Network and Set Top Box XP 002113265 Retrieved from the Internet: <URL http://www.webtv.net/company/news/archive/License.html> Jul. 10, 1996, 6 pages [retrieved on Dec. 1, 2005].
Welcome to Columbia House Online, accessed from the internet at http://web.archive.org/web/19961221085121/http://www.columbiahouse.com/, copyright 1996, printed on Sep. 19, 2013, 1 page.
Whitehorn, “Viewdata and you,” Observer, Jul. 30, 1978, 1 page.
Wikipedia article on CompuServe, Mar. 22, 2007, 7 pages.
Wikipedia article, “Geschichte des Internets,” from the internet at http://de.wikipedia.org/wiki/Geschichte_des_Internets, page last modified on Apr. 28, 2012, printed from the internet on May 18, 2012, 17 pages (Concise explanation included in IDS letter).
Wikipedia article, “Internet Explorer,” from the internet at http://de.wikipedia.org/wiki/Internet_Explorer, page last modified on Sep. 9, 2012, printed from the internet on Sep. 13, 2012, 14 pages (Concise explanation included in IDS letter).
Wikipedia article, “MSN TV,” from the internet at http://en.wikipedia.org/wiki/MSN_TV, page last modified on May 15, 2012, printed from the internet on Sep. 13, 2012.
Wikipedia article, “NCSA Mosaic,” from the internet at http://de.wikipedia.org/wiki/NCSA_Mosaic, page last modified on Sep. 3, 2012, printed from the internet on Sep. 13, 2012, 2 pages (Concise explanation included in IDS letter).
Wikipedia-Teletext Excerpt (English Translation), printed from the internet Jul. 1, 2013, 18 pages.
Windows 98 Feature Combines TV, Terminal and the Internet, New York Times, Aug. 18, 1998.
Winkler, M., “Computer Cinema: Computer and video: from TV converter to TV studio,” Computerkino, (translation) Exhibit NK 13 of TechniSat's nullity action against EP'111, Issue 10, pp. 100-107 (1992).
Wittig et al., “Intelligent Media Agents in Interactive Television Systems,” Proceedings of the International Conference on Multimedia Computing and Systems, Los Alamitos, CA, US, May 15-18, 1995, p. 182-189, XP 000603484.
Yoshida, “Interactive TV a Blur,” Electronic Engineering Times, Jan. 30, 1995, 2 pages.
Related Publications (1)
Number Date Country
20150149443 A1 May 2015 US
Provisional Applications (4)
Number Date Country
60834966 Aug 2006 US
60796614 May 2006 US
60784027 Mar 2006 US
60779547 Mar 2006 US
Continuations (4)
Number Date Country
Parent 13959289 Aug 2013 US
Child 14473441 US
Parent 13021086 Feb 2011 US
Child 13959289 US
Parent 12882451 Sep 2010 US
Child 13021086 US
Parent 11682588 Mar 2007 US
Child 12882451 US