System for generating media recommendations in a distributed environment based on seed information

Abstract
A method for generating a media recommendation is disclosed. The method comprises receiving information about a user associated with a requesting device, identifying profile information based on the information about the user, receiving a media recommendation request from the requesting device, the media recommendation request comprising seed information comprising information identifying a media item, determining at least one related media item based on at least the information identifying the media item and the profile information, and providing information identifying the at least one related media item to the requesting device.
Description
FIELD OF THE DISCLOSURE

The present disclosure relates to discovery of media content based on geographic location.


BACKGROUND

Systems for sharing and generating playlists are known. For example, Gracenote Playlist™ by Gracenote® of Emeryville, Calif., offers playlist generation technology for automatically generating digital music playlists that works on offline devices, including portable MP3 players, as well as in desktop applications.


Gracenote Playlist Plus™ allows a user to generate a More Like This™ playlist by selecting one or more songs, albums, or artists as seeds. Gracenote Playlist then returns a mix of music from related artists and genres. Gracenote Playlist Plus accomplishes this by analyzing text data available in file tags, called metadata, and in the filenames of the music to link the music to an internal database of music information. Gracenote Playlist Plus uses Gracenote's proprietary metadata types, which include a genre system with more than 1600 individual genre categories and associated relational data. This genre system lets Gracenote Playlist Plus find relationships between songs that simpler systems may miss. For example, a “Punk Pop” song may be more similar to a “Ska Revival” song than to a song belonging to another “Punk” sub-category, such as “Hardcore Punk.”


Last.fm Ltd. is a UK-based internet radio and music community website. Using a music recommendation system called “Audioscrobbler”, Last.fm™ builds a profile of each user's musical taste by recording details of all the songs the user listens to, either on streamed radio stations or on the user's own computer or music player. This information is transferred to Last.fm's database (“Scrobbled”) via a plugin installed into the user's music player. The profile data is displayed on the user's Last.fm profile page for others to see. The site offers numerous social networking features and can recommend and play artists similar to the user's favorites. Users can create custom radio stations and playlists from any of the audio tracks in Last.fm's music library. A user can embed a playlist in their profile page to which others can listen, but the playlist needs to have at least 15 streamable tracks, each from different artists.


Similarly, U.S. Pat. No. 7,035,871 entitled “Method and Apparatus for Intelligent and Automatic Preference Detection of Media Content” provides a system for listening to music online by creating a preference profile for a user. When the user signs up for the service and provides details reflecting his preferences and his play history, a preference profile is generated and stored in a preference database. The system analyzes the stored profiles in the database and learns from the patterns it detects. The system recommends music to the user with attributes similar to the user's play history.


U.S. Patent Application Publication No. 2006/0143236 entitled “Interactive Music Playlist Sharing System and Methods” describes a community media playlist sharing system, where system users upload media playlists in real-time, which are automatically converted to a standardized format and shared with other users of the community. A playlist search interface module browses the database of media playlists and returns similar playlists of system users based on similarity of one or more of the following inputs from a system user: media identification information, media category information, media relations information, user information, or matching a plurality of media items on respective playlists. Based on the results of the playlist search interface module, the system returns a list of recommended playlists to the user.


Although conventional systems for generating playlists are suitable for their intended purposes, they suffer disadvantages that may render their results overbroad for a user's tastes. One disadvantage is that although conventional systems may take into account the playlists of other users, they fail to analyze the playlists of a specific group of users and fail to consider peer group influences. For example, the music that a particular teenager listens to may be highly influenced by the music listened to by a group of the teenager's peers, such as his or her friends. A further disadvantage is that conventional systems fail to take into account, when generating playlists, that the music tastes of a user may be influenced by his or her geographic location.


SUMMARY

The present disclosure relates to generating a media recommendation. In one embodiment, a method is provided. The method comprises receiving information about a user associated with a requesting device; identifying profile information based on the information about the user; receiving a media recommendation request from the requesting device, the media recommendation request comprising seed information comprising information identifying a media item; determining at least one related media item based on at least the information identifying the media item and the profile information; and providing information identifying the at least one related media item to the requesting device.


Those skilled in the art will appreciate the scope of the present disclosure and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the disclosure, and together with the description serve to explain the principles of the disclosure.



FIG. 1 illustrates a system for identifying media items played within one or more geographic areas of interest to a user according to one embodiment of the present disclosure;



FIG. 2 illustrates the operation of the system of FIG. 1 according to one embodiment of the present disclosure;



FIG. 3 is a flow chart illustrating a more detailed process for identifying media items played within one or more geographic areas of interest according to one embodiment of the present disclosure;



FIG. 4 is a flow chart illustrating a more detailed process for identifying media items played within one or more geographic areas of interest according to another embodiment of the present disclosure;



FIG. 5 illustrates an exemplary Graphical User Interface (GUI) enabling a user to define one or more media channels according to one embodiment of the present disclosure;



FIGS. 6A through 6I graphically illustrate an exemplary process by which a user defines multiple media channels and receives corresponding results according to one embodiment of the present disclosure;



FIG. 7 illustrates the operation of the media service of FIG. 1 according to another embodiment of the present disclosure;



FIG. 8 is an exemplary GUI for presenting representative information to a requesting user according to one embodiment of the present disclosure;



FIG. 9 is an exemplary GUI for presenting representative information to a requesting user according to another embodiment of the present disclosure;



FIG. 10 is a block diagram of the central server of FIG. 1 according to one embodiment of the present disclosure; and



FIG. 11 is a block diagram of one of the devices of FIG. 1 according to one embodiment of the present disclosure.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the invention and illustrate the best mode of practicing the invention. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the invention and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.



FIG. 1 illustrates a system 10 for identifying media items played within one or more geographic areas of interest to a user according to one embodiment of the present disclosure. In general, the system 10 includes a media service 12 and a number of devices 14-1 through 14-N having associated users 16-1 through 16-N. The devices 14-1 through 14-N are enabled to communicate with the media service 12 via a network 18 such as, but not limited to, the Internet. The media service 12 includes at least one central server 20 connected to the network 18, a user accounts repository 22, and a content repository 24. Note that each of the user accounts repository 22 and the content repository 24 may alternatively be hosted by the central server 20.


The central server 20 includes a tunersphere function 26, which may be implemented in software, hardware, or a combination thereof. In general, the tunersphere function 26 includes a playback tracking function 28, a channel definition function 30, and a request processor 32. The playback tracking function 28 operates to track playback of media items by the users 16-1 through 16-N at the devices 14-1 through 14-N. Preferably, the playback tracking function 28 maintains a play history for each of the users 16-1 through 16-N. Using the user 16-1 as an example, the play history of the user 16-1 includes an entry for each media item played by the user 16-1. Each entry includes information identifying the corresponding media item played by the user 16-1 such as, for example, a Globally Unique Identifier (GUID) of the media item, a fingerprint of the media item, a title of the media item, or the like. In addition, for at least a portion of the media items played by the user 16-1, the corresponding entries in the play history of the user 16-1 include information identifying locations at which the media items were played by the user 16-1. The information identifying the locations at which the media items were played by the user 16-1 may be, for example, latitude and longitude coordinates, a street address, a zip code, an area code, or the like. Still further, for at least a portion of the media items played by the user, the corresponding entries in the play history of the user 16-1 may include time stamps identifying times at which the corresponding media items were played by the user 16-1. In one embodiment, a time stamp may indicate a date on which the corresponding media item was played by the user 16-1 and a time of day. For example, a time stamp may be Jul. 23, 2008 at 11:17 A.M. EST.
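The play history entries described above can be sketched as a simple record type. This is a minimal, hypothetical sketch; the field names and types are illustrative assumptions and are not prescribed by the disclosure, which permits a GUID, fingerprint, or title as the item identifier and coordinates, an address, a zip code, or an area code as the location.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class PlayHistoryEntry:
    # Information identifying the played media item (here a GUID and title;
    # a media fingerprint could be stored instead).
    media_guid: str
    title: Optional[str] = None
    # Location at which the item was played (latitude/longitude here; a
    # street address, zip code, or area code could be stored instead).
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    # Time stamp identifying when the item was played (date and time of day).
    played_at: Optional[datetime] = None

# One entry matching the example time stamp given above.
entry = PlayHistoryEntry(
    media_guid="a1b2-c3d4",
    title="Example Song",
    latitude=35.7796,
    longitude=-78.6382,
    played_at=datetime(2008, 7, 23, 11, 17),
)
```

A user's play history would then be a list of such entries, one per playback event.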


In order to track the play histories of the users 16-1 through 16-N, the playback tracking function 28 obtains the play histories of the users 16-1 through 16-N from the devices 14-1 through 14-N. More specifically, in one embodiment, the devices 14-1 through 14-N provide playback information to the playback tracking function 28 automatically in response to playback of media items. Using the device 14-1 as an example, when the device 14-1 plays a media item for the user 16-1, the device 14-1 may automatically provide playback information including information identifying the media item, a current location of the device 14-1 which is the location at which the media item is played, and a time stamp. The device 14-1 may automatically provide the playback information once playback of the media item has been initiated, after a threshold amount of the media item has been played, or at the completion of playback of the media item.


In another embodiment, the devices 14-1 through 14-N automatically provide playback information to the playback tracking function 28 in a batch type process. Using the device 14-1 as an example, the device 14-1 may periodically provide playback information to the playback tracking function 28 for a number of media items played by the device 14-1 since the device 14-1 last provided playback information to the playback tracking function 28. In this embodiment, the playback information includes information identifying the media items played by the device 14-1 since playback information was last sent, locations at which the media items were played, and time stamps identifying times at which the media items were played.


In yet another embodiment, the playback tracking function 28 may request playback information from the devices 14-1 through 14-N periodically or as otherwise desired. Again, using the device 14-1 as an example, the playback tracking function 28 may request playback information from the device 14-1. In response, the device 14-1 returns playback information for a number of media items played by the device 14-1 since the playback tracking function 28 last requested playback information from the device 14-1. Alternatively, the request from the playback tracking function 28 may define a time period such that the response from the device 14-1 includes playback information for media items played by the device 14-1 during that time period. Alternatively, the device 14-1 may return its entire play history to the playback tracking function 28 in response to the request.
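The batch reporting behavior described in the preceding paragraphs can be sketched as a small device-side buffer that accumulates playback events and releases them as a batch when reporting. The class and method names are hypothetical, chosen for illustration only.

```python
from datetime import datetime

class PlaybackReporter:
    """Device-side sketch: buffers playback events and reports them in
    batches, as in the periodic/batch embodiments described above."""

    def __init__(self):
        self._buffer = []
        self._last_sent = None  # time of the last report, if any

    def record(self, media_guid, location, timestamp):
        """Record one playback event (item identifier, location, time stamp)."""
        self._buffer.append(
            {"media": media_guid, "location": location, "time": timestamp}
        )

    def flush(self):
        """Return all events recorded since the last report and clear the
        buffer, e.g. in response to a request from the tracking function."""
        batch, self._buffer = self._buffer, []
        self._last_sent = datetime.now()
        return batch

reporter = PlaybackReporter()
reporter.record("guid-1", (35.78, -78.64), datetime(2008, 7, 23, 11, 17))
reporter.record("guid-2", (35.79, -78.65), datetime(2008, 7, 23, 11, 21))
batch = reporter.flush()  # both buffered events
```

The same structure accommodates the per-playback embodiment by calling `flush` immediately after each `record`.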


The channel definition function 30 enables the users 16-1 through 16-N to define media channels for discovering media items played in defined geographic areas of interest to the users 16-1 through 16-N. Using the user 16-1 as an example, the channel definition function 30 enables the user 16-1 to define a media channel by selecting or otherwise defining one or more geographic areas of interest to the user 16-1. In addition, the user 16-1 may define a desired time window, one or more user-based criteria, one or more content-based criteria, or the like.


The request processor 32 generally operates to process media requests from the devices 14-1 through 14-N of the users 16-1 through 16-N. Continuing the example above, after the user 16-1 has defined the media channel, the user 16-1 may initiate a media request. Alternatively, the device 14-1 may automatically generate and send the media request to the media service 12. In response, the request processor 32 identifies one or more media items satisfying the media channel definition and provides a response to the device 14-1 of the user 16-1. In one embodiment, the response is a streaming media channel including the media items identified by the request processor 32. In another embodiment, the response is a list of media recommendations recommending the identified media items to the user 16-1. A media recommendation may include, for example, information identifying the recommended media item, the recommended media item, a preview of the recommended media item, a reference (e.g., URL) to the recommended media item, a reference (e.g., URL) to a preview of the recommended media item, or the like.


The user accounts repository 22 includes a user account 34 for each of the users 16-1 through 16-N. Using the user 16-1 as an example, the user account 34 of the user 16-1 includes the play history of the user 16-1 as maintained by the playback tracking function 28. Note that if the user 16-1 has access to multiple devices 14-1, then a separate play history may be maintained for the user 16-1 for each of the multiple devices 14-1 or a single aggregate play history may be maintained for the user 16-1 for all of the multiple devices 14-1. In this embodiment, the user account 34 of the user 16-1 also includes a media channel definition for each of one or more media channels defined by the user 16-1. A media channel definition includes information defining one or more geographic areas of interest. In addition, the media channel definition may include a desired time window, one or more user-based criteria, one or more content-based criteria, or any combination thereof. However, in an alternative embodiment, the media channel definitions may be stored at the device 14-1 of the user 16-1 and provided to the media service 12 in association with corresponding media requests.


The user account 34 of the user 16-1 may also include user preferences of the user 16-1, an online status of the user 16-1, collection information for the user 16-1, a friends list of the user 16-1, a group list of the user 16-1, and a user profile of the user 16-1. The user preferences may be defined by the user at, for example, the device 14-1 of the user 16-1 and then uploaded to the central server 20. Alternatively, the user 16-1 may interact with the central server 20 via, for example, a web interface such as a web browser on the device 14-1 to define his user preferences at the central server 20. As discussed below, the user preferences of the user 16-1 may be used to order or sort lists of media items for the user 16-1 based on expected desirability. For instance, media items identified by the request processor 32 may be scored based on the user preferences of the user 16-1 to provide a sorted list of media items. The user preferences of the user may include, for example, a weight or priority assigned to each of a number of categories such as user, genre, decade of release, and location/availability and each of a number of possible attributes of each of the categories. For an exemplary process for scoring media items based on user preferences defined as category weights and attribute weights, the interested reader is directed to U.S. Patent Application Publication No. 2008/0016205 entitled “P2P Network For Providing Real Time Media Recommendations,” which is hereby incorporated herein by reference in its entirety.
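Scoring media items against user preferences expressed as category weights and attribute weights can be sketched as a weighted sum: for each category (e.g., genre, decade of release), the item contributes the category's weight multiplied by the weight assigned to the item's attribute in that category. This is a hypothetical simplification for illustration; the incorporated publication describes the actual scoring process.

```python
def score_media_item(item_attrs, category_weights, attribute_weights):
    """Weighted-sum score over preference categories (illustrative sketch).

    item_attrs        -- e.g. {"genre": "Punk Pop", "decade": "2000s"}
    category_weights  -- weight/priority per category
    attribute_weights -- per-category weight for each possible attribute
    """
    score = 0.0
    for category, weight in category_weights.items():
        attr = item_attrs.get(category)
        score += weight * attribute_weights.get(category, {}).get(attr, 0.0)
    return score

# Hypothetical preferences: genre matters more than decade to this user.
prefs_cat = {"genre": 0.6, "decade": 0.4}
prefs_attr = {
    "genre": {"Punk Pop": 1.0, "Ska Revival": 0.8},
    "decade": {"1990s": 1.0, "2000s": 0.5},
}

items = [
    {"title": "A", "genre": "Punk Pop", "decade": "2000s"},
    {"title": "B", "genre": "Ska Revival", "decade": "1990s"},
]
# Sort identified media items by expected desirability, highest first.
ranked = sorted(
    items,
    key=lambda i: score_media_item(i, prefs_cat, prefs_attr),
    reverse=True,
)
```

Here item B scores 0.6·0.8 + 0.4·1.0 = 0.88 and ranks above item A at 0.6·1.0 + 0.4·0.5 = 0.80, so the sorted list reflects the user's stronger decade preference despite A's exact genre match.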


The online status of the user 16-1 may be used to store information indicating whether the user 16-1 is currently online and logged into the media service 12. The collection information may include a record of media items in a media collection of the user 16-1 stored on the device 14-1. Note that in another embodiment, the user 16-1 may be associated with multiple devices 14-1 (e.g., home computer, work computer, portable media player, mobile smart phone, or the like), where each of the multiple devices 14-1 stores a different media collection of the user 16-1. As such, the collection information in the user account 34 of the user 16-1 may include a record of media items in each of the media collections of the user 16-1.


The friends list of the user 16-1 is a list of users with which the user 16-1 has a direct relationship in a contact list or buddy list, a list of users with which the user 16-1 has a direct or indirect relationship in a social network, or the like. The group list of the user 16-1 may define grouping of the users identified in the friends list of the user 16-1 (e.g., family, co-workers, friends, or the like).


The user profile of the user 16-1 may include demographic information describing the user 16-1 such as, for example, age, income level, gender, marital status, or the like. In addition, the user profile of the user 16-1 may include statistics about the media collection(s) of the user 16-1 such as, for example, an artist distribution, a genre distribution, and release year distribution. The artist distribution may include, for example, a number or percentage of media items in the media collection(s) of the user 16-1 performed by each of a number of artists. Likewise, the genre distribution may include, for example, a number or percentage of media items in the media collection(s) of the user 16-1 in each of a number of genres (e.g., music genres or video genres). The release year distribution may include, for example, a number or percentage of media items in the media collection(s) of the user 16-1 released in each of a number of years or range of years.
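The collection statistics described above (artist, genre, and release year distributions) reduce to counting items per attribute value and normalizing to percentages. A minimal sketch, using genre as the example; the function name and record layout are illustrative assumptions.

```python
from collections import Counter

def genre_distribution(collection):
    """Percentage of media items in each genre of a user's media
    collection (illustrative; artist and release year distributions
    are computed the same way over different fields)."""
    counts = Counter(item["genre"] for item in collection)
    total = sum(counts.values())
    return {genre: 100.0 * n / total for genre, n in counts.items()}

collection = [
    {"genre": "Rock"},
    {"genre": "Rock"},
    {"genre": "Jazz"},
    {"genre": "Rock"},
]
dist = genre_distribution(collection)  # {"Rock": 75.0, "Jazz": 25.0}
```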


In this embodiment, the content repository 24 may include a number of media items known by the central server 20 and/or content descriptors for each of the number of media items known by the central server 20. The media items may be audio items such as songs, audio books, audio clips, or similar audio content; video items such as movies, television programs, music videos, video clips, or similar video content; or the like. The content descriptors may contain information identifying each media item known by the central server 20. For each media item, the content repository 24 may include one or more content descriptors such as, for example, a media fingerprint of the media item, a GUID of the media item, metadata for the media item, a reference (e.g., URL) to the media item in local or remote storage, or the like. Using a song as an example, the metadata for the song may include, for example, a title of the song, an artist of the song, an album on which the song was released, a date on which the song was released, a genre of the song, or the like.


Each of the devices 14-1 through 14-N may be, for example, a personal computer, a mobile smart phone having media playback capabilities, a portable media player having network capabilities, a gaming console having network and media playback capabilities, a set-top box, or the like. The device 14-1 includes a media player 36-1, a location determination function 38-1, and a content requestor 40-1, each of which may be implemented in software, hardware, or a combination thereof. The media player 36-1 operates to play media items from a media collection 42-1 of the user 16-1 stored locally at the device 14-1 or media items streamed to the device 14-1 from the media service 12.


Either the media player 36-1 or the content requestor 40-1 provides playback information to the playback tracking function 28 of the tunersphere function 26 identifying media items played by the user 16-1 and locations at which the media items were played. In addition, the playback information may include timestamps defining times at which the media items were played. In one embodiment, in response to playback of a media item by the media player 36-1, the media player 36-1 or the content requestor 40-1 automatically provides corresponding playback information to the media service 12. In another embodiment, the media player 36-1 or the content requestor 40-1 periodically provides playback information to the media service 12 for media items played by the media player 36-1 in the time period since playback information was last sent to the media service 12. As a final example, the playback tracking function 28 may periodically request playback information from the device 14-1.


The location determination function 38-1 generally operates to obtain the location of the device 14-1, where the location of the device 14-1 is then included in the playback information provided to the media service 12 as discussed above. In general, the location determination function 38-1 may be any software and/or hardware application enabled to determine or otherwise obtain the location of the device 14-1. As an example, the location determination function 38-1 may be a Global Positioning System (GPS) receiver. As another example, the device 14-1 may be a mobile telephone where mobile base station triangulation is utilized to determine the location of the device 14-1. The location determination function 38-1 may then obtain the location of the device 14-1 from the mobile telecommunications network periodically or as needed.


In this embodiment, the content requestor 40-1 interacts with the channel definition function 30 of the tunersphere function 26 to enable the user 16-1 to define one or more media channels. In addition, either automatically or when initiated by the user 16-1, the content requestor 40-1 issues a media request for a media channel to the tunersphere function 26 of the media service 12. The content requestor 40-1 may also operate to process responses received from the request processor 32 of the tunersphere function 26 in response to media requests. In this embodiment, the content requestor 40-1 may be implemented as, for example, a standard web browser having web access to the tunersphere function 26, a plug-in for a web browser, a stand-alone application, or the like.


It should be noted that in an alternative embodiment, some of the functionality of the tunersphere function 26 may be implemented on the devices 14-1 through 14-N. For example, the channel definition function 30 may alternatively be implemented at the device 14-1 as part of, for example, the content requestor 40-1. The media channel definitions may then be stored locally at the device 14-1 and provided to the media service 12 as part of corresponding media requests or in association with corresponding media requests. Alternatively, the media channel definitions may be uploaded to the central server 20 and stored in the user account 34 of the user 16-1 where media requests issued by the content requestor 40-1 may reference the corresponding media channel definition.


Like the device 14-1, the devices 14-2 through 14-N include media players 36-2 through 36-N, location determination functions 38-2 through 38-N, content requestors 40-2 through 40-N, and media collections 42-2 through 42-N, respectively. The media players 36-2 through 36-N, the location determination functions 38-2 through 38-N, the content requestors 40-2 through 40-N, and the media collections 42-2 through 42-N are substantially the same as the corresponding elements of the device 14-1. As such, the details regarding the media players 36-2 through 36-N, the location determination functions 38-2 through 38-N, the content requestors 40-2 through 40-N, and the media collections 42-2 through 42-N are not repeated.



FIG. 2 illustrates the operation of the system 10 of FIG. 1 according to one embodiment of the present disclosure. First, the devices 14-1 through 14-N provide the play histories of the users 16-1 through 16-N to the central server 20 (steps 100-1 through 100-N). As discussed above, the play histories of the users 16-1 through 16-N identify media items played by the users 16-1 through 16-N, locations at which the users 16-1 through 16-N played the media items, and, in some embodiments, times at which the users 16-1 through 16-N played the media items.


In this example, via the device 14-1, the user 16-1 interacts with the channel definition function 30 of the tunersphere function 26 hosted by the central server 20 to define a media channel (step 102). Again, the user 16-1 generally defines the media channel by selecting or otherwise defining one or more geographic areas of interest for the media channel. In addition, the user 16-1 may define a time window of interest, one or more user-based criteria, one or more content-based criteria, or any combination thereof. The media channel definition is then stored by the central server 20 in the user account 34 of the user 16-1 (step 104).


Next, the content requestor 40-1 of the device 14-1 sends a media request for the media channel to the central server 20 (step 106). In response, the request processor 32 obtains the media channel definition from the user account 34 of the user 16-1 and processes the play histories of the other users 16-2 through 16-N, or some select subset thereof, to identify one or more media items that were played within the one or more geographic areas of interest for the media channel and that satisfy any additional criteria for the media channel (step 108). In general, the request processor 32 identifies the one or more media items based on the media channel definition which includes information defining one or more geographic areas of interest and, optionally, one or more of the following: a time window, one or more user-based criteria, and one or more content-based criteria. More specifically, in a first exemplary embodiment, the media channel is defined by one or more geographic areas of interest. As such, the request processor 32 may process the play histories of all of the other users 16-2 through 16-N to identify one or more media items that were played at locations within the one or more geographic areas of interest for the media channel.
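The core filtering step in step 108 — scanning other users' play histories for media items played inside a geographic area of interest — can be sketched as follows. A rectangular bounding box is used here purely for illustration; the disclosure does not prescribe a particular geometry for the areas of interest, and all names are hypothetical.

```python
def in_area(location, area):
    """True if a (lat, lon) location falls within a rectangular
    geographic area of interest (min_lat, min_lon, max_lat, max_lon)."""
    lat, lon = location
    min_lat, min_lon, max_lat, max_lon = area
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

def items_played_in_areas(play_histories, areas):
    """Scan play histories of other users and collect media items
    played at locations within any of the areas of interest."""
    found = set()
    for history in play_histories:
        for entry in history:
            if any(in_area(entry["location"], a) for a in areas):
                found.add(entry["media"])
    return found

# Two other users' play histories (item identifier + playback location).
histories = [
    [{"media": "song-1", "location": (35.78, -78.64)}],  # Raleigh, NC
    [{"media": "song-2", "location": (40.71, -74.00)}],  # New York, NY
]
raleigh_area = (35.5, -79.0, 36.0, -78.0)
result = items_played_in_areas(histories, [raleigh_area])  # {"song-1"}
```

The additional time-window, user-based, and content-based criteria described below act as further predicates applied to each play history entry before an item is collected.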


In a second exemplary embodiment, the media channel definition includes information defining one or more geographic areas of interest and a time window. As such, the request processor 32 may process the play histories of the other users 16-2 through 16-N to identify one or more media items that were played at locations within the one or more geographic areas of interest for the media channel during the defined time window for the media channel. The time window may be relative to a current time. For example, the time window may be “within the last 30 days” or “within the last 1 hour.” The time window may alternatively be a static time window. For example, the static time window may be “the year 2008” or “June of 2008.”
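The two kinds of time window — relative to the current time versus static — translate into two straightforward checks on a play history entry's time stamp, sketched below with the example windows from the text.

```python
from datetime import datetime, timedelta

def in_relative_window(played_at, now, days):
    """'Within the last N days' relative to the current time."""
    return now - timedelta(days=days) <= played_at <= now

def in_static_window(played_at, start, end):
    """A fixed window such as 'June of 2008' (start inclusive,
    end exclusive)."""
    return start <= played_at < end

now = datetime(2008, 7, 23, 11, 17)
t = datetime(2008, 7, 1)
in_relative_window(t, now, 30)  # True: July 1 is within the last 30 days
in_static_window(t, datetime(2008, 6, 1), datetime(2008, 7, 1))  # False: outside June
```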


In a third exemplary embodiment, the media channel definition includes information defining one or more geographic areas of interest and one or more user-based criteria. As such, the request processor 32 may process the play histories of at least a subset of the other users 16-2 through 16-N to identify one or more media items that were played at locations within the one or more geographic areas of interest for the media channel by users satisfying the one or more user-based criteria. The user-based criteria may include, for example, a friends list of the user 16-1, a group of friends of the user 16-1, one or more profile matching criteria, a social distance criterion, a status criterion, one or more keyword criteria, or the like. The friends list may be used by the request processor 32 such that the one or more identified media items are media items played at locations within the one or more geographic areas of interest for the media channel by other users in the friends list of the user 16-1. Similarly, the group of friends may be used by the request processor 32 such that the one or more identified media items are media items played at locations within the one or more geographic areas of interest for the media channel by other users in the group of friends of the user 16-1.


The profile matching criteria may be defined such that media items played by users having user profiles that match the user profile of the user 16-1 at least to a defined degree are selected. The profile matching criteria may, for example, define one or more user profile elements that must match or match at least to a defined degree, define a threshold number of user profile elements that must match or at least match to a defined degree before two user profiles are determined to be matching, or the like. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel by other users having user profiles that match the user profile of the user 16-1 at least to a defined degree.
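One hypothetical way to quantify "match at least to a defined degree" is the fraction of a listed set of user profile elements on which two profiles agree; the threshold would then be a parameter of the profile matching criteria. The element names below are illustrative.

```python
def profile_match_degree(profile_a, profile_b, elements):
    """Fraction of the listed profile elements (e.g. age bracket,
    gender, marital status) on which two user profiles agree --
    a sketch of one possible matching measure."""
    matches = sum(1 for e in elements if profile_a.get(e) == profile_b.get(e))
    return matches / len(elements)

a = {"age_bracket": "18-24", "gender": "F", "marital_status": "single"}
b = {"age_bracket": "18-24", "gender": "M", "marital_status": "single"}
degree = profile_match_degree(a, b, ["age_bracket", "gender", "marital_status"])
# degree is 2/3: two of the three elements agree
```

A media item would then be selected only when the playing user's profile reaches the defined degree against the requesting user's profile.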


The social distance criterion may define a maximum social distance (e.g., maximum degree of separation) such that media items played by other users within the maximum social network distance from the user 16-1 in a social network are selected. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel by other users located within the maximum social distance from the user 16-1 in the social network.
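Social distance (degrees of separation) in a social network is conventionally computed by breadth-first search over friend links; the disclosure does not specify an algorithm, so the following is a standard-technique sketch with hypothetical data.

```python
from collections import deque

def social_distance(graph, start, target):
    """Minimum number of friend links between two users in a social
    network, by breadth-first search; None if they are not connected."""
    if start == target:
        return 0
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        user, dist = queue.popleft()
        for friend in graph.get(user, ()):
            if friend == target:
                return dist + 1
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None

graph = {"alice": ["bob"], "bob": ["alice", "carol"], "carol": ["bob"]}
social_distance(graph, "alice", "carol")  # 2: alice -> bob -> carol
```

An item played by a user at distance d from the requesting user would be selected only when d does not exceed the defined maximum social distance.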


The status criterion may be defined such that media items played by other users that are currently online are selected. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel by other users that are currently online.


In one embodiment, users such as the users 16-1 through 16-N may be enabled to tag other users with keywords. As such, one or more keyword criteria may be defined such that media items played by users tagged with keywords that satisfy the keyword criteria are selected. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel by users that have been tagged with keywords that satisfy the one or more keyword criteria.


In a fourth exemplary embodiment, the media channel definition includes information defining one or more geographic areas of interest and one or more content-based criteria. As such, the request processor 32 may process the play histories of at least a subset of the other users 16-2 through 16-N to identify one or more media items that were played at locations within the one or more geographic areas of interest for the media channel and that satisfy the one or more content-based criteria. The content-based criteria may include, for example, seed media item information, a performance criterion, a creator criterion, one or more metadata criteria, an age criterion, one or more keyword criteria, one or more feature criteria, a usage criterion, or the like.


The seed media item information describes a seed media item. For example, if the seed media item is a seed song, the information describing the seed song may be metadata for the seed song such as an artist of the seed song, an album on which the seed song was released, a year of release of the seed song, a decade of release of the seed song, a genre of the seed song, one or more keywords appearing in the title and/or lyrics of the seed song, or the like. In addition or alternatively, the information describing the seed song may be one or more features of the seed song such as, for example, a tempo of the seed song, beats-per-minute of the seed song, or the like. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel and that match the information describing the seed media item at least to a defined degree.
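The "match at least to a defined degree" test against a seed song could be sketched as below. The field names, the tolerance on beats-per-minute, and the threshold are illustrative assumptions; the disclosure does not fix a particular matching function.

```python
def matches_seed(candidate, seed, min_shared_fields=2, bpm_tolerance=10):
    """Return True if `candidate` matches the seed song on at least
    `min_shared_fields` fields. Metadata fields must match exactly;
    beats-per-minute matches when within `bpm_tolerance` of the seed."""
    matched = 0
    for field in ("artist", "genre", "decade"):
        if field in candidate and candidate.get(field) == seed.get(field):
            matched += 1
    if "bpm" in candidate and "bpm" in seed:
        if abs(candidate["bpm"] - seed["bpm"]) <= bpm_tolerance:
            matched += 1
    return matched >= min_shared_fields
```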


The performance criterion may be defined such that media items having an associated live performance location (e.g., an upcoming concert or live performance) within a defined proximity of a current location of the user 16-1 are selected. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel and that have upcoming live performance locations within the defined proximity of the user 16-1.


The creator criterion may be defined such that media items are selected if the artists or other persons that created the media items are located within a defined proximity of a current location of the user 16-1. The creator of a media item is in proximity to the user 16-1 if the creator is within a defined geographic distance from the user 16-1.


The one or more metadata criteria may be defined such that media items having metadata that satisfies the metadata criteria are selected. For songs, the metadata criteria may define, for example, one or more desired music genres, one or more desired artists, one or more desired dates of release, one or more desired decades of release, one or more music genres that are not desired, one or more artists that are not desired, one or more dates of release that are not desired, one or more decades of release that are not desired, or the like. Likewise, similar metadata criteria may be defined for other types of audio and video media items. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel and that have metadata that satisfies the one or more metadata criteria.


The age criterion may be defined such that media items are selected if the time lapse since the media items were last played matches the age criterion. The age criterion may be a time window relative to the current time such as, for example, “last played within the previous hour.” The age criterion may alternatively define a time window relative to a static time such as, for example, “last played after Jul. 22, 2008.” As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel and that satisfy the age criterion.
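Both variants of the age criterion (a relative window and a static cutoff) could be expressed as a single predicate, as in the following illustrative sketch; the function name and parameters are assumptions.

```python
from datetime import datetime, timedelta

def satisfies_age_criterion(last_played, window=None, after=None, now=None):
    """Apply an age criterion to a media item's last-played time.

    `window` implements a relative criterion such as "last played within
    the previous hour"; `after` implements a static criterion such as
    "last played after Jul. 22, 2008". Either or both may be given.
    """
    now = now or datetime.utcnow()
    if window is not None and last_played < now - window:
        return False
    if after is not None and last_played <= after:
        return False
    return True
```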


In one embodiment, users such as the users 16-1 through 16-N may be enabled to tag media items with keywords. As such, one or more keyword criteria may be defined such that media items that have been tagged with keywords satisfying the keyword criteria are selected. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel and that have been tagged with keywords that satisfy the one or more keyword criteria.


The one or more feature criteria may be defined such that media items having features that satisfy the feature criteria are selected. Using features of a song as an example, the feature criteria may define, for example, one or more desired tempos, one or more desired beats-per-minute values, one or more tempos that are not desired, one or more beats-per-minute values that are not desired, or the like. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel and that have features that satisfy the one or more feature criteria.


The usage criterion may be defined such that media items played more than a threshold number of times within the one or more geographic areas of interest or more than a threshold number of times by a particular user are selected. As such, the one or more media items identified by the request processor 32 are media items played at locations within the one or more geographic areas of interest for the media channel and that satisfy the usage criterion.
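The two usage thresholds described above (total plays within the areas of interest, and plays by a single user) could be checked as in the following sketch. The tuple-based play-event representation and function name are illustrative assumptions.

```python
from collections import Counter

def satisfies_usage_criterion(play_events, item_id, area_threshold=None,
                              user_threshold=None):
    """`play_events` is a list of (item_id, user_id) tuples for plays already
    known to fall inside the geographic areas of interest. The item passes
    if it was played more than `area_threshold` times overall, or more than
    `user_threshold` times by any single user."""
    plays = [(item, user) for item, user in play_events if item == item_id]
    if area_threshold is not None and len(plays) > area_threshold:
        return True
    if user_threshold is not None:
        per_user = Counter(user for _, user in plays)
        if per_user and max(per_user.values()) > user_threshold:
            return True
    return False
```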


Once the one or more media items for the media channel are identified, the request processor 32 of the tunersphere function 26 hosted by the central server 20 sends a response to the media request to the device 14-1 of the user 16-1 (step 110). In one embodiment, the response is a streaming media channel including the one or more media items identified by the request processor 32. In another embodiment, the response is a list of media recommendations identifying the one or more media items identified by the request processor 32. Again, a media recommendation may include, for example, information identifying the recommended media item, the recommended media item, a preview of the recommended media item, a reference (e.g., URL) to the recommended media item, a reference (e.g., URL) to a preview of the recommended media item, or the like.


Note that, in one embodiment, once a media request has been issued, the request processor 32 may continually update the response. More specifically, as the play histories of the other users 16-2 through 16-N are updated, the request processor 32 may update the response provided to the device 14-1 of the user 16-1 to include any additional media items played by the other users 16-2 through 16-N that satisfy the requirements of the media channel. As an example, if the response provided by the request processor 32 is a streaming media channel, the streaming media channel may be continually or periodically updated to add media items to the media channel that have been played in the one or more geographic areas of interest for the media channel and that satisfy any additional criteria for the media channel. As another example, if the response provided by the request processor 32 is a list of media recommendations, the request processor 32 may continually or periodically update the list of media recommendations to include media recommendations for additional media items that have been played in the one or more geographic areas of interest for the media channel and that satisfy any additional criteria for the media channel.



FIG. 3 is a more detailed flow chart illustrating step 108 of FIG. 2 according to one embodiment of the present disclosure. In this example, the media channel definition includes information defining one or more geographic areas of interest, a time window, one or more user-based criteria, and one or more content-based criteria. However, as discussed above, the present disclosure is not limited thereto. First, the request processor 32 of the tunersphere function 26 hosted by the central server 20 identifies users from the other users known to the media service 12, which for this example are the other users 16-2 through 16-N, that satisfy the one or more user-based criteria (step 200). Next, the request processor 32 processes the play histories of the users that satisfy the user-based criteria to identify media items played in the one or more geographic areas of interest for the media channel during the defined time window (step 202). The request processor 32 then filters the identified media items based on the one or more content-based criteria to identify the one or more media items for the media channel (step 204).
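The three-stage flow of FIG. 3 (steps 200, 202, and 204) could be sketched as follows. The data shapes and predicate-based criteria are illustrative assumptions; the disclosure does not prescribe an implementation.

```python
def identify_channel_media(users, play_histories, user_criteria,
                           in_area_and_window, content_criteria):
    """Sketch of steps 200-204: filter users by user-based criteria, collect
    plays in the geographic areas during the time window, then filter the
    resulting media items by content-based criteria.

    `user_criteria` and `content_criteria` are predicate functions;
    `in_area_and_window` is a predicate over individual play events.
    """
    # Step 200: identify users satisfying the user-based criteria.
    eligible = [u for u in users if user_criteria(u)]
    # Step 202: media items those users played in the areas during the window.
    items = {play["item"]
             for u in eligible
             for play in play_histories.get(u, [])
             if in_area_and_window(play)}
    # Step 204: filter the identified items by the content-based criteria.
    return {item for item in items if content_criteria(item)}
```

Note that FIG. 4 performs the same work in a different order (plays first, then both sets of criteria as one filter), which this sketch could accommodate by reordering the three stages.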


Optionally, the request processor 32 may score the one or more media items identified for the media channel based on the user preferences of the requesting user, which in this example is the user 16-1 (step 206). In addition, as discussed below, the scores of the media items may further be a function of weights assigned to the one or more geographic areas of interest. The scores may then be used to prioritize the media items identified for the media channel when generating and sending the response to the device 14-1 of the user 16-1. For example, if the response is a streaming media channel, the media items may be provided in the streaming media channel in an order defined by the scores of the media items. As another example, if the response is a list of media recommendations, the scores may be provided in association with the media recommendations as an indication of an expected desirability of the media items to the user 16-1.



FIG. 4 is a more detailed flow chart illustrating step 108 of FIG. 2 according to another embodiment of the present disclosure. In this example, the media channel definition includes information defining one or more geographic areas of interest, a time window, one or more user-based criteria, and one or more content-based criteria. However, as discussed above, the present disclosure is not limited thereto. First, the request processor 32 processes the play histories of the users known to the media service 12 to identify media items played in the one or more geographic areas of interest for the media channel during the defined time window (step 300). Next, the request processor 32 of the tunersphere function 26 hosted by the central server 20 filters the identified media items based on the one or more user-based criteria and the one or more content-based criteria to identify the one or more media items for the media channel (step 302).


Optionally, the request processor 32 may score the one or more media items identified for the media channel based on the user preferences of the requesting user, which in this example is the user 16-1 (step 304). In addition, as discussed below, the scores of the media items may further be a function of weights assigned to the one or more geographic areas of interest. Again, the scores may then be used to prioritize the media items identified for the media channel when generating and sending the response to the device 14-1 of the user 16-1. For example, if the response is a streaming media channel, the media items may be provided in the streaming media channel in an order defined by the scores of the media items. As another example, if the response is a list of media recommendations, the scores may be provided in association with the media recommendations as an indication of an expected desirability of the media items to the user 16-1.



FIG. 5 illustrates an exemplary Graphical User Interface (GUI) 44 enabling a user to define one or more media channels according to one embodiment of the present disclosure. In general, the GUI 44 includes a geographic area selection tool 46. In this example, the geographic area selection tool 46 includes a map of an overall geographic area from which the user may select, which in this case is a map of the Earth. In addition, the map is segmented into a number of cells, which in this example are hexagons. The user may then select one or more geographic areas of interest for the media channel by selecting corresponding cells on the map. In this example, the user has selected four geographic areas. Specifically, the user has selected cell A as a first geographic area, cell B as a second geographic area, cell C as a third geographic area, and cell D as a fourth geographic area. Note that in this example, each of the geographic areas is defined by a single cell. However, the present disclosure is not limited thereto. Each geographic area may be defined by one or more cells.


In addition to the geographic area selection tool 46, the GUI 44 includes criteria selection tools 48 through 54 enabling the user to select additional criteria for the media channel for each of the defined geographic areas. In this example, the criteria selection tool 48 enables the user to select one or more music genres to be filtered or removed when identifying media items for the media channel. So, for instance, when identifying media items for the media channel, songs from the Pop, Reggae, Folk, Country, Bluegrass, and Hair Metal genres that were played in the geographic area defined by cell A are not selected for the media channel. However, in other embodiments, the criteria selection tools 48 through 54 may enable the user to select other types of content-based criteria or user-based criteria and/or a time window for the media channel.


The GUI 44 may also enable the user to assign weights to each of the geographic areas for the media channel. In this example, slider bars 56 through 62 enable the user to assign weights to the geographic areas defined by cells A through D, respectively. In one embodiment, the weights assigned to the geographic areas as well as user preferences of the user may be used to score the media items identified by the request processor 32 for the media channel. The media items may then be prioritized based on their scores, media items having scores less than a threshold may be filtered, or the like. In another embodiment, the weights assigned to the geographic areas may be used to determine a number of media items selected for each of the geographic areas such that more media items are selected for geographic areas having higher weights.
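The second weighting variant above, in which per-area weights determine how many media items are selected from each geographic area, could be sketched as a proportional quota allocation. The names and the handling of the rounding remainder are illustrative assumptions.

```python
def items_per_area(area_weights, total_items):
    """Allocate a per-area quota of media items in proportion to the weight
    the user assigned to each geographic area (higher weight, more items)."""
    total_weight = sum(area_weights.values())
    quotas = {area: int(total_items * w / total_weight)
              for area, w in area_weights.items()}
    # Hand any rounding remainder to the highest-weighted area.
    remainder = total_items - sum(quotas.values())
    if remainder:
        top = max(area_weights, key=area_weights.get)
        quotas[top] += remainder
    return quotas
```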



FIGS. 6A through 6I graphically illustrate the operation of the system 10 of FIG. 1 according to an exemplary embodiment of the present disclosure. FIG. 6A illustrates a GUI 64 enabling a user to define one or more media channels and receive a response from the media service regarding one or more media items identified for the one or more media channels. The GUI 64 includes a toolbar 66, which includes a number of tools 68 through 92. A home tool 68 takes the user to a base screen, which in this example is a full view of a map 69 of the Earth. More specifically, in this example, the map 69 is a globe of the Earth. A globe tool 70 enables the user to select among multiple views of the map 69 of the Earth such as a satellite view, an abstracted view, a political boundaries view, or the like. The globe tool 70 may also enable the user to save a favorite view. A hexagon tool 72 toggles a hexagon overlay on or off in order to enable the user to select desired geographic areas by selecting corresponding hexagons overlaid onto the map 69. The hexagon tool 72 may also enable the user to set a size of the hexagons relative to a current zoom level on the map 69. Note that hexagons are exemplary; other shapes may be used.


A hand tool 74 enables the user to grab and rotate the map 69, or globe, of the Earth. A hand pointer tool 76 enables the user to select desired geographic areas by, for example, selecting corresponding hexagons overlaid onto the map 69, by drawing arbitrary shapes on the map 69, or the like. A pointer tool 78 enables the user to click and drag selections between the multiple sections of the GUI 64. A magnifier tool 80 enables the user to magnify the map 69 independently from zooming in or out of the map 69 using a zoom tool 81. A shopping cart tool 82 enables the user to select discovered media items for subsequent purchase. A notes tool 84 enables the user to attach comments to media items and/or media channels. A chat tool 86 enables the user to initiate a chat session with other users of the media service 12. A transmit tool 88 enables the user to enable or disable sharing of his play history. A log-in tool 90 enables the user to login to and logout of the media service 12. A trash tool 92 is a general purpose trash function that enables the user to discard media channels, selected geographic areas in a media channel, criteria defined for a media channel, or the like.


In this example, the GUI 64 includes a first section 94, which includes the map 69 of the Earth. The GUI 64 also includes a second section 96, which enables the user to define and access a number of media channels defined by the user. More specifically, in this example, the user is currently viewing a “Rome Alternative” media channel previously defined by the user. The user may select a forward button 98 or a reverse button 100 to browse through additional media channels. Once the user has browsed to a last or first media channel, the user may select the forward button 98 or the reverse button 100, respectively, to cause the creation of a new media channel.


In this example, the “Rome Alternative” media channel has a defined geographic area 102 and a number of criteria 104. While the criteria may include a time window, one or more user-based criteria, and/or one or more content-based criteria, in this example, the criteria 104 include a number of content-based criteria which are more specifically a number of music genres. Further, for each music genre, the user has defined a corresponding weight using slider bars 106 through 126. Note that in this example, rather than selecting which music genres to include or exclude as was done in the exemplary embodiment of FIG. 5, the user is enabled to define weights for each of the music genres. Then, in order to filter media items played within the geographic area 102, the media items may be scored as a function of the weights set by the slider bars 106 through 126. Media items having scores less than a threshold may then be filtered. Alternatively, the weights assigned to the music genres may control a number or percentage of the media items identified for the media channel for each genre or a maximum number or percentage of the media items identified for the media channel for each genre.
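The genre-weight scoring and threshold filtering described above could be sketched as follows; the item representation and function name are illustrative assumptions.

```python
def filter_by_genre_weight(items, genre_weights, threshold):
    """Score each media item by the weight the user set (via slider bar) for
    its genre, and keep only items scoring at or above the threshold.

    `items` is a list of dicts with a "genre" key; unlisted genres score 0.
    Returns (item, score) pairs, highest-scoring items first.
    """
    kept = []
    for item in items:
        score = genre_weights.get(item["genre"], 0)
        if score >= threshold:
            kept.append((item, score))
    return sorted(kept, key=lambda pair: pair[1], reverse=True)
```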


The GUI 64 also includes a third section 126, which is used to display a list 128 of the one or more media items identified by the request processor 32 of the tunersphere function 26 for the selected media channel. In addition, if scores are generated for the media items, the list 128 may also include the scores of the media items. As discussed above, in one embodiment, the response from the request processor 32 of the tunersphere function 26 is a streaming media channel. As such, the list 128 may include a list of media items included in the streaming media channel. The user may be enabled to skip forward or backward in the stream by selecting corresponding media items from the list 128. If the user desires to add one of the media items to his shopping cart, the user may drag the corresponding entry from the list 128 to the shopping cart tool 82. In another embodiment, the list 128 is a list of media recommendations provided for the media channel. The user may add desired media items from the list 128 to his shopping cart using the shopping cart tool 82.



FIG. 6B illustrates a situation where the user is defining a new media channel, which in this example is a “Detroit Rock” media channel. The user has already defined weights for a number of music genres for the “Detroit Rock” media channel and is in the process of selecting a geographic area of interest for the “Detroit Rock” media channel. The user has rotated the map 69 and zoomed in on North America. The user has then activated the magnifier tool 80. Note that the magnification of the magnifier tool 80 is set independently from the zoom level set by the zoom tool 81. The magnifier tool 80 may also enable the user to set a desired view for the zoom tool 81 such as, for example, realistic (i.e., satellite), abstract, hybrid, or custom view. The user may also be enabled to set a filter to “current” or “custom.” When the filter for the magnifier tool 80 is set to “current,” then the magnifier tool 80 shows the positions of users that have played media items satisfying the current criteria for the “Detroit Rock” media channel or locations at which media items satisfying the current criteria for the “Detroit Rock” media channel have been played. Alternatively, the user may define a “custom” filter such that the magnifier tool 80 shows the positions of users that have played media items satisfying the custom filter or locations at which media items satisfying the custom filter have been played.



FIG. 6C illustrates the scenario where the user has further zoomed in on a geographic area. Then, in this example, the user activates the hexagon tool 72 such that hexagons are overlaid on the map 69 as shown in FIG. 6D. The user may then activate the hand pointer tool 76 and select one or more of the hexagons overlaid on the map 69 in order to select a desired geographic area of interest as shown in FIG. 6E. Once the geographic area of interest has been selected, the user may drag and drop the selected geographic area into the second section 96 of the GUI 64 as shown in FIG. 6F, thereby associating the selected geographic area with the “Detroit Rock” media channel. In this example, a media request is then automatically sent to the media service 12. In response, the request processor 32 identifies one or more media items for the media channel and returns a response. The response may be a streaming media channel including the identified media items, a list of recommended media items, or the like. As such, in this example, a list 130 of the identified media items is presented in the third section 126 of the GUI 64.



FIGS. 6G and 6H illustrate a scenario where the user defines a new “Rome/Detroit Rock” media channel. The user may initialize the “Rome/Detroit Rock” media channel using the “Rome Alternative” media channel definition of FIG. 6A. The user may then open a window 132 for the “Detroit Rock” media channel and add the geographic area selected for the “Detroit Rock” media channel to the “Rome/Detroit Rock” media channel using a drag and drop process. Optionally, particularly when a media channel has more than one geographic area, the user may be enabled to assign a weight to each of the geographic areas as illustrated in FIG. 6I. As discussed above, the weights assigned to the geographic areas may be used during identification and selection of media items for the media channel.


With regard to weighting and scoring, in one embodiment, weights are assigned to the geographic areas of interest, user-based criteria, and content-based criteria for the media channel. As such, the request processor 32 may identify all media items played within the one or more geographic areas of interest during any defined time window. The identified media items may then be scored as a function of the weights assigned to the geographic areas of interest, the weights assigned to the user-based criteria, the weights assigned to the content-based criteria and, optionally, the user preferences of the user. The media items having scores less than a threshold may then be filtered.
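The combined weighted scoring described above could be sketched as a weighted sum over the geographic areas an item was played in and the criteria it satisfies, followed by a threshold filter. This is an illustrative sketch only; the item representation (precomputed area and criteria sets) is an assumption.

```python
def score_media_item(item, geo_weights, criteria_weights):
    """Score one media item as a weighted sum: the weight of each geographic
    area the item was played in, plus the weight of each user-based or
    content-based criterion the item satisfies (both precomputed)."""
    geo_score = sum(geo_weights.get(area, 0) for area in item["areas"])
    criteria_score = sum(criteria_weights.get(c, 0) for c in item["criteria"])
    return geo_score + criteria_score

def filter_by_score(items, geo_weights, criteria_weights, threshold):
    """Keep only items whose combined score meets the threshold."""
    return [i for i in items
            if score_media_item(i, geo_weights, criteria_weights) >= threshold]
```

User preferences could be folded in as a further additive or multiplicative term on the score, as the paragraph above allows.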


Before proceeding, it should be noted that while the embodiments of FIGS. 1 through 6I described above focus on the use of play histories, any type of media interaction history may additionally or alternatively be used herein. As used herein, interaction with a media item refers to any type of interaction with a media item such as, for example, listening to a song, watching or viewing a video, receiving a recommendation of the media item, making a recommendation for the media item, downloading the media item, purchasing the media item, rating the media item (e.g., a Facebook® like), identification of the media item (e.g., via fingerprinting or similar audio or video identification technique), or the like. A media interaction history identifies (e.g., lists) media interacted with by a corresponding user.



FIG. 7 illustrates the operation of the media service 12 of FIG. 1 according to another embodiment of the present disclosure. In this embodiment, the media service 12, and more specifically the playback tracking function 28, obtains the media interaction histories of the users 16-1 through 16-N as described above (step 400). The media interaction histories may be obtained in any suitable manner. For example, some media interaction histories may be maintained by and obtained from a social networking service (e.g., Facebook), maintained by the media service 12, or a combination thereof. In one embodiment, the media interaction histories of the users 16-1 through 16-N are play histories of the users 16-1 through 16-N. However, the media interaction histories are not limited thereto. The media interaction histories of the users 16-1 through 16-N are more generally information providing a historical record of media items interacted with by the users 16-1 through 16-N. Again, as used herein, interaction with a media item refers to any type of interaction with the media item such as, for example, listening to the media item in the case where the media item is a song, watching or viewing the media item in the case where the media item is a video, receiving a recommendation for the media item, making a recommendation for the media item, downloading the media item, purchasing the media item, rating the media item (e.g., a Facebook® like), identifying the media item, or the like.


The media service 12 receives a map request from, in this example, the device 14-1 of the user 16-1, which is referred to as the requesting user 16-1 (step 402). The map request may be sent by, for example, the content requestor 40-1 automatically or in response to user input from the user 16-1. The map request is generally a request for representative information for a geographic area of interest. The geographic area of interest may be a specific geographic area of interest for which representative information is desired. Alternatively, the geographic area of interest may be a broad geographic area of interest that is to be sub-divided into sub-areas for purposes of determining representative information. Further, the geographic area of interest may be one of a number of predefined geographic areas of interest selected by the requesting user 16-1 or an arbitrary geographic area of interest selected by the requesting user 16-1. In addition to defining the geographic area of interest, the map request may define one or more time-based criteria (e.g., a time window of interest), one or more user-based criteria, one or more content-based criteria, or any combination thereof.


In response to the map request, the request processor 32 determines representative information for the one or more areas of interest based on at least a subset of the media interaction histories of the users 16-1 through 16-N (step 404). More specifically, in one embodiment, the geographic area of interest is a specific area of interest, and the request processor 32 processes the media interaction histories of the users 16-1 through 16-N, or some select subset thereof, to identify one or more media items that were interacted with by at least some of the users 16-1 through 16-N within the geographic area of interest and, in some embodiments, satisfy any additional criteria for the map request (e.g., one or more time-based criteria, one or more user-based criteria, and/or one or more content-based criteria). The request processor 32 then determines representative information for the geographic area of interest based on the identified media items.


In one embodiment, the representative information for the geographic area of interest includes information that identifies:

    • a most consumed (e.g., played) media item in the geographic area of interest,
    • a most frequently consumed media item in the geographic area of interest,
    • a media item having a greatest increase in consumption in the geographic area of interest over a defined period of time,
    • a media item having a greatest decrease in consumption in the geographic area of interest over a defined period of time,
    • a most consumed song in the geographic area of interest,
    • a most frequently consumed song in the geographic area of interest,
    • a song having a greatest increase in consumption in the geographic area of interest over a defined period of time,
    • a song having a greatest decrease in consumption in the geographic area of interest over a defined period of time,
    • a most consumed music album in the geographic area of interest,
    • a most frequently consumed music album in the geographic area of interest,
    • a music album having a greatest increase in consumption in the geographic area of interest over a defined period of time,
    • a music album having a greatest decrease in consumption in the geographic area of interest over a defined period of time,
    • a most consumed media genre (e.g., music genre) in the geographic area of interest,
    • a most frequently consumed media genre in the geographic area of interest,
    • a media genre having a greatest increase in consumption in the geographic area of interest over a defined period of time,
    • a media genre having a greatest decrease in consumption in the geographic area of interest over a defined period of time,
    • a most consumed music artist in the geographic area of interest,
    • a most frequently consumed music artist in the geographic area of interest,
    • a music artist having a greatest increase in consumption in the geographic area of interest over a defined period of time, or
    • a music artist having a greatest decrease in consumption in the geographic area of interest over a defined period of time.


      As used herein, a media item is consumed by a user when the user plays, listens to, or views the media item. Similarly, a music album is consumed when a user plays or listens to one or more, or in some embodiments all, of the songs forming the music album. A media genre (e.g., music genre) is consumed when a user consumes a media item in the media genre. Similarly, a music artist is consumed when a user consumes a song of the music artist.


The media item having the greatest increase in consumption in the geographic area of interest over the defined period of time may be determined by, for example, determining the number of times that a media item was consumed by the users 16-1 through 16-N within the geographic area of interest during a first time interval at the start of the defined time period and the number of times that the media item was consumed by the users 16-1 through 16-N within the geographic area of interest during a second time interval at the end of the defined time period. A difference in these two numbers (e.g., a percentage increase from the number for the first time interval to the number for the second time interval) is the increase in consumption for the media item. This process is repeated for other media items as well. Then, the media item having the greatest increase in consumption is identified. The media item having the greatest decrease in consumption in the geographic area of interest may be determined in the same manner. Likewise, the song, music album, media genre, and music artist having the greatest increase/decrease in consumption for the geographic area of interest may be determined using the same technique.
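The trend computation described above can be sketched as follows. This is an illustrative sketch only, not language from the disclosure: the event representation (a list of item identifier, timestamp, location tuples) and the `area_contains` membership test are assumptions introduced for the example.

```python
from collections import Counter

def consumption_trends(events, area_contains, t1_start, t1_end, t2_start, t2_end):
    """Return (greatest_increase_item, greatest_decrease_item).

    events        - iterable of (item_id, timestamp, location) consumption events
    area_contains - predicate testing whether a location falls within the
                    geographic area of interest
    (t1_start, t1_end) and (t2_start, t2_end) are the time intervals at the
    start and end of the defined time period.
    """
    first, second = Counter(), Counter()
    for item_id, ts, loc in events:
        if not area_contains(loc):
            continue  # only count consumption within the area of interest
        if t1_start <= ts < t1_end:
            first[item_id] += 1
        elif t2_start <= ts < t2_end:
            second[item_id] += 1
    # The difference between the two counts is the change in consumption
    # for each media item over the defined time period.
    change = {i: second[i] - first[i] for i in set(first) | set(second)}
    return max(change, key=change.get), min(change, key=change.get)
```

Because the procedure is identical for songs, albums, genres, artists, likes, downloads, purchases, identifications, and recommendation consumption, the same function applies unchanged to each of those cases by feeding it the corresponding event stream.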


Additionally or alternatively, the representative information for the geographic area of interest includes information that identifies:

    • a most followed media recommender in the geographic area of interest,
    • a most frequently followed media recommender in the geographic area of interest,
    • a media recommender having a greatest increase in recommendation consumption (e.g., recommendation receipt or purchases resulting from recommendation) in the geographic area of interest over a defined period of time, or
    • a media recommender having a greatest decrease in recommendation consumption in the geographic area of interest over a defined period of time.


      A “follower” may be, for example, a Twitter® follower or the like. A media recommender may be a person, organization, service, or the like that makes media recommendations (e.g., song recommendations). More specifically, the media interaction histories of the users 16-1 through 16-N may identify media recommendations received by the user 16-1 through 16-N and media recommenders from which the media recommendations were received. The media recommenders may be other ones of the users 16-1 through 16-N, users other than the users 16-1 through 16-N, an automated service, an organization (e.g., a business), or the like. In one embodiment, the users 16-1 through 16-N receive the media recommendations as a result of being “followers” of the corresponding media recommenders or by having otherwise subscribed to media item recommendations from those media recommenders.


The media recommender having the greatest increase in recommendation consumption in the geographic area of interest over the defined period of time may be determined by, for example, determining the number of times that media recommendations from a recommender were consumed by the users 16-1 through 16-N within the geographic area of interest during a first time interval at the start of the defined time period and the number of times that the media recommendations from the recommender were consumed by the users 16-1 through 16-N within the geographic area of interest during a second time interval at the end of the defined time period. A difference in these two numbers (e.g., a percentage increase from the number for the first time interval to the number for the second time interval) is the increase in recommendation consumption for the recommender. This process is repeated for other recommenders as well. Then, the recommender having the greatest increase in recommendation consumption is identified. The media recommender having the greatest decrease in recommendation consumption in the geographic area of interest may be determined in the same manner.


Additionally or alternatively, the representative information for the geographic area of interest includes information that identifies:

    • a most liked (e.g., liked via Facebook® “like” feature or similar feature of a similar social media application or service) media item in the geographic area of interest,
    • a most frequently liked media item in the geographic area of interest,
    • a media item having a greatest increase in likes in the geographic area of interest over a defined period of time,
    • a media item having a greatest decrease in likes in the geographic area of interest over a defined period of time,
    • a most liked song in the geographic area of interest,
    • a most frequently liked song in the geographic area of interest,
    • a song having a greatest increase in likes in the geographic area of interest over a defined period of time,
    • a song having a greatest decrease in likes in the geographic area of interest over a defined period of time,
    • a most liked music album in the geographic area of interest,
    • a most frequently liked music album in the geographic area of interest,
    • a music album having a greatest increase in likes in the geographic area of interest over a defined period of time,
    • a music album having a greatest decrease in likes in the geographic area of interest over a defined period of time,
    • a most liked media genre (e.g., music genre) in the geographic area of interest,
    • a most frequently liked media genre in the geographic area of interest,
    • a media genre having a greatest increase in likes in the geographic area of interest over a defined period of time,
    • a media genre having a greatest decrease in likes in the geographic area of interest over a defined period of time,
    • a most liked music artist in the geographic area of interest,
    • a most frequently liked music artist in the geographic area of interest,
    • a music artist having a greatest increase in likes in the geographic area of interest over a defined period of time, or
    • a music artist having a greatest decrease in likes in the geographic area of interest over a defined period of time.


The media item having the greatest increase in likes in the geographic area of interest over the defined period of time may be determined by, for example, determining the number of times that a media item was liked by the users 16-1 through 16-N within the geographic area of interest during a first time interval at the start of the defined time period and the number of times that the media item was liked by the users 16-1 through 16-N within the geographic area of interest during a second time interval at the end of the defined time period. A difference in these two numbers (e.g., a percentage increase from the number for the first time interval to the number for the second time interval) is the increase in likes for the media item. This process is repeated for other media items as well. Then, the media item having the greatest increase in likes is identified. The media item having the greatest decrease in likes in the geographic area of interest may be determined in the same manner. Likewise, the song, music album, media genre, and music artist having the greatest increase/decrease in likes for the geographic area of interest may be determined using the same technique.


In yet another embodiment, the representative information for the geographic area of interest includes information that identifies:

    • a most downloaded media item (e.g., song) in the geographic area of interest,
    • a most frequently downloaded media item in the geographic area of interest,
    • a media item having a greatest increase in downloads in the geographic area of interest over a defined period of time,
    • a media item having a greatest decrease in downloads in the geographic area of interest over a defined period of time,
    • a most downloaded music album in the geographic area of interest,
    • a most frequently downloaded music album in the geographic area of interest,
    • a music album having a greatest increase in downloads in the geographic area of interest over a defined period of time,
    • a music album having a greatest decrease in downloads in the geographic area of interest over a defined period of time,
    • a most downloaded media genre (e.g., music genre) in the geographic area of interest,
    • a most frequently downloaded media genre in the geographic area of interest,
    • a media genre having a greatest increase in downloads in the geographic area of interest over a defined period of time,
    • a media genre having a greatest decrease in downloads in the geographic area of interest over a defined period of time,
    • a most purchased media item (e.g., song) in the geographic area of interest,
    • a most frequently purchased media item in the geographic area of interest,
    • a media item having a greatest increase in purchases in the geographic area of interest over a defined period of time,
    • a media item having a greatest decrease in purchases in the geographic area of interest over a defined period of time,
    • a most purchased music album in the geographic area of interest,
    • a most frequently purchased music album in the geographic area of interest,
    • a music album having a greatest increase in purchases in the geographic area of interest over a defined period of time,
    • a music album having a greatest decrease in purchases in the geographic area of interest over a defined period of time,
    • a most purchased media genre (e.g., music genre) in the geographic area of interest,
    • a most frequently purchased media genre in the geographic area of interest,
    • a media genre having a greatest increase in purchases in the geographic area of interest over a defined period of time, or
    • a media genre having a greatest decrease in purchases in the geographic area of interest over a defined period of time.


The media item having the greatest increase in downloads (or purchases) in the geographic area of interest over the defined period of time may be determined by, for example, determining the number of times that a media item was downloaded by the users 16-1 through 16-N within the geographic area of interest during a first time interval at the start of the defined time period and the number of times that the media item was downloaded by the users 16-1 through 16-N within the geographic area of interest during a second time interval at the end of the defined time period. A difference in these two numbers (e.g., a percentage increase from the number for the first time interval to the number for the second time interval) is the increase in downloads for the media item. This process is repeated for other media items as well. Then, the media item having the greatest increase in downloads is identified. The media item having the greatest decrease in downloads in the geographic area of interest may be determined in the same manner. Likewise, the song, music album, and media genre having the greatest increase/decrease in downloads (or purchases) for the geographic area of interest may be determined using the same technique.


In yet another embodiment, the representative information for the geographic area of interest includes information that identifies:

    • a most identified media item (e.g., song) in the geographic area of interest,
    • a most frequently identified media item in the geographic area of interest,
    • a media item having a greatest increase in identifications in the geographic area of interest over a defined period of time, or
    • a media item having a greatest decrease in identifications in the geographic area of interest over a defined period of time.


      Note that media items may be identified using any suitable media item identification technique such as, for example, fingerprinting. For non-limiting examples of media item identification techniques, the interested reader is directed to U.S. Pat. No. 7,765,192, which is hereby incorporated herein by reference for its teachings related to media item identification techniques. In general, as used herein, an identification of a media item is any event wherein a media item is identified for a user. As an example, a song heard on the radio may be identified as Song X by Artist Y by an application (e.g., Shazam) running on the user's smart phone by processing one or more audio samples obtained via the smart phone's microphone.
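The identification flow can be pictured as a fingerprint lookup against an index. The sketch below is deliberately simplified and hypothetical: it hashes the raw bytes, whereas real acoustic fingerprints (such as those taught in the incorporated patent) derive perceptual features that survive noise and re-encoding, and the index layout is likewise an assumption for illustration.

```python
import hashlib

def fingerprint(audio_bytes):
    # Toy stand-in for an acoustic fingerprint: an exact hash of the raw
    # sample bytes. A production system would extract noise-robust
    # perceptual features instead.
    return hashlib.sha256(audio_bytes).hexdigest()[:16]

def identify_media_item(sample, index):
    """Look up a sample's fingerprint in an index mapping fingerprints to
    (title, artist) metadata; returns None when there is no match."""
    return index.get(fingerprint(sample))
```

Each successful lookup would then be logged as an identification event in the user's media interaction history, feeding the per-area counts described above.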


The media item having the greatest increase in identifications in the geographic area of interest over the defined period of time may be determined by, for example, determining the number of times that a media item was identified by the users 16-1 through 16-N within the geographic area of interest during a first time interval at the start of the defined time period and the number of times that the media item was identified by the users 16-1 through 16-N within the geographic area of interest during a second time interval at the end of the defined time period. A difference in these two numbers (e.g., a percentage increase from the number for the first time interval to the number for the second time interval) is the increase in identifications for the media item. This process is repeated for other media items as well. Then, the media item having the greatest increase in identifications is identified. The media item having the greatest decrease in identifications in the geographic area of interest may be determined in the same manner.


In another embodiment, the geographic area of interest is a broad geographic area of interest, and the request processor 32 divides the geographic area of interest into a number of sub-areas. The sub-areas may be, for example, of a predefined shape and size (e.g., hexagons of a predefined size). As another example, the sub-areas may be of a predefined shape but have a size that is relative to the size of the geographic area of interest (e.g., larger sub-areas for a larger geographic area of interest). Notably, the sub-areas may be of different shapes and sizes (e.g., the sub-areas may correspond to country, state, city, or other governmental entity borders). For each sub-area, the request processor 32 processes the media interaction histories of the users 16-1 through 16-N, or some select subset thereof, to identify one or more media items that were interacted with within the sub-area and, in some embodiments, satisfy any additional criteria for the map request (e.g., one or more time-based criteria, one or more user-based criteria, and/or one or more content-based criteria). The request processor 32 then determines representative information for the sub-area based on the identified media items.


In one embodiment, for each sub-area of the geographic area of interest, the representative information for the sub-area includes information that identifies:

    • a most consumed (e.g., played) media item in the sub-area,
    • a most frequently consumed media item in the sub-area,
    • a media item having a greatest increase in consumption in the sub-area over a defined period of time,
    • a media item having a greatest decrease in consumption in the sub-area over a defined period of time,
    • a most consumed song in the sub-area,
    • a most frequently consumed song in the sub-area,
    • a song having a greatest increase in consumption in the sub-area over a defined period of time,
    • a song having a greatest decrease in consumption in the sub-area over a defined period of time,
    • a most consumed music album in the sub-area,
    • a most frequently consumed music album in the sub-area,
    • a music album having a greatest increase in consumption in the sub-area over a defined period of time,
    • a music album having a greatest decrease in consumption in the sub-area over a defined period of time,
    • a most consumed media genre (e.g., music genre) in the sub-area,
    • a most frequently consumed media genre in the sub-area,
    • a media genre having a greatest increase in consumption in the sub-area over a defined period of time,
    • a media genre having a greatest decrease in consumption in the sub-area over a defined period of time,
    • a most consumed music artist in the sub-area,
    • a most frequently consumed music artist in the sub-area,
    • a music artist having a greatest increase in consumption in the sub-area over a defined period of time, or
    • a music artist having a greatest decrease in consumption in the sub-area over a defined period of time.


Additionally or alternatively, the geographic area of interest is divided into a number of sub-areas, and the representative information includes, for each sub-area, information that identifies:

    • a most followed media recommender in the sub-area,
    • a most frequently followed media recommender in the sub-area,
    • a media recommender having a greatest increase in recommendation consumption (e.g., recommendation receipt or purchases resulting from recommendation) in the sub-area over a defined period of time, or
    • a media recommender having a greatest decrease in recommendation consumption in the sub-area over a defined period of time.


Additionally or alternatively, the geographic area of interest is divided into a number of sub-areas, and the representative information includes, for each sub-area, information that identifies:

    • a most liked media item in the sub-area,
    • a most frequently liked media item in the sub-area,
    • a media item having a greatest increase in likes in the sub-area over a defined period of time,
    • a media item having a greatest decrease in likes in the sub-area over a defined period of time,
    • a most liked song in the sub-area,
    • a most frequently liked song in the sub-area,
    • a song having a greatest increase in likes in the sub-area over a defined period of time,
    • a song having a greatest decrease in likes in the sub-area over a defined period of time,
    • a most liked music album in the sub-area,
    • a most frequently liked music album in the sub-area,
    • a music album having a greatest increase in likes in the sub-area over a defined period of time,
    • a music album having a greatest decrease in likes in the sub-area over a defined period of time,
    • a most liked media genre (e.g., music genre) in the sub-area,
    • a most frequently liked media genre in the sub-area,
    • a media genre having a greatest increase in likes in the sub-area over a defined period of time,
    • a media genre having a greatest decrease in likes in the sub-area over a defined period of time,
    • a most liked music artist in the sub-area,
    • a most frequently liked music artist in the sub-area,
    • a music artist having a greatest increase in likes in the sub-area over a defined period of time, or
    • a music artist having a greatest decrease in likes in the sub-area over a defined period of time.


Additionally or alternatively, the geographic area of interest is divided into a number of sub-areas, and the representative information includes, for each sub-area, information that identifies:

    • a most downloaded media item (e.g., song) in the sub-area,
    • a most frequently downloaded media item in the sub-area,
    • a media item having a greatest increase in downloads in the sub-area over a defined period of time,
    • a media item having a greatest decrease in downloads in the sub-area over a defined period of time,
    • a most downloaded music album in the sub-area,
    • a most frequently downloaded music album in the sub-area,
    • a music album having a greatest increase in downloads in the sub-area over a defined period of time,
    • a music album having a greatest decrease in downloads in the sub-area over a defined period of time,
    • a most downloaded media genre (e.g., music genre) in the sub-area,
    • a most frequently downloaded media genre in the sub-area,
    • a media genre having a greatest increase in downloads in the sub-area over a defined period of time,
    • a media genre having a greatest decrease in downloads in the sub-area over a defined period of time,
    • a most purchased media item (e.g., song) in the sub-area,
    • a most frequently purchased media item in the sub-area,
    • a media item having a greatest increase in purchases in the sub-area over a defined period of time,
    • a media item having a greatest decrease in purchases in the sub-area over a defined period of time,
    • a most purchased music album in the sub-area,
    • a most frequently purchased music album in the sub-area,
    • a music album having a greatest increase in purchases in the sub-area over a defined period of time,
    • a music album having a greatest decrease in purchases in the sub-area over a defined period of time,
    • a most purchased media genre (e.g., music genre) in the sub-area,
    • a most frequently purchased media genre in the sub-area,
    • a media genre having a greatest increase in purchases in the sub-area over a defined period of time, or
    • a media genre having a greatest decrease in purchases in the sub-area over a defined period of time.


Additionally or alternatively, the geographic area of interest is divided into a number of sub-areas, and the representative information includes, for each sub-area, information that identifies:

    • a most identified media item (e.g., song) in the sub-area,
    • a most frequently identified media item in the sub-area,
    • a media item having a greatest increase in identifications in the sub-area over a defined period of time, or
    • a media item having a greatest decrease in identifications in the sub-area over a defined period of time.


Once the representative information is determined, the request processor 32 effects display of the representative information (step 406). In one embodiment, the request processor 32 effects display of the representative information at the device 14-1 of the requesting user 16-1. More specifically, the request processor 32 sends the representative information to the device 14-1 of the requesting user 16-1. The device 14-1 then displays the representative information to the requesting user 16-1.


In this exemplary embodiment, the media service 12 receives a request to change or modify the geographic area of interest from the content requestor 40-1 of the user 16-1 (step 408). For example, the request may be a zoom request to either zoom in or zoom out on the geographic area of interest. As another example, the request may be a request to change the geographic area of interest to a new geographic area of interest. In response, the media service 12 changes or modifies the geographic area of interest (step 410), and the process then returns to step 404.
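The loop of steps 404 through 410 can be outlined as follows. The callables here are hypothetical stand-ins for the request processor and the device display, introduced only to show the control flow; they are not interfaces named in the disclosure.

```python
def run_map_loop(get_representative_info, display, changes, area):
    """Sketch of steps 404-410: determine and display representative
    information for the current geographic area of interest, then apply
    each change/zoom request in turn and repeat.

    get_representative_info(area) -> displayable info (step 404)
    display(info)                 -> effects display    (step 406)
    changes                       -> iterable of area transforms
                                     (zoom in/out, new area; steps 408-410)
    """
    display(get_representative_info(area))      # steps 404-406
    for change in changes:                      # step 408: change request
        area = change(area)                     # step 410: modify the area
        display(get_representative_info(area))  # back to step 404
    return area
```

A zoom-in request would be one such transform, shrinking the area to the selected sub-area before the representative information is recomputed.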



FIG. 8 illustrates an exemplary GUI 134 for displaying representative information to a user such as, for example, the requesting user 16-1 according to one embodiment of the present disclosure. As illustrated, in this example, the GUI 134 includes a map display area 136, and a map request with a geographic area of interest corresponding to a geographic area displayed in the map display area 136 was received by the media service 12. In response, the media service 12 divided the geographic area of interest into a number of sub-areas 138 and determined representative information for each of the sub-areas 138. In this example, the representative information for each of the sub-areas 138 is album art for a most consumed music album in that sub-area. However, the representative information is not limited thereto.


The GUI 134 may enable the user to select one of the sub-areas 138 to zoom in on that sub-area 138. In response to the selection of one of the sub-areas 138, a request to zoom in on that sub-area is sent to the media service 12. The media service 12 then determines representative information for each of a number of sub-areas of the selected sub-area. The GUI 134 is then updated to zoom in on the selected sub-area 138 to show its sub-areas and their representative information. The GUI 134 may also enable the user to rotate or otherwise manipulate the map display area 136. For instance, in this example, the map display area 136 includes a globe-shaped object representing the spherical shape of the earth. The GUI 134 may enable the user to rotate the globe-shaped object to view representative information for additional areas, which may be processed as sub-areas within the geographic area of interest or sub-areas within a new geographic area of interest. Still further, the GUI 134 may also enable the user to select the representative information for a desired sub-area 138 to initiate playback, downloading, purchasing, or the like of one or more corresponding media items.



FIG. 9 illustrates an exemplary GUI 140 for displaying representative information to a user such as, for example, the requesting user 16-1 according to one embodiment of the present disclosure. As illustrated, in this example, the GUI 140 includes a map display area 142, and a map request with a geographic area of interest corresponding to a geographic area displayed in the map display area 142 was received by the media service 12. In this example, the geographic area of interest is divided into a number of states, which serve as sub-areas of the geographic area of interest. The media service 12 determined representative information for each of the states. In this example, the representative information for each of the states is provided as icons 144 representing the most followed media recommenders in the corresponding states/sub-areas. However, the representative information is not limited thereto.


The GUI 140 may enable the user to select one of the sub-areas to zoom in on that sub-area. In response to the selection of one of the sub-areas, a request to zoom in on that sub-area is sent to the media service 12. The media service 12 then determines representative information for each of a number of sub-areas of the selected sub-area. The GUI 140 is then updated to zoom in on the selected sub-area to show its sub-areas and their representative information. The GUI 140 may also enable the user to manipulate the map display area 142. Still further, the GUI 140 may also enable the user to select the representative information for a desired state/sub-area to initiate playback, downloading, purchasing, or the like of one or more corresponding media items.



FIG. 10 is a block diagram of the device 14-1 of FIG. 1 according to one embodiment of the present disclosure. This discussion is equally applicable to the other devices 14-2 through 14-N. In general, the device 14-1 includes a control system 146 having associated memory 148. In this example, the media player 36-1 and the content requestor 40-1 are each implemented in software and stored in the memory 148. However, the present disclosure is not limited thereto. Each of the media player 36-1 and the content requestor 40-1 may be implemented in software, hardware, or a combination thereof. In this example, the location determination function 38-1 is implemented in hardware and connected to the control system 146. For example, the location determination function 38-1 may be a GPS receiver. However, the present disclosure is not limited thereto. The location determination function 38-1 may be implemented in software, hardware, or a combination thereof. The device 14-1 may also include one or more digital storage devices 150 such as, for example, one or more hard disk drives, one or more internal or removable memory units, or the like. The media collection 42-1 (FIG. 1) may be stored in the one or more digital storage devices 150, the memory 148, or a combination thereof. The device 14-1 also includes a communication interface 152 enabling the device 14-1 to connect to the network 18 (FIG. 1). Lastly, the device 14-1 also includes a user interface 154 including components such as, for example, a display, one or more user input devices, a speaker, or the like.



FIG. 11 is a block diagram of the central server 20 of FIG. 1 according to one embodiment of the present disclosure. In general, the central server 20 includes a control system 156 having associated memory 158. In this example, the tunersphere function 26 is implemented in software and stored in the memory 158. However, the present disclosure is not limited thereto. The tunersphere function 26 may be implemented in software, hardware, or a combination thereof. The central server 20 may also include one or more digital storage devices 160 such as, for example, one or more hard disk drives. In one embodiment, the user account repository 22 and/or the content repository 24 (FIG. 1) are stored in the one or more digital storage devices 160. The central server 20 also includes a communication interface 162 communicatively coupling the central server 20 to the network 18 (FIG. 1). Lastly, the central server 20 may include a user interface 164, which may include components such as, for example, a display, one or more user input devices, or the like.


Those skilled in the art will recognize improvements and modifications to the preferred embodiments of the present disclosure. All such improvements and modifications are considered within the scope of the concepts disclosed herein and the claims that follow.

Claims
  • 1. A method for generating a media recommendation on a server comprising at least one processor and memory containing software executable by the at least one processor, the method comprising, by the server: receiving information about a user associated with a requesting device; identifying profile information based on the information about the user; receiving a media recommendation request from the requesting device, the media recommendation request comprising seed information comprising information identifying a media item; determining at least one related media item based on at least the information identifying the media item and the profile information; and providing information identifying the at least one related media item to the requesting device.
  • 2. The method of claim 1 wherein the information identifying the media item comprises a media fingerprint.
  • 3. The method of claim 1 further comprising receiving information from a social network service based on the information about the user, wherein determining the at least one related media item is further based on the information received from the social network service.
  • 4. The method of claim 1 wherein the media recommendation request further comprises a current location of the requesting device, and determining the at least one related media item is further based on the current location of the requesting device.
  • 5. The method of claim 1 wherein the at least one related media item is one of a digital image, a slideshow, an audio book, a textual presentation, a video presentation, and an audio presentation.
  • 6. The method of claim 5 wherein the video presentation is one of a movie, a television program, and a music video.
  • 7. The method of claim 5 wherein the audio presentation is a song.
  • 8. The method of claim 5 wherein the textual presentation is a digital book.
  • 9. The method of claim 1 wherein the requesting device is one of a personal computer, a laptop computer, a mobile telephone, a portable media player, and a PDA.
  • 10. The method of claim 1 wherein the information identifying the at least one related media item further comprises information chosen from the group consisting of: an identifier of the at least one related media item, a title of the at least one related media item, a Uniform Resource Locator (URL) enabling other devices to obtain the at least one related media item through downloading, a Uniform Resource Locator (URL) enabling other devices to obtain the at least one related media item through streaming, a Uniform Resource Locator (URL) enabling other devices to purchase the at least one related media item from an e-commerce service, a Uniform Resource Locator (URL) enabling other devices to obtain a preview of the at least one related media item, and metadata describing the at least one related media item.
  • 11. A non-transitory computer readable medium storing software for instructing a controller of a computing device to: receive information about a user associated with a requesting device; identify profile information based on the information about the user; receive a media recommendation request from the requesting device, the media recommendation request comprising seed information comprising information identifying a media item; determine at least one related media item based on at least the information identifying the media item and the profile information; and provide information identifying the at least one related media item to the requesting device.
  • 12. The non-transitory computer readable medium of claim 11 wherein the information identifying the media item comprises a media fingerprint.
  • 13. The non-transitory computer readable medium of claim 12 wherein the requesting device is one of a personal computer, a laptop computer, a mobile telephone, a portable media player, and a PDA.
  • 14. The non-transitory computer readable medium of claim 12 wherein the information identifying the at least one related media item further comprises information chosen from the group consisting of: an identifier of the at least one related media item, a title of the at least one related media item, a Uniform Resource Locator (URL) enabling other devices to obtain the at least one related media item through downloading, a Uniform Resource Locator (URL) enabling other devices to obtain the at least one related media item through streaming, a Uniform Resource Locator (URL) enabling other devices to purchase the at least one related media item from an e-commerce service, a Uniform Resource Locator (URL) enabling other devices to obtain a preview of the at least one related media item, and metadata describing the at least one related media item.
  • 15. The non-transitory computer readable medium of claim 11 wherein the software further instructs the controller to: receive information from a social network service based on the information about the user; and wherein determining a related media item is further based on the information received from the social network service.
  • 16. The non-transitory computer readable medium of claim 11 wherein the media recommendation request further comprises information identifying a current location of the requesting device and determining the at least one related media item is further based on the information identifying the current location of the requesting device.
  • 17. The non-transitory computer readable medium of claim 11 wherein the at least one related media item is one of a digital image, a slideshow, an audio book, a textual presentation, a video presentation, and an audio presentation.
  • 18. The non-transitory computer readable medium of claim 17 wherein the video presentation is one of a movie, a television program, and a music video.
  • 19. The non-transitory computer readable medium of claim 17 wherein the textual presentation is a digital book.
  • 20. The non-transitory computer readable medium of claim 17 wherein the audio presentation is a song.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/658,551, entitled “System For Generating Media Recommendations In A Distributed Environment Based On Seed Information,” which was filed on Mar. 16, 2015; which is a continuation of U.S. patent application Ser. No. 14/488,456, entitled TUNERSPHERE, which was filed on Sep. 17, 2014; which is a continuation of U.S. patent application Ser. No. 14/069,761, entitled TUNERSPHERE, which was filed on Nov. 1, 2013, now U.S. Pat. No. 8,874,554; which is a continuation of U.S. patent application Ser. No. 13/655,648, entitled TUNERSPHERE, which was filed on Oct. 19, 2012, now U.S. Pat. No. 8,577,874; which is a continuation of U.S. patent application Ser. No. 13/228,688, entitled TUNERSPHERE, which was filed on Sep. 9, 2011, now U.S. Pat. No. 8,316,015; which is a Continuation-in-Part (CIP) of U.S. patent application Ser. No. 12/192,682, entitled TUNERSPHERE, which was filed on Aug. 15, 2008, now U.S. Pat. No. 8,117,193; which is a Continuation-in-Part (CIP) of U.S. patent application Ser. No. 11/963,050, entitled METHOD AND SYSTEM FOR GENERATING MEDIA RECOMMENDATIONS IN A DISTRIBUTED ENVIRONMENT BASED ON TAGGING PLAY HISTORY INFORMATION WITH LOCATION INFORMATION, which was filed on Dec. 21, 2007, now U.S. Pat. No. 8,060,525; all of which are hereby incorporated herein by reference in their entireties.

US Referenced Citations (671)
Number Name Date Kind
3291919 Robitaille Dec 1966 A
4682370 Matthews Jul 1987 A
4720873 Goodman et al. Jan 1988 A
4788675 Jones et al. Nov 1988 A
4870579 Hey Sep 1989 A
4920432 Eggers et al. Apr 1990 A
5119188 McCalley et al. Jun 1992 A
5127003 Doll, Jr. et al. Jun 1992 A
5129036 Dean et al. Jul 1992 A
5132992 Yurt et al. Jul 1992 A
5134719 Mankovitz Jul 1992 A
5168481 Culbertson et al. Dec 1992 A
5305438 MacKay et al. Apr 1994 A
5351276 Doll, Jr. et al. Sep 1994 A
5396417 Burks et al. Mar 1995 A
5414455 Hooper et al. May 1995 A
5442701 Guillou et al. Aug 1995 A
5455570 Cook et al. Oct 1995 A
5526284 Mankovitz Jun 1996 A
5539635 Larson, Jr. Jul 1996 A
5557541 Schulhof et al. Sep 1996 A
5572442 Schulhof et al. Nov 1996 A
5592511 Schoen et al. Jan 1997 A
5617565 Augenbraun et al. Apr 1997 A
5621456 Florin et al. Apr 1997 A
5621546 Klassen et al. Apr 1997 A
5629867 Goldman May 1997 A
5706427 Tabuki Jan 1998 A
5721827 Logan et al. Feb 1998 A
5732216 Logan et al. Mar 1998 A
5734893 Li et al. Mar 1998 A
5758257 Herz et al. May 1998 A
5771778 MacLean, IV Jun 1998 A
5790935 Payton Aug 1998 A
5809246 Goldman Sep 1998 A
5815662 Ong Sep 1998 A
5818510 Cobbley et al. Oct 1998 A
5819160 Foladare et al. Oct 1998 A
5819273 Vora et al. Oct 1998 A
5852610 Olaniyan Dec 1998 A
5855015 Shoham Dec 1998 A
5857149 Suzuki Jan 1999 A
5864682 Porter et al. Jan 1999 A
5907831 Lotvin et al. May 1999 A
5920856 Syeda-Mahmood Jul 1999 A
5926624 Katz et al. Jul 1999 A
5943422 Van Wie et al. Aug 1999 A
5949492 Mankovitz Sep 1999 A
5953005 Liu Sep 1999 A
5956027 Krishnamurthy Sep 1999 A
5956716 Kenner et al. Sep 1999 A
5960437 Krawchuk et al. Sep 1999 A
5963916 Kaplan Oct 1999 A
5973724 Riddle Oct 1999 A
5974441 Rogers et al. Oct 1999 A
5983200 Slotznick Nov 1999 A
5983218 Syeda-Mahmood Nov 1999 A
5986692 Logan et al. Nov 1999 A
6006225 Bowman et al. Dec 1999 A
6009422 Ciccarelli Dec 1999 A
6014569 Bottum Jan 2000 A
6018768 Ullman et al. Jan 2000 A
6029165 Gable Feb 2000 A
6038591 Wolfe et al. Mar 2000 A
6060997 Taubenheim et al. May 2000 A
6067562 Goldman May 2000 A
6081780 Lumelsky Jun 2000 A
6081830 Schindler Jun 2000 A
6088455 Logan et al. Jul 2000 A
6088722 Herz et al. Jul 2000 A
6093880 Arnalds Jul 2000 A
6108686 Williams, Jr. Aug 2000 A
6122757 Kelley Sep 2000 A
6125387 Simonoff et al. Sep 2000 A
6128663 Thomas Oct 2000 A
6134552 Fritz et al. Oct 2000 A
6144375 Jain et al. Nov 2000 A
6161142 Wolfe et al. Dec 2000 A
6167393 Davis, III et al. Dec 2000 A
6169573 Sampath-Kumar et al. Jan 2001 B1
6182128 Kelkar et al. Jan 2001 B1
6195657 Rucker et al. Feb 2001 B1
6199076 Logan et al. Mar 2001 B1
6223210 Hickey Apr 2001 B1
6226672 DeMartin et al. May 2001 B1
6229621 Kulakowski et al. May 2001 B1
6233682 Fritsch May 2001 B1
6246672 Lumelsky Jun 2001 B1
6248946 Dwek Jun 2001 B1
6253069 Mankovitz Jun 2001 B1
6263507 Ahmad et al. Jul 2001 B1
6266649 Linden et al. Jul 2001 B1
6289165 Abecassis Sep 2001 B1
6292786 Deaton et al. Sep 2001 B1
6295555 Goldman Sep 2001 B1
6311194 Sheth et al. Oct 2001 B1
6314094 Boys Nov 2001 B1
6314420 Lang et al. Nov 2001 B1
6317722 Jacobi et al. Nov 2001 B1
6317784 Mackintosh et al. Nov 2001 B1
6334127 Bieganski et al. Dec 2001 B1
6335927 Elliott et al. Jan 2002 B1
6338044 Cook et al. Jan 2002 B1
6339693 Chan Jan 2002 B1
6344607 Cliff Feb 2002 B2
6345289 Lotspiech et al. Feb 2002 B1
6349329 Mackintosh et al. Feb 2002 B1
6349339 Williams Feb 2002 B1
6351733 Saunders et al. Feb 2002 B1
6353823 Kumar Mar 2002 B1
6377782 Bishop et al. Apr 2002 B1
6385596 Wiser et al. May 2002 B1
6388714 Schein et al. May 2002 B1
6389467 Eyal May 2002 B1
6411992 Srinivasan et al. Jun 2002 B1
6415282 Mukherjea et al. Jul 2002 B1
6438579 Hosken Aug 2002 B1
6438759 Jaunault et al. Aug 2002 B1
6473792 Yavitz et al. Oct 2002 B1
6477707 King et al. Nov 2002 B1
6484199 Eyal Nov 2002 B2
6487390 Virine et al. Nov 2002 B1
6496802 van Zoest et al. Dec 2002 B1
6498955 McCarthy et al. Dec 2002 B1
6502194 Berman et al. Dec 2002 B1
6505123 Root et al. Jan 2003 B1
6519648 Eyal Feb 2003 B1
6526411 Ward Feb 2003 B1
6546555 Hjelsvold et al. Apr 2003 B1
6560651 Katz et al. May 2003 B2
6567797 Schuetze et al. May 2003 B1
6581103 Dengler Jun 2003 B1
6587127 Leeke et al. Jul 2003 B1
6587850 Zhai Jul 2003 B2
6600898 De Bonet et al. Jul 2003 B1
6609096 De Bonet et al. Aug 2003 B1
6609253 Swix et al. Aug 2003 B1
6615039 Eldering Sep 2003 B1
6615208 Behrens et al. Sep 2003 B1
6628928 Crosby et al. Sep 2003 B1
6629104 Parulski et al. Sep 2003 B1
6636836 Pyo Oct 2003 B1
6654786 Fox et al. Nov 2003 B1
6662231 Drosset et al. Dec 2003 B1
6670537 Hughes et al. Dec 2003 B2
6684249 Frerichs et al. Jan 2004 B1
6694482 Arellano et al. Feb 2004 B1
6697824 Bowman-Amuah Feb 2004 B1
6701355 Brandt et al. Mar 2004 B1
6711622 Fuller et al. Mar 2004 B1
6721741 Eyal et al. Apr 2004 B1
6725275 Eyal Apr 2004 B2
6735628 Eyal May 2004 B2
6741869 Lehr May 2004 B1
6748237 Bates et al. Jun 2004 B1
6757517 Chang Jun 2004 B2
6757691 Welsh et al. Jun 2004 B1
6772127 Saunders et al. Aug 2004 B2
6792470 Hakenberg et al. Sep 2004 B2
6793142 Yap Sep 2004 B2
6801909 Delgado et al. Oct 2004 B2
6823225 Sass Nov 2004 B1
6865565 Rainsberger et al. Mar 2005 B2
6879963 Rosenberg Apr 2005 B1
6882641 Gallick et al. Apr 2005 B1
6904264 Frantz Jun 2005 B1
6912528 Homer Jun 2005 B2
6925489 Curtin Aug 2005 B1
6941275 Swierczek Sep 2005 B1
6941324 Plastina et al. Sep 2005 B2
6947922 Glance Sep 2005 B1
6973475 Kenyon et al. Dec 2005 B2
6976228 Bernhardson Dec 2005 B2
6981040 Konig et al. Dec 2005 B1
6985694 De Bonet et al. Jan 2006 B1
6986136 Simpson et al. Jan 2006 B2
6987221 Platt Jan 2006 B2
6990453 Wang et al. Jan 2006 B2
6999783 Toyryia et al. Feb 2006 B2
7010263 Patsiokas Mar 2006 B1
7010537 Eyal et al. Mar 2006 B2
7010613 Connor Mar 2006 B2
7013301 Holm et al. Mar 2006 B2
7028082 Rosenberg et al. Apr 2006 B1
7031931 Meyers Apr 2006 B1
7035871 Hunt et al. Apr 2006 B2
7047406 Schleicher et al. May 2006 B2
7058694 De Bonet et al. Jun 2006 B1
7061482 Ferris Jun 2006 B2
7072309 Xie et al. Jul 2006 B2
7072846 Robinson Jul 2006 B1
7072886 Salmenkaita et al. Jul 2006 B2
7075000 Gang et al. Jul 2006 B2
7076553 Chan et al. Jul 2006 B2
7079807 Daum et al. Jul 2006 B1
7089248 King et al. Aug 2006 B1
7096234 Plastina et al. Aug 2006 B2
7102067 Gang et al. Sep 2006 B2
7120619 Drucker et al. Oct 2006 B2
7130608 Hollstrom et al. Oct 2006 B2
7133924 Rosenberg et al. Nov 2006 B1
7139757 Apollonsky et al. Nov 2006 B1
7139770 Nakase et al. Nov 2006 B2
7145678 Simpson et al. Dec 2006 B2
7149961 Harville et al. Dec 2006 B2
7149983 Robertson et al. Dec 2006 B1
7171174 Ellis et al. Jan 2007 B2
7171491 O'Toole et al. Jan 2007 B1
7177872 Schwesig et al. Feb 2007 B2
7203838 Glazer et al. Apr 2007 B1
7206838 Boyd et al. Apr 2007 B2
7219145 Chmaytelli et al. May 2007 B2
7222187 Yeager et al. May 2007 B2
7227071 Tagawa et al. Jun 2007 B2
7240358 Horn et al. Jul 2007 B2
7245925 Zellner Jul 2007 B2
7277955 Elliott Oct 2007 B2
7283992 Liu et al. Oct 2007 B2
7296032 Beddow Nov 2007 B1
7305449 Simpson et al. Dec 2007 B2
7340481 Baer et al. Mar 2008 B1
7343141 Ellis et al. Mar 2008 B2
7437364 Fredricksen et al. Oct 2008 B1
7441041 Williams et al. Oct 2008 B2
7444339 Matsuda et al. Oct 2008 B2
7463890 Herz et al. Dec 2008 B2
7468934 Janik Dec 2008 B1
7469283 Eyal et al. Dec 2008 B2
7496623 Szeto et al. Feb 2009 B2
7509291 McBride et al. Mar 2009 B2
7512658 Brown et al. Mar 2009 B2
7523156 Giacalone, Jr. Apr 2009 B2
7548915 Ramer et al. Jun 2009 B2
7577665 Ramer et al. Aug 2009 B2
7590546 Chuang Sep 2009 B2
7594246 Billmaier et al. Sep 2009 B1
7614006 Molander Nov 2009 B2
7623843 Squibbs Nov 2009 B2
7627644 Slack-Smith Dec 2009 B2
7644166 Appelman et al. Jan 2010 B2
7653654 Sundaresan Jan 2010 B1
7676753 Bedingfield Mar 2010 B2
7680699 Porter et al. Mar 2010 B2
7680959 Svendsen Mar 2010 B2
7711838 Boulter et al. May 2010 B1
7720871 Rogers et al. May 2010 B2
7725494 Rogers et al. May 2010 B2
7730216 Issa et al. Jun 2010 B1
7751773 Linden Jul 2010 B2
7761399 Evans Jul 2010 B2
7765192 Svendsen Jul 2010 B2
7783722 Rosenberg et al. Aug 2010 B1
7797272 Picker et al. Sep 2010 B2
7797321 Martin et al. Sep 2010 B2
7805129 Issa et al. Sep 2010 B1
7827110 Wieder Nov 2010 B1
7827236 Ferris Nov 2010 B2
7840691 De Bonet et al. Nov 2010 B1
7853622 Baluja et al. Dec 2010 B1
7856485 Prager et al. Dec 2010 B2
7865522 Purdy et al. Jan 2011 B2
7870088 Chen et al. Jan 2011 B1
7886072 Wormington et al. Feb 2011 B2
7904505 Rakers et al. Mar 2011 B2
7917645 Ikezoye et al. Mar 2011 B2
7917932 Krikorian Mar 2011 B2
7926085 Del Beccaro et al. Apr 2011 B2
7970922 Svendsen Jun 2011 B2
8045952 Qureshey et al. Oct 2011 B2
8050652 Qureshey et al. Nov 2011 B2
8060525 Svendsen Nov 2011 B2
8112720 Curtis Feb 2012 B2
8117193 Svendsen Feb 2012 B2
8275764 Jeon Sep 2012 B2
8316015 Svendsen Nov 2012 B2
8577874 Svendsen Nov 2013 B2
8983937 Svendsen Mar 2015 B2
9275138 Svendsen Mar 2016 B2
20010013009 Greening et al. Aug 2001 A1
20010021914 Jacobi et al. Sep 2001 A1
20010025259 Rouchon Sep 2001 A1
20010051852 Sundaravel et al. Dec 2001 A1
20020002039 Qureshey et al. Jan 2002 A1
20020010759 Hitson et al. Jan 2002 A1
20020023084 Eyal et al. Feb 2002 A1
20020023270 Thomas et al. Feb 2002 A1
20020035616 Diamond et al. Mar 2002 A1
20020052207 Hunzinger May 2002 A1
20020052674 Chang et al. May 2002 A1
20020052873 Delgado et al. May 2002 A1
20020053078 Holtz et al. May 2002 A1
20020072326 Qureshey et al. Jun 2002 A1
20020082901 Dunning et al. Jun 2002 A1
20020087382 Tiburcio Jul 2002 A1
20020103796 Hartley Aug 2002 A1
20020108112 Wallace et al. Aug 2002 A1
20020116082 Gudorf Aug 2002 A1
20020116476 Eyal et al. Aug 2002 A1
20020116533 Holliman et al. Aug 2002 A1
20020138836 Zimmerman Sep 2002 A1
20020161858 Goldman Oct 2002 A1
20020165793 Brand et al. Nov 2002 A1
20020165912 Wenocur et al. Nov 2002 A1
20020178057 Bertram et al. Nov 2002 A1
20020183059 Noreen et al. Dec 2002 A1
20020194325 Chmaytelli et al. Dec 2002 A1
20020194356 Chan et al. Dec 2002 A1
20020199001 Wenocur et al. Dec 2002 A1
20030001907 Bergsten et al. Jan 2003 A1
20030005074 Herz et al. Jan 2003 A1
20030014407 Blatter et al. Jan 2003 A1
20030018799 Eyal Jan 2003 A1
20030033420 Eyal et al. Feb 2003 A1
20030041110 Wenocur et al. Feb 2003 A1
20030046399 Boulter et al. Mar 2003 A1
20030055516 Gang et al. Mar 2003 A1
20030055657 Yoshida et al. Mar 2003 A1
20030066068 Gutta et al. Apr 2003 A1
20030069806 Konomi Apr 2003 A1
20030084044 Simpson et al. May 2003 A1
20030084086 Simpson et al. May 2003 A1
20030088479 Wooten et al. May 2003 A1
20030089218 Gang et al. May 2003 A1
20030097186 Gutta et al. May 2003 A1
20030103644 Klayh Jun 2003 A1
20030115167 Sharif et al. Jun 2003 A1
20030135513 Quinn et al. Jul 2003 A1
20030137531 Katinsky et al. Jul 2003 A1
20030149581 Chaudhri et al. Aug 2003 A1
20030149612 Berghofer et al. Aug 2003 A1
20030153338 Herz et al. Aug 2003 A1
20030160770 Zimmerman Aug 2003 A1
20030191753 Hoch Oct 2003 A1
20030229537 Dunning et al. Dec 2003 A1
20030232614 Squibbs Dec 2003 A1
20030236582 Zamir et al. Dec 2003 A1
20030237093 Marsh Dec 2003 A1
20040003392 Trajkovic et al. Jan 2004 A1
20040006634 Ferris Jan 2004 A1
20040019497 Volk et al. Jan 2004 A1
20040034441 Eaton et al. Feb 2004 A1
20040073919 Gutta Apr 2004 A1
20040088271 Cleckler May 2004 A1
20040091235 Gutta May 2004 A1
20040107821 Alcalde et al. Jun 2004 A1
20040128286 Yasushi et al. Jul 2004 A1
20040133657 Smith et al. Jul 2004 A1
20040133908 Smith et al. Jul 2004 A1
20040133914 Smith et al. Jul 2004 A1
20040162783 Gross Aug 2004 A1
20040162830 Shirwadkar et al. Aug 2004 A1
20040181540 Jung et al. Sep 2004 A1
20040186733 Loomis et al. Sep 2004 A1
20040199494 Bhatt Oct 2004 A1
20040199527 Morain et al. Oct 2004 A1
20040215793 Ryan et al. Oct 2004 A1
20040216108 Robbin Oct 2004 A1
20040224638 Fadell et al. Nov 2004 A1
20040252604 Johnson et al. Dec 2004 A1
20040254911 Grasso et al. Dec 2004 A1
20040255340 Logan Dec 2004 A1
20040260778 Banister et al. Dec 2004 A1
20040267604 Gross Dec 2004 A1
20050020223 Ellis et al. Jan 2005 A1
20050021420 Michelitsch et al. Jan 2005 A1
20050021470 Martin et al. Jan 2005 A1
20050021678 Simyon et al. Jan 2005 A1
20050022239 Meuleman Jan 2005 A1
20050026559 Khedouri Feb 2005 A1
20050038819 Hicken et al. Feb 2005 A1
20050038876 Chaudhuri Feb 2005 A1
20050044561 McDonald Feb 2005 A1
20050060264 Schrock et al. Mar 2005 A1
20050060666 Hoshino et al. Mar 2005 A1
20050065976 Holm et al. Mar 2005 A1
20050066350 Meuleman Mar 2005 A1
20050071418 Kjellberg et al. Mar 2005 A1
20050091107 Blum Apr 2005 A1
20050120053 Watson Jun 2005 A1
20050125221 Brown et al. Jun 2005 A1
20050125222 Brown et al. Jun 2005 A1
20050131866 Badros Jun 2005 A1
20050138198 May Jun 2005 A1
20050154608 Paulson et al. Jul 2005 A1
20050154764 Riegler et al. Jul 2005 A1
20050154767 Sako Jul 2005 A1
20050158028 Koba Jul 2005 A1
20050166245 Shin et al. Jul 2005 A1
20050197961 Miller et al. Sep 2005 A1
20050198233 Manchester et al. Sep 2005 A1
20050228830 Plastina et al. Oct 2005 A1
20050246391 Gross Nov 2005 A1
20050251455 Boesen Nov 2005 A1
20050251807 Weel Nov 2005 A1
20050256756 Lam et al. Nov 2005 A1
20050256866 Lu et al. Nov 2005 A1
20050267944 Little, II Dec 2005 A1
20050278377 Mirrashidi et al. Dec 2005 A1
20050278380 Ferris Dec 2005 A1
20050278758 Bodleander Dec 2005 A1
20050286546 Bassoli et al. Dec 2005 A1
20050289236 Hull et al. Dec 2005 A1
20060004640 Swierczek Jan 2006 A1
20060004704 Gross Jan 2006 A1
20060008256 Khedouri et al. Jan 2006 A1
20060010167 Grace et al. Jan 2006 A1
20060015378 Mirrashidi et al. Jan 2006 A1
20060020662 Robinson Jan 2006 A1
20060026048 Kolawa et al. Feb 2006 A1
20060026147 Cone et al. Feb 2006 A1
20060048059 Etkin Mar 2006 A1
20060053080 Edmonson et al. Mar 2006 A1
20060064151 Guterman et al. Mar 2006 A1
20060064716 Sull et al. Mar 2006 A1
20060072724 Cohen et al. Apr 2006 A1
20060074750 Clark et al. Apr 2006 A1
20060083119 Hayes Apr 2006 A1
20060085349 Hug Apr 2006 A1
20060085383 Mantle et al. Apr 2006 A1
20060095339 Hayashi et al. May 2006 A1
20060100924 Tevanian, Jr. May 2006 A1
20060101003 Carson et al. May 2006 A1
20060126135 Stevens et al. Jun 2006 A1
20060130120 Brandyberry et al. Jun 2006 A1
20060143236 Wu Jun 2006 A1
20060156242 Bedingfield Jul 2006 A1
20060167576 Rosenberg Jul 2006 A1
20060167991 Heikes et al. Jul 2006 A1
20060171395 Deshpande Aug 2006 A1
20060173910 McLaughlin Aug 2006 A1
20060174277 Sezan et al. Aug 2006 A1
20060190616 Mayerhofer et al. Aug 2006 A1
20060195442 Cone et al. Aug 2006 A1
20060195479 Spiegelman et al. Aug 2006 A1
20060195512 Rogers et al. Aug 2006 A1
20060195513 Rogers et al. Aug 2006 A1
20060195514 Rogers et al. Aug 2006 A1
20060195515 Beaupre et al. Aug 2006 A1
20060195516 Beaupre Aug 2006 A1
20060195521 New et al. Aug 2006 A1
20060195789 Rogers et al. Aug 2006 A1
20060195790 Beaupre et al. Aug 2006 A1
20060200435 Flinn et al. Sep 2006 A1
20060206582 Finn Sep 2006 A1
20060218187 Plastina et al. Sep 2006 A1
20060224757 Fang et al. Oct 2006 A1
20060227673 Yamashita et al. Oct 2006 A1
20060242201 Cobb et al. Oct 2006 A1
20060247980 Mirrashidi et al. Nov 2006 A1
20060248209 Chiu et al. Nov 2006 A1
20060253417 Brownrigg et al. Nov 2006 A1
20060259355 Farouki et al. Nov 2006 A1
20060265409 Neumann et al. Nov 2006 A1
20060265503 Jones et al. Nov 2006 A1
20060265637 Marriott et al. Nov 2006 A1
20060271959 Jacoby et al. Nov 2006 A1
20060271961 Jacoby et al. Nov 2006 A1
20060273155 Thackson Dec 2006 A1
20060277098 Chung et al. Dec 2006 A1
20060282304 Bedard et al. Dec 2006 A1
20060282776 Farmer et al. Dec 2006 A1
20060282856 Errico et al. Dec 2006 A1
20060288041 Plastina et al. Dec 2006 A1
20060288074 Rosenberg Dec 2006 A1
20060293909 Miyajima et al. Dec 2006 A1
20060294091 Hsieh et al. Dec 2006 A1
20060294132 Hsieh et al. Dec 2006 A1
20070005793 Miyoshi et al. Jan 2007 A1
20070008927 Herz et al. Jan 2007 A1
20070014536 Hellman Jan 2007 A1
20070022437 Gerken Jan 2007 A1
20070028171 MacLaurin Feb 2007 A1
20070033292 Sull et al. Feb 2007 A1
20070043766 Nicholas et al. Feb 2007 A1
20070044010 Sull et al. Feb 2007 A1
20070061301 Ramer et al. Mar 2007 A1
20070064626 Evans Mar 2007 A1
20070074617 Vergo Apr 2007 A1
20070078660 Ferris Apr 2007 A1
20070078714 Ott, IV et al. Apr 2007 A1
20070078832 Ott, IV et al. Apr 2007 A1
20070079352 Klein, Jr. Apr 2007 A1
20070083471 Robbin et al. Apr 2007 A1
20070083553 Minor Apr 2007 A1
20070083929 Sprosts et al. Apr 2007 A1
20070088804 Qureshey et al. Apr 2007 A1
20070089132 Qureshey et al. Apr 2007 A1
20070089135 Qureshey et al. Apr 2007 A1
20070094081 Yruski et al. Apr 2007 A1
20070094082 Yruski et al. Apr 2007 A1
20070094083 Yruski et al. Apr 2007 A1
20070094363 Yruski et al. Apr 2007 A1
20070100904 Casey et al. May 2007 A1
20070106672 Sighart et al. May 2007 A1
20070106693 Houh et al. May 2007 A1
20070118425 Yruski et al. May 2007 A1
20070118657 Kreitzer et al. May 2007 A1
20070118802 Gerace et al. May 2007 A1
20070118853 Kreitzer et al. May 2007 A1
20070118873 Houh et al. May 2007 A1
20070130008 Brown et al. Jun 2007 A1
20070130012 Yruski et al. Jun 2007 A1
20070152502 Kinsey et al. Jul 2007 A1
20070156647 Shen et al. Jul 2007 A1
20070156897 Lim Jul 2007 A1
20070162502 Thomas et al. Jul 2007 A1
20070169148 Oddo et al. Jul 2007 A1
20070174147 Klein, Jr. Jul 2007 A1
20070180063 Qureshey et al. Aug 2007 A1
20070182532 Lengning et al. Aug 2007 A1
20070195373 Singh Aug 2007 A1
20070198485 Ramer et al. Aug 2007 A1
20070199014 Clark et al. Aug 2007 A1
20070214182 Rosenberg Sep 2007 A1
20070214259 Ahmed et al. Sep 2007 A1
20070220081 Hyman Sep 2007 A1
20070233736 Xiong et al. Oct 2007 A1
20070233743 Rosenberg Oct 2007 A1
20070238427 Kraft et al. Oct 2007 A1
20070239724 Ramer et al. Oct 2007 A1
20070244880 Martin et al. Oct 2007 A1
20070245245 Blue et al. Oct 2007 A1
20070250571 Griffin Oct 2007 A1
20070264982 Nguyen et al. Nov 2007 A1
20070265870 Song et al. Nov 2007 A1
20070265979 Hangartner Nov 2007 A1
20070266031 Adams et al. Nov 2007 A1
20070269169 Stix et al. Nov 2007 A1
20070277202 Lin et al. Nov 2007 A1
20070282949 Fischer Dec 2007 A1
20070283268 Berger et al. Dec 2007 A1
20070286169 Roman Dec 2007 A1
20070288546 Rosenberg Dec 2007 A1
20070299873 Jones et al. Dec 2007 A1
20070299874 Neumann et al. Dec 2007 A1
20070299978 Neumann et al. Dec 2007 A1
20080005179 Friedman et al. Jan 2008 A1
20080010372 Khedouri et al. Jan 2008 A1
20080016098 Frieden et al. Jan 2008 A1
20080016205 Svendsen Jan 2008 A1
20080031433 Sapp et al. Feb 2008 A1
20080032723 Rosenberg Feb 2008 A1
20080033959 Jones Feb 2008 A1
20080040313 Schachter Feb 2008 A1
20080046948 Verosub Feb 2008 A1
20080052371 Partovi et al. Feb 2008 A1
20080052380 Morita et al. Feb 2008 A1
20080052630 Rosenbaum Feb 2008 A1
20080059422 Tenni et al. Mar 2008 A1
20080059576 Liu et al. Mar 2008 A1
20080085769 Lutnick et al. Apr 2008 A1
20080091771 Allen et al. Apr 2008 A1
20080120501 Jannink et al. May 2008 A1
20080133601 Martin Cervera et al. Jun 2008 A1
20080133763 Clark et al. Jun 2008 A1
20080134039 Fischer et al. Jun 2008 A1
20080134043 Georgis et al. Jun 2008 A1
20080134053 Fischer Jun 2008 A1
20080141136 Ozzie et al. Jun 2008 A1
20080147482 Messing et al. Jun 2008 A1
20080147711 Spiegelman et al. Jun 2008 A1
20080147876 Campbell et al. Jun 2008 A1
20080160983 Poplett et al. Jul 2008 A1
20080162435 Dooms et al. Jul 2008 A1
20080176562 Howard Jul 2008 A1
20080181536 Linden Jul 2008 A1
20080189319 Nielen et al. Aug 2008 A1
20080189336 Prihodko Aug 2008 A1
20080189391 Koberstein et al. Aug 2008 A1
20080189655 Kol Aug 2008 A1
20080195657 Naaman et al. Aug 2008 A1
20080195664 Maharajh et al. Aug 2008 A1
20080208820 Usey et al. Aug 2008 A1
20080208823 Hicken Aug 2008 A1
20080209013 Weel Aug 2008 A1
20080228945 Yoon et al. Sep 2008 A1
20080235632 Holmes Sep 2008 A1
20080242221 Shapiro et al. Oct 2008 A1
20080242280 Shapiro et al. Oct 2008 A1
20080243733 Black Oct 2008 A1
20080244681 Gossweiler et al. Oct 2008 A1
20080249870 Angell et al. Oct 2008 A1
20080250067 Svendsen Oct 2008 A1
20080250312 Curtis Oct 2008 A1
20080250332 Farrell et al. Oct 2008 A1
20080261516 Robinson Oct 2008 A1
20080270561 Tang et al. Oct 2008 A1
20080276279 Gossweiler et al. Nov 2008 A1
20080288536 Pfeiffer et al. Nov 2008 A1
20080288588 Andam et al. Nov 2008 A1
20080301118 Chien et al. Dec 2008 A1
20080301186 Svendsen Dec 2008 A1
20080301187 Svendsen Dec 2008 A1
20080301240 Svendsen Dec 2008 A1
20080301241 Svendsen Dec 2008 A1
20080306826 Kramer et al. Dec 2008 A1
20080307462 Beetcher et al. Dec 2008 A1
20080307463 Beetcher et al. Dec 2008 A1
20080313308 Bodin et al. Dec 2008 A1
20080313541 Shafton et al. Dec 2008 A1
20080319833 Svendsen Dec 2008 A1
20090006368 Mei et al. Jan 2009 A1
20090006374 Kim et al. Jan 2009 A1
20090013347 Ahanger et al. Jan 2009 A1
20090042545 Avital et al. Feb 2009 A1
20090046101 Askey et al. Feb 2009 A1
20090048992 Svendsen et al. Feb 2009 A1
20090049030 Svendsen et al. Feb 2009 A1
20090049045 Askey et al. Feb 2009 A1
20090049390 Nason et al. Feb 2009 A1
20090055376 Slaney et al. Feb 2009 A1
20090055385 Jeon et al. Feb 2009 A1
20090055396 Svendsen et al. Feb 2009 A1
20090055467 Petersen Feb 2009 A1
20090055759 Svendsen Feb 2009 A1
20090061763 Dillon et al. Mar 2009 A1
20090063414 White et al. Mar 2009 A1
20090063645 Casey et al. Mar 2009 A1
20090063971 White et al. Mar 2009 A1
20090064029 Corkran et al. Mar 2009 A1
20090069911 Stefik Mar 2009 A1
20090069912 Stefik Mar 2009 A1
20090070184 Svendsen Mar 2009 A1
20090070267 Hangartner Mar 2009 A9
20090070350 Wang Mar 2009 A1
20090076881 Svendsen Mar 2009 A1
20090077041 Eyal et al. Mar 2009 A1
20090077052 Farrelly Mar 2009 A1
20090077124 Spivack et al. Mar 2009 A1
20090077220 Svendsen et al. Mar 2009 A1
20090083116 Svendsen Mar 2009 A1
20090083117 Svendsen et al. Mar 2009 A1
20090083362 Svendsen Mar 2009 A1
20090083541 Levine Mar 2009 A1
20090089288 Petersen Apr 2009 A1
20090093300 Lutnick et al. Apr 2009 A1
20090094248 Petersen Apr 2009 A1
20090119294 Purdy et al. May 2009 A1
20090125588 Black et al. May 2009 A1
20090129671 Hu et al. May 2009 A1
20090132527 Sheshagiri et al. May 2009 A1
20090157795 Black Jun 2009 A1
20090164199 Amidon et al. Jun 2009 A1
20090164429 Svendsen et al. Jun 2009 A1
20090164448 Curtis Jun 2009 A1
20090164514 Svendsen et al. Jun 2009 A1
20090164516 Svendsen et al. Jun 2009 A1
20090164641 Rogers et al. Jun 2009 A1
20090177301 Hayes Jul 2009 A1
20090198666 Winston et al. Aug 2009 A1
20090222520 Sloo et al. Sep 2009 A1
20090259621 Svendsen et al. Oct 2009 A1
20100017455 Svendsen et al. Jan 2010 A1
20100031366 Knight et al. Feb 2010 A1
20100185732 Hyman Jul 2010 A1
20100198767 Farrelly Aug 2010 A1
20100199218 Farrelly et al. Aug 2010 A1
20100199295 Katpelly et al. Aug 2010 A1
20100228740 Cannistraro et al. Sep 2010 A1
20100280835 Issa et al. Nov 2010 A1
20100324704 Murphy et al. Dec 2010 A1
20100325123 Morrison et al. Dec 2010 A1
20110016483 Opdycke Jan 2011 A1
20110034121 Ng et al. Feb 2011 A1
20110184899 Gadanho et al. Jul 2011 A1
20120042094 Qureshey et al. Feb 2012 A1
20120042337 De Bonet et al. Feb 2012 A1
20120054233 Svendsen et al. Mar 2012 A1
20120066038 Issa et al. Mar 2012 A1
20120072418 Svendsen et al. Mar 2012 A1
20120072846 Curtis Mar 2012 A1
Foreign Referenced Citations (20)
Number Date Country
1586080 Feb 2005 CN
1885284 Dec 2006 CN
101023426 Aug 2007 CN
898278 Feb 1999 EP
1536352 Jun 2005 EP
1835455 Sep 2007 EP
2372850 Sep 2002 GB
2397205 Jul 2004 GB
2005321668 Nov 2005 JP
WO 0003830 Jan 2000 WO
WO 0184353 Nov 2001 WO
WO 0221335 Mar 2002 WO
WO 2004017178 Feb 2004 WO
WO 2004043064 May 2004 WO
WO 2005026916 Mar 2005 WO
WO 2005071571 Aug 2005 WO
WO 2006079973 Jun 2006 WO
WO 2006075032 Jul 2006 WO
WO 2006126135 Nov 2006 WO
WO 2007092053 Aug 2007 WO
Non-Patent Literature Citations (134)
Entry
Kosugi, Naoko et al., “A Practical Query-By-Humming System for a Large Music Database,” Proceedings of the 8th ACM International Conference on Multimedia, Oct. 30-Nov. 3, 2000, Los Angeles, California, copyright 2000, ACM, pp. 333-342.
“About Intermind's Channel Communications Patents,” downloaded from <http://www.intermind.com/materials/patent—desc.html> on Feb. 27, 1998, 5 pages.
“About uPlayMe,” <http://www.uplayme.com/about.php>, copyright 2008, uPlayMe, Inc., 4 pages.
“About.com: http://quintura.com/,” at <http://websearch.about.com/gi/dynamic/offisite.htm?zl=1/XJ&sdn=web...f=10&su=p284.8.150.ip—&tt=13&bt=0&bts=0&zu=http%3A//quintura.com/>, copyright 2007, quintura Inc., printed Oct. 17, 2007, pages.
“Amazon.com: Online Shopping for Electronics, Apparel, Computers, Books, DVDs & m . . . ,” at <http://www.amazon.com/>, copyright 1996-2007, Amazon.com, Inc., printed Oct. 26, 2007, 4 pages.
Huang, Yao-Chang et al., “An Audio Recommendation System Based on Audio Signature Description Scheme in MPEG-7 Audio,” IEEE International Conference on Multimedia and Expo (ICME), Jun. 27-30, 2004, IEEE, pp. 639-642.
Lingnau et al., “An HTTP-based Infrastructure for Mobile Agents,” at <http://www.w3.org/Conferences/WWW4/Papers/150/>, 1995, pp. 1-15, printed Dec. 20, 1999, 15 pages.
“anthony.liekens.net>> Music>> Cloud,” at <http://anthony.liekens.net/index.php/Music/Cloud>, page last modified on Apr. 12, 2007, copyright 2000-2006, Anthony Liekens, printed Oct. 17, 2007, 4 pages.
“AOL Music Now,” at <http://web.archive.org/web/20060508184531/aol.musicnow.com/az/home.jthml?—request . . . >, copyright 2006, AOL Music Now LLC, printed Nov. 16, 2007, 1 page.
“Apple—iPod + iTunes,” at <http://www.apple.com/itunes/>, copyright 2007 by Paramount Pictures, printed Feb. 7, 2007, 2 pages.
“Apple—iPod classic,” at <http://www.apple.com/ipodclassic/>, printed Oct. 26, 2007, 1 page.
“Babulous :: Keep it loud,” at <http://www.babulous.com/home.jhtml>, copyright 2009, Babulous, Inc., printed Mar. 26, 2009, 2 pages.
“Better Propaganda—Free MP3s and music videos,” at <http://www.betterpropaganda.com/>, copyright 2004-2005, betterPropaganda, printed Feb. 7, 2007, 4 pages.
“Billboard.biz—Music Business—Billboard Charts—Album Sales—Concert Tours,” http://www.billboard.biz/bbbiz/index.jsp, copyright 2007 Nielsen Business Media Inc., printed Oct. 26, 2007, 3 pages.
“Bluetooth.com—Learn.” http://www.bluetooth.com/Bluetooth/Learn/, copyright 2007 Bluetooth SIG, Inc., printed Oct. 26, 2007, 1 page.
Mitchell, Bradley, “Cable Speed—How Fast is Cable Modem Internet?,” at <http://www.compnetworking.about.com/od/internetaccessbestuses/f/cablespeed.htm>, copyright 2005, About, Inc., printed Feb. 24, 2010, 2 pages.
“The Classic TV Database—Your Home for Classic TV!—www.classic-tv.com” http://www.classic-tv.com, copyright The Classic TV Database—www.classic-tv.com, printed Feb. 7, 2007, 3 pages.
“Digital Tech Life >> Download of the Week,” earliest post Sep. 30, 2005, latest post Jul. 2, 2006, at <http://www.digitaltechlife.com.category/download-of-the-week.>, printed Feb. 16, 2007, 9 pages.
“Digital Music News,” at <http://www.digitalmusicnews.com/results?title=musicstrands>, copyright 2003-6 Digital Music News, earliest post Aug. 2005, latest post May 2006, printed Aug. 8, 2006, 5 pages.
Aguilera, M.K. and Strom, R.E., “Efficient Atomic Broadcast Using Deterministic Merge,” Proceedings of ACM Symposium on Principles of Distributed Computing (POCD), Jul. 16-19, 2000, copyright 2000, ACM, New York, New York, 10 pages.
Huhn, Mary, “Fed Up With Radio? Create Your Own Online Station,” New York Post at <http://pqasb.pqarchiver.com/nypost/access/68457933.html?FMT=FT&di . . . >, Nov. 22, 1998, printed Oct. 13, 2009, 2 pages.
“Frequently Asked Questions about Intermind's Patents,” downloaded from <http://www.intermind.com/materials/patent—faq.html> on Feb. 27, 1998, 9 pages.
“GenieLab::Music Recommendation System,” at <http://genielab.com/>, from the Internet Archive on Aug. 13, 2006, copyright 2005, GenieLab, LLC, printed Oct. 30, 2007, 1 page.
“Goombah” Preview at <http://www.goombah.com/preview.html>, printed Jan. 8, 2008, 5 pages.
“Gracenote,” found at <http://www.gracenote.com>, printed Feb. 7, 2007, available on Internet Archive at least as early as Jan. 2006, 1 page.
“Gracenote Playlist,” Product Overview, Revised Dec. 29, 2005, copyright 2005, Gracenote, 2 pages.
“Gracenote Playlist Plus,” Product Overview, Revised Dec. 29, 2005, copyright 2005, Gracenote, 2 pages.
“How many songs are in your iTunes Music library (or libraries in total if you use more than one)?,” at <http://www.macoshints.com/polls/index.php?pid=itunesmusicaccount>, includes postings dated as early as Jun. 2008, printed Feb. 24, 2010, copyright 2010, Mac Publishing LLC, 10 pages.
“Zune.net—How-To—Share Audio Files Zune to Zune,” at <http://web.archive.org/web/20070819121705/http://www.zune.net/en-us/support/howto/z . . . >, copyright 2007 Microsoft Corporation, printed Nov. 14, 2007, 2 pages.
“Hulu—About,” at <http://www.hulu.com/about/product—tour>, copyright 2010, Hulu LLC, appears to have been accessible as early as 2008, printed Jun. 15, 2010, 2 pages.
Kaplan, Marc A., “IBM Cryptolopes TM SuperDistribution and Digital Rights Management,” found at <http://www.research.ibm.com/people/k/kaplan/cryptolope-docs/crypap.html> from the Internet Archive, copyright Dec. 30, 1996, IBM Corporation, printed Mar. 15, 2000, 7 pages.
Nilsson, Martin, “Id3v2.4.0-frames—ID3.org,” at <http://www.id3.org/id3v2/4/0-frames>, dated Nov. 1, 2000, last updated Dec. 18, 2006, copyright 1998-2009, printed Jun. 15, 2010, 31 pages.
“Identifying iPod models,” at <http://support.apple.com/kb/HT1353>, page last modified Jan. 15, 2010, includes information dating back to 2001, printed Feb. 24, 2010, 13 pages.
“IEEE 802.11—Wikipedia, the free encyclopedia,” at <http://en.wikipedia.org/wiki/IEEE—802.11>, printed Oct. 26, 2007, 5 pages.
“iLike™—Home,” found at <http://www.ilike.com>, copyright 2007, iLike, printed May 17, 2007, 2 pages.
Krigel, Beth Lipton, “Imagine Radio spinning off,” CNET News, at <http://news.cnet.com/Imagine-Radio-spinning-off/2100-1033—3-213613.html>, Jul. 22, 1998, printed Oct. 13, 2009, 3 pages.
“InferNote is an exploration tool for your music collection,” at <http://www.itweaks.com/infdoc/index.html>, copyright 2004, otherslikeyou.com Inc., printed Feb. 7, 2007, 13 pages.
“Instant Messenger—AIM—Instant Message Your Online Buddies for Free—AIM,” at <http://dashboard.aim.com/aim>, copyright 2007 AOL LLC, printed Nov. 8, 2007, 6 pages.
Egyhazy et al., “Intelligent Web Search Agents,” at <http://csgrad.cs.vt.edu/˜tplunket/article.html>, pp. 1-23, printed Dec. 20, 1999, 23 pages.
“Intermind Announces Approval of First Patent Application,” dated Oct. 7, 1997, downloaded from <http://www.intermind.com/inside/press—rel/100797—allow.html>, 3 pages.
Pike, S., “intuiTunes—Enhancing the Portable Digital Music Player Experience,” Oct. 21, 2005, pp. 1-26.
“Last.fm—The Social Music Revolution,” at <http://www.last.fm/>, printed Feb. 7, 2007, 1 page.
“Last.fm—Wikipedia, the free encyclopedia,” at <http://en.wikipedia.org/wiki/Last.fm>, last modified on Aug. 8, 2006, printed Aug. 8, 2006, 7 pages.
“LAUNCHcast Radio—Yahoo! Messenger,” at <http://messenger.yahoo.com/launch.php>, copyright 2007 Yahoo! Inc., printed Nov. 8, 2007, 1 page.
Lehmann-Haupt, Rachel, “Library/Internet Radio; Listeners Take on Role of the Deejay,” The New York Times, at <http://www.nytimes.com/1998/11/05/technology/library-internet-radio-lis . . . >, Nov. 5, 1998, printed Oct. 13, 2009, 2 pages.
Lehmann-Haupt, Rachel, “Library/Internet Radio: On Spinner, Wide Range of Choices,” The New York Times, at <http://www.nytimes.com/1998/11/05/technology/library-internet-radio-on-spinner-wide-range-of-choices.html?scp=1&sq=On . . . >, Nov. 5, 1998, printed Oct. 15, 2009, 5 pages.
Lehmann-Haupt, Rachel, “Library/Internet Radio: Web Radio Expands Listening Horizons,” The New York Times, at <http://www.nytimes.com/1998/11/05/technology/library-internet-radio-web-radio-expands-listening-horizons.html?scp=2&sq=. . . >, Nov. 5, 1998, printed Oct. 15, 2009, 5 pages.
Mascia, J. and Reddy, S., “cs219 Project Report—Lifetrak: Music in Tune With Your Life,” Department of Electrical Engineering, UCLA '06, Los Angeles, California, copyright 2006, ACM, 11 pages.
Abstract, Reddy, S. and Mascia, J., “Lifetrak: music in tune with your life,” Proceedings of the 1st ACM International Workshop on Human-Centered Multimedia 2006 (HCM '06), Santa Barbara, California, pp. 25-34, ACM Press, New York, NY, 2006, found at <http://portal.acm.org/citation.cfm?id=1178745.1178754>, ACM Portal, printed Oct. 2, 2007, 3 pages.
“LimeWire—Wikipedia, the free encyclopedia,” at <http://en.wikipedia.org/wiki/LimeWire>, last modified Aug. 6, 2006, printed Aug. 6, 2006, 2 pages.
“Listen with Last.fm and fuel the social music revolution,” at <http://www.last.fm/tour/>, copyright 2002-2007, Last.fm Ltd, printed Oct. 4, 2007, 1 page.
“Liveplasma music, movies, search engine and discovery engine,” at <http://www.liveplasma.com>, printed May 17, 2007, 1 page.
Boswell, Wendy, “Loading ‘Quintura—Search With Quintura, a Tag Cloud Search Engine’,” at <http://websearch.about.com/od/dailywebsearchtips/qt/dnt0830.htm?p=1>, copyright 2007, About.com, Inc., printed Oct. 17, 2007, 1 page.
“Loomia Personalized Recommendations for Media, Content and Retail Sites,” at <http://www.loomia.com/>, copyright 2006-2007, Loomia Inc., printed Feb. 7, 2007, 2 pages.
“Master's Projects of the KR&R Group,” Faculty of Science, Vrije Universiteit, Amsterdam, URL unknown, publication date unknown, obtained on or prior to Apr. 22, 2009, 7 pages.
“Mercora—Music Search and Internet Radio Network,” at <http://www.mercora.com/v6/—front/web.jsp>, printed Feb. 7, 2007, 1 page.
“Mercora—Music Search and Internet Radio Network,” at <http://www.mercora.com/overview.asp>, copyright 2004-2006, Inc., printed Aug. 8, 2006, 1 page.
Henry, Alan, “MixxMaker: The Mix Tape Goes Online,” Jan. 18, 2008, AppScout, found at <http://appscout.pcmag.com/crazy-start-ups-vc-time/276029-mixmaker-the-mix-tape-goes-online#fbid=DfUZtDa46ye>, printed Nov. 15, 2011, 4 pages.
“Mongomusic.com—The Best Download mp3 Resource and Information. This website is for sale!,” at <http://www.mongomusic.com/>, printed May 17, 2007, 2 pages.
“MP3 music download website, eMusic,” at <http://www.emusic.com/>, copyright 2007, eMusic.com Inc., printed Feb. 7, 2007, 1 page.
Oliver, N. and Flores-Mangas, F., “MPTrain: A Mobile, Music and Physiology-Based Personal Trainer,” MobileHCI'06, Sep. 12-15, 2006, Helsinki, Finland, 8 pages.
“Music Artist Cloud,” at <http://artistcloud.camaris.be/>, copyright 2007, mac, printed Oct. 17, 2007, 2 pages.
“Music Downloads—Over 2 Million Songs—Try it Free—Yahoo! Music,” at <http://music.yahoo.com/ymu/default.asp>, copyright 2006 Yahoo! Inc., printed Feb. 7, 2007, 1 page.
“Music Recommendations 1.0—MacUpdate,” at <http://www.macupdate.com/info.php/id/19575>, Oct. 4, 2005, printed Feb. 16, 2007, 1 page.
Wang, J. and Reinders, M.J.T., “Music Recommender system for Wi-Fi Walkman,” Number ICT-2003-01 in the ICT Group Technical Report Series, Information & Communication Theory Group, Department of Mediamatics, Faculty of Electrical Engineering, Mathematics and Computer Science, Delft University of Technology, Delft, The Netherlands, 2003, 23 pages.
“MusicGremlin,” at <http://www.musicgremlin.com/StaticContent.aspx?id=3>, copyright 2005, 2006, 2007, MusicGremlin, Inc., printed Oct. 26, 2007, 1 page.
“MusicIP—The Music Search Engine,” at <http://www.musicip.com/>, copyright 2006-2007, MusicIP Corporation, printed Feb. 7, 2007, 1 page.
“musicstrands.com—Because Music is Social,” brochure, copyright 2006, MusicStrands, Inc., 2 pages.
Pampalk, E. and Goto, M., “MusicSun: A New Approach to Artist Recommendation,” In Proceedings of the 8th International Conference on Music Information Retrieval (ISMIR 2007), Vienna, Austria, Sep. 23-27, 2007, copyright 2007, Austrian Computer Society (OCG), found at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.69.1403&rep=rep1&type=pdf>, 4 pages.
Linder, Brad, “Muziic media player streams audio from YouTube—for now—Download Squad,” at <http://www.downloadsquad.com/2009/08/09/muziic-media-player-streams-audio-from-you . . . >, Mar. 9, 2009, copyright 2008-2009, Weblogs, Inc., printed Jun. 14, 2010, 2 pages.
“MyStrands Social Recommendation and Discovery,” at <http://www.mystrands.com/>, copyright 2006-2007, MediaStrands, Inc., printed Feb. 7, 2007, 2 pages.
“MyStrands Download,” at <http://www.mystrands.com/overview.vm>, copyright 2006-2007, MediaStrands, Inc., printed Feb. 7, 2007, 3 pages.
“MyStrands for Windows 0.7.3 Beta,” copyright 2002-2006, ShareApple.com networks, printed Jul. 16, 2007, 3 pages.
“MyStrands for Windows Change Log,” at <http://www.mystrands.com/mystrands/windows/changelog.vm>, earliest log dated Feb. 2006, printed Jul. 16, 2007, 6 pages.
“MyStrands Labs: Patent-pending Technologies,” at <http://labs.mystrands.com/patents.html>, earliest description from Nov. 2004, printed Feb. 7, 2007, 5 pages.
“Napster—All the Music You Want,” at <http://www.napster.com/using—napster/all—the—music—you—want.html>, copyright 2003-2006, Napster, LLC, printed Feb. 7, 2007, 2 pages.
“Not safe for work—Wikipedia, the free encyclopedia,” at <http://en.wikipedia.org/wiki/Work—safe>, printed Nov. 8, 2007, 2 pages.
“Outlook Home Page—Microsoft Office Online,” at <http://office.microsoft.com/en-us/outlook/default.aspx>, copyright 2007 Microsoft Corporation, printed Nov. 8, 2007, 1 page.
“FAQ,” at <http://blog.pandora.com/faq/>, copyright 2005-2006, Pandora Media, Inc., printed Aug. 8, 2006, 20 pages.
“Pandora—Frequently Asked Questions,” from <http://www.pandora.com>, obtained on or prior to Apr. 22, 2009, copyright 2005-2009, Pandora Media, Inc., 48 pages.
“Pandora Internet Radio—Find New Music, Listen to Free Web Radio,” at <http://www.pandora.com/>, copyright 2005-2007, Pandora Media, Inc., printed Feb. 7, 2007, 1 page.
“Pandora Radio—Listen to Free Internet Radio, Find New Music—The Music Genome Project,” at <http://www.pandora.com/mgp>, copyright 2006-2007, Pandora Media, Inc., printed Oct. 26, 2007, 1 page.
Oliver N. and Kreger-Stickles, L., “PAPA: Physiology and Purpose-Aware Automatic Playlist Generation,” In Proc. of ISMIR 2006, Victoria, Canada, Oct. 2006, 4 pages.
International Search Report for PCT/GB01/03069 mailed Oct. 11, 2002, 3 pages.
Abstract, Elliott, G.T. and Tomlinson, B., “Personal Soundtrack: context-aware playlists that adapt to user pace,” Conference on Human Factors in Computing Systems 2006 (CHI '06), Apr. 22-27, 2006, Montreal, Quebec, Canada, pp. 735-741, ACM Press, New York, found at <http://portal.acm.org/citation.cfm?id=1125451.1125599>, ACM Portal, printed Oct. 2, 2007, 3 pages.
Merkel, Oliver et al., “Protecting VoD the Easier Way,” Proceedings of the sixth ACM International Conference on Multimedia, Sep. 13-16, 1998, Bristol, United Kingdom, 1998, pp. 21-28, 8 pages.
Krigel, Beth Lipton, “Radio features at center of Net law,” CNET News, at <http://news.cnet.com/Radio-features-at-center-of-Net-law/2100-1033—3-214752.html>, Aug. 24, 1998, printed Oct. 15, 2009, 2 pages.
Sarwar, Badrul M. et al., “Recommender System for Large-scale E-Commerce: Scalable Neighborhood Formation Using Clustering,” Proceedings of the Fifth International Conference on Computer and Information Technology, Dec. 27-28, 2002, East West University, Dhaka, Bangladesh, 6 pages.
“Review of Personalization Technologies: Collaborative Filtering vs. ChoiceStream's Attributized Bayesian Choice Modeling,” Technology Brief, ChoiceStream, Feb. 4, 2004, found at <http://www.google.com/url?sa=t&rct=j&q=choicestream%20review%20of%20personalization&source=web&cd=1&ved=0CDcOFjAA&url=http%3A%2fwww.behavioraltargeting.info%2Fdownloadattachment.php%3Faid%3Dcf74d490a8b97edd535b4ccdbfdf55%26articleId%3D31&ei=C2jeTr71AurZ0QGCgsGvBw&usg=AFQjCNEBLn7jJCDh-VYty3h79uFKGFBkRw>, 13 pages.
“Rhapsody—Full-length music, videos and more—FREE,” at <http://www.rhapsody.com/welcome.html>, copyright 2001-2007 Listen.com, printed Feb. 7, 2007, 1 page.
“Ringo: Social Information Filtering for Music Recommendation,” at <http://jolomo.net/ringo.html>, printed Aug. 3, 2009, 1 page.
Nickell, Joe Ashbrook, “Roll Your Own Radio,” at <http://www.wired.com/print/culture/lifestyle/news/1998/08/14706>, Aug. 28, 1998, printed Oct. 13, 2009, 1 page.
“RYM FAQ—Rate Your Music,” at <http://rateyourmusic.com/faq/>, copyright 2000-2007, rateyourmusic.com, printed Nov. 8, 2007, 14 pages.
Cai, Rui et al., “Scalable Music Recommendation by Search,” Proc. ACM Multimedia, Augsburg, Germany, Sep. 23-28, 2007, pp. 1065-1074.
Madan, Sameer, “Search the Web without a headache,” PC World (India), pp. 40-41, Feb. 1998, printed Dec. 20, 1999, 2 pages.
“Searching and Browsing Radio Time,” URL unknown, publication date unknown, obtained on or prior to Apr. 22, 2009, 3 pages.
Lamantia, Joe, “Second Generation Tag Clouds,” Feb. 23, 2006, at <http://www.joelamantia.com/blog/archives/ideas/second—generation—tag—clouds.html>, copyright 2006, Joe Lamantia, printed Nov. 29, 2007, 19 pages.
Gartrell, Charles M., “SocialAware: Context-Aware Multimedia Presentation via Mobile Social Networks,” Masters Thesis, submitted to the Faculty of the Graduate School of the University of Colorado, directed by Dr. Richard Han, Department of Computer Science, 2008, found at <http://www.cs.colorado.edu/˜rhan/Papers/Mike—Gartrell—CU—MS—thesis-final.pdf>, 42 pages.
“Songbird,” at <http://getsongbird.com/>, copyright 2010, Songbird, printed Jun. 15, 2010, 2 pages.
“SongReference,” at <http://songreference.com/>, copyright 2008, SongReference.com, printed Jun. 15, 2010, 1 page.
“Soundflavor,” at <http://www.soundflavor.com/>, copyright 2003-2007, Soundflavor, Inc., printed Feb. 7, 2007, 1 page.
“Start Listening with Last.fm,” at <http://www.last.fm/>, date unknown but may date back as early as 2002, 1 page.
“Subscribe to Napster,” at <http://www.napster.com/subscribe>, found on the Internet Archive, dated Aug. 6, 2006, copyright 2003-2006, Napster, LLC, printed Dec. 21, 2011, 4 pages.
“Tag cloud in standalone player—Feedback and Ideas—Last.fm,” at <http://www.last.fm/forum/21717/—/333269>, posting dated Oct. 4, 2007, copyright 2002-2007, Last.fm Ltd., printed Oct. 17, 2007, 2 pages.
Hearst, Marti A. et al., “Tag Clouds: Data Analysis Tool or Social Signaller?” Proceedings of the 41st Annual Hawaii International Conference on System Sciences (HICSS 2008), Jan. 7-10, 2008, Waikoloa, Big Island, Hawaii, p. 160, available from <http://csdl2.computer.org/persagen/DLAbsToc.jsp?resourcePath=/dl/proceedings/&toc=comp/proceedings/hicss/2008/3075/00/3075toc.xml&DOI=10.1109/HICSS.2008.422>, 10 pages.
“Take a look at the Future of Mobile Music—Music Guru,” at <http://www.symbian-freak.com/news/006/02/music—guru.htm>, Feb. 23, 2006, copyright 2005, Symbian freak, printed Feb. 7, 2007, 3 pages.
“TalkStreamLive.com—A Dynamic Directory of Streaming Radio,” at <http://www.talkstreamlive.com/aboutus.aspx>, from the Internet Archive, dated Aug. 1, 2008, copyright 2006-2008, 3 pages.
“That canadian girl >> Blog Archive >> GenieLab,” posted Feb. 22, 2005, at <http://www.thatcanadiangirl.co.uk/blog/2005/02/22/genielab/>, copyright 2007, Vero Papperrell, printed Feb. 16, 2007, 3 pages.
Barrie-Anthony, Steven, “That song sounds familiar,” Los Angeles Times, Feb. 3, 2006, available from <http://www.calendarlive.com/printedition/calendar/cl-et-pandora3feb03.0.7458778.story?track=tottext,0,19482.story?track=tothtml>, printed Feb. 3, 2006, 5 pages.
Rouarch, Pierre, “The Cloud Search Beta,” <http://www.itcom3.com/thecloudsearch/aboutthecloudsearch.php>, copyright 2007, Pierre Rouarch, printed Oct. 17, 2007, 2 pages.
Nealon, Andrew D., “The Daily Barometer—GenieLab.com grants music lovers' wishes,” posted Feb. 16, 2005, at <http://media.barometer.orst.edu/home/index.cfm?event=displayArticlePrinterFriendly&uSt . . . >, copyright 2007, The Daily Barometer, printed Feb. 16, 2007, 2 pages.
“The Internet Movie Database (IMDb),” at <http://www.imdb.com>, copyright 1990-2007 Internet Movie Database Inc., printed Feb. 7, 2007, 3 pages.
Gibbon, John F. et al., “The Use of Network Delay Estimation for Multimedia Data Retrieval,” IEEE Journal on Selected Areas in Communications, vol. 14, No. 7, Sep. 1996, pp. 1376-1387, 12 pages.
“Thunderbird—Reclaim your inbox,” at <http://www.mozilla.com/en-US/thunderbird/>, copyright 2005-2007 Mozilla, printed Nov. 8, 2007, 2 pages.
“Tour's Profile,” at <http://mog.com/Tour>, copyright 2006-2009, Mog Inc., printed Aug. 3, 2009, 11 pages.
“Trillian (software)—Wikipedia, the free encyclopedia,” at <http://en.wikipedia.org/wiki/Trillian—(instant—messenger)>, printed Nov. 8, 2007, 11 pages.
Golbeck, Jennifer, “Trust and Nuanced Profile Similarity in Online Social Networks,” MINDSWAP Technical Report TR-MS1284, 2006, available from <http://www.cs.umd.edu/˜golbeck/publications.shtml>, 30 pages.
“Try Napster free for 7 Days—Play and download music without paying per song,” at <http://www.napster.com/choose/index.html>, copyright 2003-2007 Napster, LLC, printed Feb. 7, 2007, 1 page.
“uPlayMe.com Meet People, Music Sharing—Home,” at <http://www.uplayme.com/>, copyright 2008, uPlayMe, Inc., printed Mar. 26, 2009, 1 page.
“UpTo11.net—Music Recommendations and Search,” at <http://www.upto11.net/>, copyright 2005-2006, UpTo11.net, printed Feb. 7, 2007, 1 page.
Hochmair, H.H. et al., “User Interface Design for Semantic Query Expansion in Geo-data Repositories,” Angewandte Geoinformatik 2006—Beiträge zum 18. AGIT-Symposium Salzburg, Heidelberg: Wichmann, 2006, 10 pages.
Smith, Patricia, “WebCompass Takes Web Searching in the Right Direction,” Seybold Report on Desktop Publishing, vol. 10, No. 10, pp. 1-9, found at <http://www.seyboldseminars.com/seybold—report/reports/D1010001.htm>, copyright 1996, Seybold Publications Inc., 9 pages.
“Webjay—Playlist Community,” at <http://www.webjay.org/>, copyright 2006, Yahoo! Inc., printed Feb. 7, 2007, 5 pages.
“Welcome to Internet Talk Radio from Talkzone.com,” at <http://www.talkzone.com/> from the Internet Archive, dated Jul. 19, 2008, copyright 2007-2008, Syndication Networks Corp., 2 pages.
“Welcome to the MUSICMATCH Guide,” at <http://www.mmguide.musicmatch.com/>, copyright 2001-2004, Musicmatch, Inc., printed Feb. 7, 2007, 1 page.
“What is BlogTalkRadio,” at <http://www.blogtalkradio.com/whatis.aspx> from the Internet Archive, dated Feb. 18, 2009, copyright 2009, BlogTalkRadio.com, appears to have existed in 2008, 2 pages.
“What is the size of your physical and digital music collection?” at <http://www.musicbanter.com/general-music/47403-what-size-your-physical-digital-music-collection-12.html>, earliest posting shown: Sep. 21, 2008, printed Feb. 24, 2010, copyright 2010, Advameg, Inc., SEO by vBSEO 3.2.0 copyright 2008, Crawlability, Inc., 6 pages.
Dean, Katie, “Whose Song is that Anyway?,” Wired News, Feb. 12, 2003, at <http://www.wired.com/news.digiwood/1,57634-0.html>, copyright 2005, Lycos, Inc., printed Oct. 9, 2006, 3 pages.
Wang, J. et al., “Wi-Fi Walkman: A wireless handheld that shares and recommends music on peer-to-peer networks,” In Proceedings of Embedded Processors for Multimedia and Communications II, part of the IS&T/SPIE Symposium on Electronic Imaging 2005, Jan. 15-20, 2005, San Jose, California, Proceedings published Mar. 8, 2005, found at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.108.34598rep=rep1&type=pdf>, 10 pages.
“Yahoo! Music,” at <http://music.yahoo.com>, dated Jun. 20, 2005, from the Internet Archive, copyright 2005, Yahoo! Inc., printed Dec. 18, 2009, 14 pages.
“Yahoo Music Jukebox,” Wikipedia, at <http://en.wikipedia.org/wiki/Yahoo—music—engine>, last modified Aug. 3, 2006, printed Aug. 8, 2006, 1 page.
“Yahoo! Messenger—Chat, Instant message, SMS, PC Calls and More,” at <http://messenger.yahoo.com/webmessengerpromo.php>, copyright 2007 Yahoo! Inc., printed Oct. 26, 2007, 1 page.
“Yahoo! Music,” at <http://info.yahoo.com/privacy/ca/yahoo/music/>, Aug. 14, 2007, copyright 2007, Yahoo! Canada Co., obtained from the Internet Archive, printed Apr. 19, 2011, 4 pages.
“YouTube—Broadcast Yourself.,” at <http://www.youtube.com/>, copyright 2007, YouTube, LLC, printed Oct. 26, 2007, 2 pages.
Related Publications (1)
20160179971 A1, Jun. 2016, US

Continuations (5)
Parent 14658551, Mar. 2015, US; Child 15056310, US
Parent 14488456, Sep. 2014, US; Child 14658551, US
Parent 14069761, Nov. 2013, US; Child 14488456, US
Parent 13655648, Oct. 2012, US; Child 14069761, US
Parent 13228688, Sep. 2011, US; Child 13655648, US

Continuations in Part (2)
Parent 12192682, Aug. 2008, US; Child 13228688, US
Parent 11963050, Dec. 2007, US; Child 12192682, US