System for providing music content to a user

Information

  • Patent Grant
  • Patent Number
    10,785,526
  • Date Filed
    Thursday, February 21, 2019
  • Date Issued
    Tuesday, September 22, 2020
Abstract
A user may employ a user device (e.g., a television (TV) and a set-top box (STB)) to access a television system via a network. The television system may include one or more content servers (CSs) (e.g., a VOD server, an HTTP server, or other media server) for providing, among other things, a VOD service to the user and a broadcast transmission system for transmitting a multitude of linear television channels. The television system and/or the user device is/are operable to enable the user to interact with an enhanced TV service (ETS) that allows the user to easily navigate among various different programmed linear channels (a.k.a., “streaming channels”) and VOD services. The ETS may be hosted in the television system, in the user device, or in a combination of the two.
Description
TECHNICAL FIELD

Aspects of this disclosure relate to a system for providing content (such as, but not limited to, music) to a user.


BACKGROUND

Users who enjoy watching music videos and/or listening to music (or other content) may subscribe to a television service operated by a television operator (e.g., cable TV operator) that provides access to such content. Such a television service may give the user access to a variety of content (e.g., a variety of genres of music and/or a variety of genres of music videos). As one non-limiting example, the television service may include a number of linear channels dedicated to music programming from one or more content providers. The television service may also provide a video on demand service.


SUMMARY

What is desired is an enhanced TV system that enables the user to easily navigate among the various different linear channels and video on demand (VOD) assets. Embodiments of such an enhanced TV system are described herein. While the embodiments are described with reference to music content, this is done solely for the sake of illustration, as the enhanced TV system is applicable to any type of content, not just music content.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated herein and form part of the specification, illustrate various embodiments.



FIG. 1A illustrates a system according to some embodiments.



FIG. 1B illustrates a system according to some other embodiments.



FIG. 2 is a functional block diagram of a user device according to some embodiments.



FIGS. 3-12 illustrate user interface screens, according to some embodiments, that are provided by the enhanced TV service (ETS).





DETAILED DESCRIPTION


FIG. 1A illustrates a system 100 according to some embodiments. As shown in FIG. 1A, a user 102 may employ a user device 190 to access a television system 112 via a network 110. In this example, the user device 190 consists of a television (TV) 106 and a set-top box (STB) 108 (e.g., a device comprising a computer system having one or more processors, networking capabilities, and a user input detector for receiving commands from user 102), which is connected to network 110 (e.g., a cable TV network or other network) to which television system 112 is also connected (directly or indirectly). In some embodiments, the user device may consist only of TV 106, in which case TV 106 may be a smart TV comprising networking capabilities (e.g., receiver, transmitter), a computer system having one or more processors, and a user input detector (e.g., a receiver for receiving signals transmitted by a remote control 104). In some embodiments, television system 112 includes hardware and software found at a typical cable-TV head-end system, such as a computer system and networking capabilities. That is, for example, television system 112, in some embodiments, includes one or more content servers (CSs) 196 (e.g., a VOD server or other media server) for providing, among other things, a VOD service to user 102 and a broadcast transmission system for transmitting a multitude of linear television channels, as is known in the art. In other embodiments, TV system 112 can be implemented using merely a set of one or more servers (e.g., HTTP servers). TV system 112 may be a distributed computer system, or all of its components may be co-located. While CS 196 is shown in FIG. 1A as being a component of the TV system 112, this is not a requirement, as CS 196 may be a component separate from TV system 112.


In some embodiments, television system 112 and/or user device 190 is/are operable to enable user 102 to interact with an enhanced TV service (ETS) 197 that allows the user to easily navigate among various different programmed linear channels (a.k.a., “streaming channels”) and video on demand (VOD) services. As shown in FIG. 1A, the ETS 197 may be hosted in television system 112, in user device 190 (e.g., in STB 108 and/or TV 106), or in a combination of the two. For example, the ETS 197 may be a computer program and a portion of the computer program (e.g., a server portion) may run on processor(s) within television system 112 and another portion (e.g., a client portion) may run on processor(s) within user device 190.



FIG. 1B illustrates a system 193 according to other embodiments. In this embodiment, user device 190 is in the form of a communication device 191 (e.g., a personal computer, a smartphone, a tablet, a phablet, a smart TV, internet TV, etc.) comprising networking capabilities (e.g., receiver, transmitter) that enable device 191 to communicate with television system 112 via network 110, a computer system comprising one or more processors, and a user input detector (e.g., a touch screen, a keyboard, etc.). In some embodiments, television system 112 and/or device 191 is/are operable to enable user 102 to interact with the ETS 197 as described above in connection with system 100. In this embodiment, the ETS may be hosted in television system 112, in user device 190, or in a combination of the two. For example, the ETS 197 may be a computer program and a portion of the computer program (e.g., a server portion) may run on processor(s) within television system 112 and another portion (e.g., a client portion) may run on processor(s) within user device 190.



FIG. 2 illustrates a functional block diagram of user device 190 and/or TV system 112 according to some embodiments. In the embodiment shown, the user device 190 and/or TV system 112 includes an electronic program guide (EPG) module 202 and an ETS module 204 of ETS 197.


In some embodiments, EPG module 202 functions to obtain EPG information from an EPG information server 210 by, for example, transmitting a request message to server 210. In response to such a request message, server 210 may obtain the EPG information from a database 220. In some embodiments, for each linear program channel (a.k.a., linear TV channel or streaming channel) that user 102 may access, the EPG information includes a set of one or more time slot records, where each time slot record in the set includes information identifying a time slot (e.g., a beginning time and an end time) and program information corresponding to the program occurring on the linear channel in that time slot (e.g., a program description). In some embodiments, the set of time slot records for streaming audio and video channels may only consist of a single time slot record.


In some embodiments, a time slot record may also include information for retrieving an object stored at (or generated by) a remote server. For example, the object may be an HTML document and the information for retrieving the HTML document may be a Uniform Resource Identifier (URI) (e.g., a Uniform Resource Locator (URL)). For example, a time slot record for one of the streaming video channels (or one of the streaming audio channels) may include not only information identifying the format of the music that is played on the streaming channel but also a URI for obtaining from a server an object (e.g., HTML document) corresponding to the streaming channel.
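
The time slot records described in the two preceding paragraphs can be modeled as a simple data structure. The following is a minimal TypeScript sketch; all type and field names (TimeSlotRecord, ChannelGuide, objectUri, etc.) are illustrative assumptions rather than a schema defined by this disclosure.

```typescript
// Illustrative model of EPG information for one channel (assumed names/shapes).
interface TimeSlotRecord {
  startTime: string;           // beginning of the time slot (e.g., ISO 8601)
  endTime: string;             // end of the time slot
  programDescription: string;  // program information for the slot
  objectUri?: string;          // optional URI of an object (e.g., HTML document) for the slot
}

interface ChannelGuide {
  channelId: string;
  channelName: string;
  timeSlots: TimeSlotRecord[]; // streaming channels may carry only a single record
}

// Example: a streaming video channel whose guide consists of a single time slot record.
const hitList: ChannelGuide = {
  channelId: "1001",
  channelName: "Hit List",
  timeSlots: [
    {
      startTime: "2020-01-01T00:00:00Z",
      endTime: "2020-12-31T23:59:59Z",
      programDescription: "Today's biggest hits",
      objectUri: "https://example.com/channels/hit-list.html", // hypothetical URL
    },
  ],
};

console.log(hitList.timeSlots[0].objectUri);
```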


The EPG module 202 further functions to display at least some of the EPG information it receives from server 210. EPG module 202 may, before it displays the EPG information, process the information (e.g., format the information). FIG. 3 illustrates an example of EPG information 300 as displayed by EPG module 202 on a display device of user device 190. As shown in FIG. 3, EPG module 202 can provide information to user 102 as to the content that is currently available on each linear channel (as well as content recently shown on each linear channel). As further shown in FIG. 3, at least some of the linear channels are linear “music” channels (i.e., channels 1001, 1002, and 1003). A linear music channel may be a streaming audio channel or a streaming video channel, as is known in the art. As is also known in the art, EPG module 202 can allow user 102 to select for viewing/listening any one of the displayed channels in list 300.


In some embodiments, in response to user 102 selecting one of the streaming audio channels or streaming video channels via the EPG module 202, the EPG module 202 launches the ETS module 204 and provides to the ETS module information indicating that the user 102 desires to consume (e.g., watch or listen to) the selected streaming channel. The ETS module 204 then tunes to the selected streaming channel so that the user can see/hear the content (e.g., music or other content) that is currently being transmitted on the streaming channel and displays a user interface to the user (see, e.g., user interface 400 in FIG. 4).


For example, in some embodiments, when user 102 selects a particular streaming channel via the EPG module 202, EPG module 202 provides to ETS module 204 (which may be a conventional web browser) the URI included in the current time slot record for the selected streaming music channel (i.e., the time slot record that identifies a time slot that includes the present time). In response to receiving the URI from EPG module 202, ETS module 204 obtains the object identified by the URI. For example, ETS module 204 may send to a remote HTTP server 211 an HTTP request comprising the URI and, in response, receives from server 211 the identified object. The object (e.g., HTML document) causes ETS module 204 to display a user interface screen (see, e.g., the user interface shown in FIG. 4) corresponding to the selected music channel. For example, in some embodiments, the object may include URIs that point to additional objects (e.g., images, scripts, videos, JSON objects, XML files, manifest files for streaming content, Flash objects, etc.) hosted by other servers (e.g., servers 212 and 214) and ETS module 204 obtains these additional objects and uses these additional objects in generating the user interface screen.
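
The following TypeScript sketch illustrates, under stated assumptions, how a client-side ETS module might fetch the object identified by the URI and then retrieve the additional objects it references. The helper name, the regex-based extraction of src/href attributes, and the use of the Fetch API are illustrative assumptions; an actual ETS module (e.g., a conventional web browser) would parse and render the document itself.

```typescript
// Sketch only: fetch the channel object and the additional objects it references.
async function loadChannelObject(uri: string): Promise<string[]> {
  const response = await fetch(uri);    // e.g., HTTP request carrying the URI to server 211
  const html = await response.text();   // the identified object (e.g., an HTML document)

  // Collect URIs of additional objects (images, scripts, manifests, etc.) referenced by the object.
  const refs = [...html.matchAll(/(?:src|href)="([^"]+)"/g)].map((m) => m[1]);

  // Fetch each referenced object so it can be used in generating the user interface screen.
  await Promise.all(refs.map((ref) => fetch(new URL(ref, uri))));
  return refs;
}

// Usage (hypothetical URL):
// loadChannelObject("https://example.com/channels/hit-list.html");
```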


Referring now to FIG. 4, FIG. 4 illustrates an example user interface screen 400 generated by the ETS module 204 as a result of the user selecting to watch the Hit List streaming video channel (e.g., as a result of module 204 processing the object associated with the Hit List music streaming video channel—e.g., the object identified in the current time slot record for the Hit List streaming video channel).


Screen 400 includes a channel area 402 (a.k.a., a display area) in which the ETS module displays to the user the video content that is currently being transmitted on the selected streaming video channel (in this case the MC Hit List streaming video channel). User 102 is given the option to expand channel area 402 so that it takes up the entire display screen of the user device by selecting a “go full screen” activatable user interface element 410 (hereafter “button”).


User interface screen 400 also includes an artist list 404 and a video content list 406 listing video content related to an artist selected from the list 404. ETS module 204 enables user 102 to select an artist from list 404. In this particular example, the artist list 404 is in the form of an artist carousel, but the invention is not limited to using a carousel to provide a list, as other techniques can be used, such as a menu (drop down, pop-up, etc.) or other interface element for providing choices to a user. In the example shown, artist carousel 404 comprises a set of pictures (e.g., thumbnail photographs), where each picture identifies an artist (e.g., each picture contains a still or moving image of an artist or otherwise identifies the artist). In the embodiment shown, at any given time, at most only one of the pictures in the set is not obscured and the other pictures are either fully or partially obscured. The artist that is shown in the picture that is not obscured is referred to as the “selected artist.”


When screen 400 is first displayed to user 102, the selected artist will be the artist associated with the content that is currently being transmitted on the selected video channel. In this case, a Taylor Swift music video is currently being transmitted over the Hit List channel. Thus, the selected artist in carousel 404 is Taylor Swift. The user 102 can change the selected artist. For example, the user can change which picture in the carousel 404 will be the unobscured picture by, for example, putting the input focus on the carousel 404 and then pressing a certain button on a remote control 104 (or other input device), thereby changing the selected artist. In some embodiments, the artists that are included in the artist list are the artists whose videos played just prior to the current video. Thus, if a video from Ariana Grande played just before the currently playing Taylor Swift video, then the artist in the carousel directly underneath Taylor Swift would be Ariana Grande.
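
The "artists just played" behavior described above can be summarized in a few lines. The following TypeScript sketch assumes a hypothetical CarouselState shape and update function; it is illustrative only.

```typescript
// Assumed shape: index 0 holds the unobscured ("selected") artist.
interface CarouselState {
  artists: string[];
}

// When a new video begins playing on the channel, the previously selected artist
// (whose video just played) slides down one position and the new video's artist
// becomes the selected artist.
function onVideoChanged(state: CarouselState, nextArtist: string): CarouselState {
  return { artists: [nextArtist, ...state.artists] };
}

// Example: an Ariana Grande video ends and a Taylor Swift video begins playing.
const next = onVideoChanged({ artists: ["Ariana Grande"] }, "Taylor Swift");
console.log(next.artists); // ["Taylor Swift", "Ariana Grande"]
```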


In some embodiments, video content list 406 is a list of video content (e.g., music videos) related to whoever is the selected artist. Thus, when the selected artist is changed, list 406 will also change as list 406 displays a list of videos related to the selected artist. User 102 can select to watch on demand any of the videos included in list 406.


As shown in the example, the list of video content 406 is presented to the user using a set of pictures, where each picture represents a different video. User 102 can select any of the listed videos to watch on demand. For example, with respect to system 100, user 102 can use remote control device 104 to communicate commands to the ETS 197 to cause the ETS to put the input focus on one of the pictures (e.g., highlight one of the pictures), and then, after the desired picture is highlighted, user 102 can send a “watch” command to the ETS (e.g., the user can press a certain button on remote control 104, such as a button labeled “ok” or “select”). In response to receiving the “watch” command, the ETS will initiate a VOD session for the video identified by the selected picture. For example, the ETS may cause user device 190 to transmit to content server 196 (e.g., a VOD server) a video request identifying the selected video, and the server 196 responds to the request by streaming or otherwise providing the requested video to user device 190, which will play the video for user 102. In some embodiments, server 196 streams (or otherwise provides) the requested video to user device 190 by providing to the user device 190 one or more playlist files (e.g., manifest files) that enable the ETS to obtain audio/video data corresponding to the content being transmitted on the channel (e.g., the playlist file identifies a segment of video data corresponding to the video; the ETS, upon receiving the playlist file, sends a request to a content server for the video segment; and, upon receiving the segment of video data, the ETS renders the video data in display area 402). In addition to selecting a video from list 406, user 102 can select to watch on demand the video that is currently playing on the linear channel by selecting button 409.
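
The playlist-based on-demand flow described above is sketched below in TypeScript. The endpoint URL, the plain-text playlist format, and the function names are assumptions made for illustration; an actual implementation would follow content server 196's real manifest format (e.g., HLS or DASH).

```typescript
// Sketch of the on-demand flow: request a playlist, then fetch and render each segment.
// Assumes a simple text playlist that lists one video-segment URL per line.
async function playOnDemand(
  videoId: string,
  render: (segment: ArrayBuffer) => void,
): Promise<void> {
  // 1) Request a playlist (manifest) file for the selected video (hypothetical URL).
  const playlistRes = await fetch(`https://vod.example.com/videos/${videoId}/playlist.m3u8`);
  const playlist = await playlistRes.text();

  // 2) The playlist identifies segments of video data; request each segment in order.
  const segmentUrls = playlist.split("\n").filter((line) => line && !line.startsWith("#"));
  for (const url of segmentUrls) {
    const segment = await (await fetch(url)).arrayBuffer();
    render(segment); // 3) Render the received video data in the display area.
  }
}
```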


Screen 400 may also include other buttons. For example, screen 400 may include a “Streaming Music” button 411, a “Streaming Video” button 412, a “Video on Demand” button 413, a “Top Artists” button 414, an “Artist Just Played” button 415, a “Top Videos” button 416, a “You Watched” button 417, a “Featured” button 418, and a “Search” button 419.



FIG. 5 illustrates a user interface screen 500 that may be displayed when user 102 selects the “Top Artists” button 414 from screen 400. Screen 500 is similar to screen 400 in that screen 500 includes channel area 402 and buttons 409-419. Additionally, screen 500 includes an artist list 504 in the form of an artist carousel and a video content list 506 listing related video content. Artist carousel 504 enables user 102 to select an artist. In the example shown, carousel 504 comprises a set of pictures, where each picture identifies an artist. In the embodiment shown, at any given time, at most only one of the pictures in the set is not obscured and the other pictures are either fully or partially obscured. As described above, the artist that is shown in the picture that is not obscured is referred to as the “selected artist.”


When screen 500 is first displayed to user 102 in response to user 102 activating the “Top Artists” button, the selected artist will be the artist designated as the current “top artist” within a particular category of music (e.g., within the category of music associated with the channel playing at the time the button was selected). In this case, Ariana Grande has been designated as the top artist within the “Hit List” music category. The user can change which picture in the carousel 504 will be the unobscured picture by, for example, putting the input focus on the carousel 504 and then pressing a certain button on a remote control 104 or other input device, thereby changing the selected artist. In some embodiments, the artists that are included in artist list 504 are only artists who have been designated as top artists within the music category (e.g., the artist within the category having the greatest number of video plays within a certain period of time, such as one week).


Like video content list 406, video content list 506 is a list of video content (e.g., music videos) related to whoever is the selected artist in carousel 504. Thus, when the selected artist is changed, list 506 will also change as list 506 displays a list of videos related to the selected artist. User 102 can select to watch on demand any of the videos included in list 506. As shown, the list of video content 506 is presented to the user using a set of pictures, where each picture represents a different video.



FIG. 6 illustrates a user interface screen 600 that may be displayed when user 102 selects to watch a video on demand (e.g., selects a video from a video list, such as list 406 or 506). In this example, user 102 has selected an Ariana Grande video from list 506.


Screen 600 includes a video display area 602 in which the ETS displays to the user the video selected by the user. User 102 is given the option to expand display area 602 so that it takes up the entire display screen of the user device by selecting a “go full screen” button 410.


Screen 600 includes a portion 604 for displaying information related to the artist performing in the selected video (e.g., in this case a picture of the artist). Screen 600 also includes a video content list 606 listing other videos by the selected artist. User 102 can select to watch on demand any of the videos included in list 606.


In addition to including buttons 410-419, screen 600 also includes a “Restart” button 603, a “Back” button 604, and a “More From This Artist” button 606. Activating Restart button 603 causes the ETS to replay the current video from its beginning. Activating Back button 604 will cause the ETS to play the streaming channel that was last played. For example, in this case the Hit List streaming video channel was the last played streaming channel; thus activating button 604 will cause the ETS to display screen 400. Activating button 606 causes the ETS to display on screen 600 additional videos by the artist of the currently playing video (Ariana Grande in this example).



FIG. 7 illustrates a user interface screen 700 that may be displayed when user 102 selects “Top Videos” button 416 from screen 400. Screen 700 has many of the same elements as screen 400 but does not include the artist carousel 404. Instead, screen 700 includes a video content list 706 listing a set of videos. The listed videos are the top played videos within the music category of the currently playing streaming channel (i.e., Hit List in this example). User 102 can select to watch on demand any of the videos included in list 706.



FIG. 8 illustrates a user interface screen 800 that may be displayed when user 102 selects “You Watched” button 417 from screen 400. Screen 800 has many of the same elements as screen 400 but does not include the artist carousel 404. Instead, screen 800 includes a video content list 806 listing a set of videos. The listed videos are the most recent videos that user 102 has watched. User 102 can select to watch on demand any of the videos included in list 806.



FIG. 9 illustrates a user interface screen 900 that may be displayed when user 102 selects “Featured” button 418 from screen 400. Screen 900 is similar to screen 600. For example, screen 900 includes video display area 602 in which the ETS displays to the user a selected video. A difference between screen 900 and screen 600 is that the video that plays in video display area 602 on screen 600 is a user-selected video, whereas the video that plays in video display area 602 on screen 900 is a system-selected video (a.k.a., “the featured video”). In this example, the featured video is an episode of a show named “Chronicles.” Also, unlike screen 600, screen 900 includes an artist carousel 904.


Artist carousel 904 is just like carousels 404 and 504 (i.e., carousel 904 enables user 102 to select an artist in the manner described above with reference to carousel 404). The artists listed in artist carousel 904 are the artists who are featured or mentioned in the video that is playing in display area 602. In this example, Drake was one of the artists featured in the Chronicles episode that is playing in area 602. Screen 900 also includes a video content list 906, which is a list of video content (e.g., music videos) related to whoever is the selected artist in carousel 904. User 102 can select to watch on demand any of the videos included in list 906.



FIG. 10 illustrates a user interface screen 1000 that may be displayed when user 102 selects “Streaming Videos” button 412 from screen 400. Screen 1000 is similar to screen 400. For example, screen 1000 includes channel area 402 in which the ETS displays to the user the video content that is currently being transmitted on the selected streaming video channel (in this case the MC Hit List streaming video channel).


A difference between screen 1000 and screen 400 is that the artist carousel 404 is replaced with a channel group carousel 1004 for enabling user 102 to select a channel group (e.g., “Today's Music”) and video list 406 is replaced with streaming channel list 1006 listing the streaming channels that are included in the selected channel group. User 102 can select a channel group in carousel 1004 in the same manner the user selects an artist in carousel 404. When user 102 selects a new channel group, the streaming channels displayed in list 1006 will change such that only those streaming channels included in the selected channel group are displayed. When user 102 selects a streaming video channel from list 1006 (e.g., when a user clicks on a channel or highlights a channel from the list and presses an “ok” button), screen 400 is displayed and the ETS will play in channel area 402 the video content that is currently being transmitted on the selected streaming video channel. That is, selecting a streaming channel from list 1006 has the same effect as selecting a streaming channel via the EPG.
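
The channel-group filtering behavior of list 1006 can be expressed as a simple filter. The following TypeScript sketch uses assumed data shapes (StreamingChannel, group names) purely for illustration.

```typescript
// Assumed shape of a streaming channel entry and its group membership.
interface StreamingChannel {
  name: string;
  group: string;
}

// Only channels in the selected channel group are displayed in list 1006.
function channelsForGroup(all: StreamingChannel[], selectedGroup: string): StreamingChannel[] {
  return all.filter((ch) => ch.group === selectedGroup);
}

const channels: StreamingChannel[] = [
  { name: "Hit List", group: "Today's Music" },
  { name: "Pop Hits", group: "Today's Music" },
  { name: "Classic Rock", group: "Rock" },
];

console.log(channelsForGroup(channels, "Today's Music").map((c) => c.name)); // ["Hit List", "Pop Hits"]
```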



FIG. 11 illustrates a user interface screen 1100 that may be displayed when user 102 selects to listen to a streaming audio channel. User 102 may select a streaming audio channel from either list 1006 or via the EPG module 202 as described above. Screen 1100 includes a content display area 1102 for displaying content (e.g., images, artist trivia, etc.) associated with the music content that is currently playing on the selected streaming audio channel. In the example shown, there are no videos related to the music content that is currently playing on the selected streaming audio channel (e.g., there are no videos related to the artist of the music content). Accordingly, screen 1100 does not include a video selection list. If one or more videos were related to the music content, then screen 1100 would include a video selection list containing the videos, like lists 406, 506.



FIG. 12 illustrates a user interface screen 1200 that may be displayed when user 102 selects “Search” button 419 from screen 400. Screen 1200 includes a text input box 1202 into which a user can input a search query (i.e., a string of characters). In the embodiment shown, user 102 inputs the search query by selecting characters (e.g., letters, numbers, etc.) from a character bar 1204 that displays, among other things, the English alphabet. When a character is selected, the character is appended to the search query shown in input box 1202. The ETS may be configured to perform a dynamic search. That is, as characters are added to the search query, the ETS performs a search using the search query and displays matching results underneath box 1202. The results of the search can include the names of music videos as well as artist names. In the example shown, the artist name “will.i.am” is one of the search results that matches the search query “Will” and the user has highlighted this search result. If the user selects the “will.i.am” search result (e.g., highlights the search result and then presses an “OK” button on remote control 104 or other input device), then the ETS may display a list of music videos related to the selected artist. For example, screen 700 may be displayed where list 706 includes only videos related to the selected artist.
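
The dynamic search described above can be sketched as follows in TypeScript. The in-memory catalog and the case-insensitive substring match are assumptions made to keep the example self-contained; an actual ETS would typically send the growing query to a search server.

```typescript
// Assumed local catalog of artist and video names used only for this example.
const catalog = ["will.i.am", "Willie Nelson", "Taylor Swift", "Ariana Grande"];

let query = "";

// Each time a character is selected from the character bar, append it to the query
// and re-run the search so the matching results can be shown underneath the input box.
function onCharacterSelected(ch: string): string[] {
  query += ch;
  const q = query.toLowerCase();
  return catalog.filter((name) => name.toLowerCase().includes(q));
}

// Typing "W", "i", "l", "l" progressively narrows the displayed results.
for (const ch of "Will") {
  console.log(`"${query + ch}" ->`, onCharacterSelected(ch));
}
```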


As discussed above, when user 102 activates a button on one of the user interface screens described above, the ETS typically changes the user interface in response. More specifically, in some embodiments, user activation of such a button causes ETS module 204 to transmit to a server (e.g., server 211, 212, or 214) a message comprising information associated with the selected button (e.g., a button identifier). The message may also contain information identifying the streaming channel to which the ETS is tuned. In response to receiving this message, the server may use information in the message to retrieve from a database (e.g., database 215) information related to the activated button (e.g., a list of the videos the user 102 recently watched). The server then provides this information to the ETS module 204, which may then display the information and/or use the information to obtain further objects (e.g., images) to display.
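
The button-activation message described above might be structured as follows. This TypeScript sketch assumes hypothetical field names (buttonId, channelId) and an endpoint URL; it is not the actual message format used by servers 211, 212, or 214.

```typescript
// Assumed message carrying a button identifier and the channel the ETS is tuned to.
interface ButtonActivationMessage {
  buttonId: string;   // e.g., "you-watched" for button 417 (illustrative identifier)
  channelId?: string; // streaming channel to which the ETS is currently tuned
}

async function onButtonActivated(msg: ButtonActivationMessage): Promise<unknown> {
  const res = await fetch("https://ets.example.com/button-activation", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(msg),
  });
  // The response (e.g., a list of recently watched videos) drives what the ETS module
  // displays next and which further objects (e.g., images) it fetches.
  return res.json();
}
```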


While various embodiments of the present disclosure are described herein, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the disclosure unless otherwise indicated herein or otherwise clearly contradicted by context.

Claims
  • 1. A method for providing an enhanced television service to a user of a user device in communication with a television system, the method comprising: receiving information indicating that the user desires to consume a selected programmed linear music channel; and after receiving the information, displaying on a display device of the user device a user interface screen, wherein the user interface screen comprises: i) a first display area for displaying a graphic image associated with music content transmitted by the television system on the selected programmed linear music channel in accordance with a music content schedule for the selected programmed linear music channel, ii) a second display area for displaying a set of graphic images; displaying, in the first display area, a first graphic image associated with first scheduled music content transmitted by the television system on the selected programmed linear music channel in accordance with the music content schedule for the selected programmed linear music channel; while displaying the first graphic image in the first display area, displaying in the second display area a first set of graphic images; after the first scheduled music content has ended: 1) automatically displaying, in the first display area, a second graphic image associated with second scheduled music content transmitted by the television system on the selected programmed linear music channel in accordance with the music content schedule for the selected programmed linear music channel and 2) automatically displaying in the second display area a second set of graphic images, wherein each graphic image included in the second set of graphic images is associated with a different video; while displaying the second graphic image in the first display area and the second set of graphic images in the second display area, receiving a user input indicating that the user has selected one of the graphic images included in the second set of graphic images; after receiving the user input, causing the video associated with the selected graphic image to be streamed on-demand to the user device; while displaying the first graphic image in the first display area, displaying an ordered artist list for enabling the user to select an artist, wherein the ordered artist list comprises an ordered set of at least two artist images including a first artist image followed by a second artist image, each artist image included in the set identifies an artist, and either i) the first artist image identifies the artist whose music content played immediately prior to the first scheduled music content or ii) the second artist image identifies the artist whose music content played immediately prior to the first scheduled music content; as a result of first scheduled music content ending, automatically adding to the artist list an artist image that identifies either i) the artist who performed the first scheduled music content or ii) the artist who performed the second scheduled music content.
  • 2. The method of claim 1, wherein the first set of graphic images includes at least two graphic images.
  • 3. The method of claim 2, wherein the first set of graphic images includes at least four graphic images, and the first set of graphic images is displayed in the second display area in a grid pattern having at least two rows and two columns.
  • 4. The method of claim 1, wherein each graphic image included in the first set of graphic images is associated with a different video associated with the artist identified by the first artist image, and each graphic image included in the second set of graphic images is associated with a different video associated with the artist identified by the third artist image.
  • 5. The method of claim 1, wherein causing the video to be streamed on-demand to the user device comprises: a) the user device transmitting to a content server a request for a playlist file associated with the video; b) the user device receiving the playlist file, the playlist file identifying a segment of video data corresponding to the video; c) the user device transmitting to a content server a request for the segment of video data corresponding to the video; d) the user device receiving the segment of video data; and e) the user device rendering the segment of video data in the first display area.
  • 6. The method of claim 1, wherein, prior to automatically adding to the artist list the third artist image, the step of displaying the artist list comprises displaying only the first artist image.
  • 7. The method of claim 6, wherein the step of displaying the artist list comprises displaying only the third artist image as a result of automatically adding to the artist list the third artist image.
  • 8. The method of claim 6, further comprising receiving a user input with respect to the artist list, wherein, after receiving the user input, the step of displaying the artist list comprises displaying only the second artist image in response to receiving the user input.
  • 9. The method of claim 1, wherein the artist list is in the form of an artist carousel.
  • 10. A method for providing an enhanced television service to a user of a user device in communication with a television system, the method comprising: receiving information indicating that the user desires to consume a selected programmed linear music channel; and after receiving the information, displaying on a display device of the user device a user interface screen, wherein the user interface screen comprises: i) a first display area for displaying a first graphic image associated with first scheduled music content transmitted by the television system on the selected programmed linear music channel in accordance with a music content schedule for the selected programmed linear music channel, and ii) a second display area for displaying a set of graphic images; while displaying the first graphic image: 1) displaying in the second display area a first set of graphic images, wherein each graphic image included in the first set of graphic images is associated with a different video and 2) displaying an artist carousel for enabling the user to select an artist, wherein the artist carousel comprises a first artist image that identifies an artist; when the first scheduled music content has ended: 1) automatically displaying, in the first display area, a second graphic image associated with second scheduled music content transmitted by the television system on the selected programmed linear music channel in accordance with the music content schedule for the selected programmed linear music channel; 2) automatically adding a second artist image to the artist carousel, wherein the second artist image either i) identifies the artist of the first scheduled music content or ii) identifies the artist of the second scheduled music content; and 3) automatically displaying in the second display area a second set of graphic images, wherein each graphic image included in the second set of graphic images is associated with a different video; and after displaying in the second display area the second set of graphic images and in response to receiving a user input directed to the artist carousel: a) displaying the first artist image so that the first artist image is not obscured; and b) replacing, in the second display area, the second set of graphic images with the first set of graphic images such that the first set of graphic images are displayed in the second display area in place of the second set of graphic images.
  • 11. The method of claim 10, wherein the first set of graphic images includes at least two graphic images.
  • 12. The method of claim 11, wherein the first set of graphic images includes at least four graphic images, and the first set of graphic images is displayed in the second display area in a grid pattern having at least two rows and two columns.
  • 13. The method of claim 11, wherein each graphic image included in the second set of graphic images is associated with a different video associated with the artist identified by the second artist image.
  • 14. A user device for providing an enhanced television service to a user, the user device comprising: a user input detector for receiving information indicating that the user desires to consume a selected programmed linear music channel; and a computer system comprising one or more processors, wherein the computer system is configured to: display, on a display device, a user interface screen, wherein the user interface screen comprises: i) a first display area for displaying a graphic image associated with music content transmitted by the television system on the selected programmed linear music channel in accordance with a music content schedule for the selected programmed linear music channel, ii) a second display area for displaying a set of graphic images; display, in the first display area, a first graphic image associated with first music content transmitted by the television system on the selected programmed linear music channel in accordance with a music content schedule for the selected programmed linear music channel; while displaying the first graphic image: 1) display in the second display area a first set of graphic images and 2) display an artist list for enabling the user to select an artist, wherein the artist list comprises a first artist image that identifies an artist, and further wherein each graphic image included in the first set of graphic images is associated with a different video; when the first music content has ended: 1) automatically display, in the first display area, a second graphic image associated with second music content transmitted by the television system on the selected programmed linear music channel in accordance with a music content schedule for the selected programmed linear music channel; 2) automatically add to the displayed artist list a second artist image such that the second artist image is not obscured, wherein the second artist image either i) identifies the artist of the first scheduled music content or ii) identifies the artist of the second scheduled music content; and 3) automatically display in the second display area a second set of graphic images, wherein each graphic image included in the second set of graphic images is associated with a different video; and after receiving a user input indicating that the user has selected one of the graphic images displayed in the second display area, cause the video associated with the selected graphic image to be streamed on-demand to the user device.
  • 15. The user device of claim 14, wherein each graphic image included in the first set of graphic images is associated with a different video associated with the artist identified by the first artist image, and each graphic image included in the second set of graphic images is associated with a different video associated with the artist identified by the second artist image.
  • 16. The user device of claim 14, wherein the user device is configured to cause the video to be streamed on-demand to the user device by performing a process that includes: a) the user device transmitting to a content server a request for a playlist file associated with the video; b) the user device receiving the playlist file, the playlist file identifying a segment of video data corresponding to the video; c) the user device transmitting to a content server a request for the segment of video data corresponding to the video; d) the user device receiving the segment of video data; and e) the user device rendering the segment of video data in the first display area.
  • 17. A television system for providing an enhanced television service to a user, the television system comprising: a receiver for receiving information indicating that the user desires to consume a selected programmed linear music channel; and a computer system comprising one or more processors, wherein the computer system is configured to: cause a user device to display, on a display device, a user interface screen, wherein the user interface screen comprises: i) a first display area displaying a first graphic image associated with first music content transmitted by the television system on the selected programmed linear music channel in accordance with a music content schedule for the selected programmed linear music channel, ii) a second display area for displaying a set of graphic images; cause the user device to display in the second display area while the first graphic image is displayed in the first display area a first set of graphic images, wherein each graphic image included in the first set of graphic images is associated with a different video; cause the user device to display an artist list while the first graphic image is displayed in the first display area, wherein the artist list comprises a first artist image that identifies an artist; when the first music content has ended: 1) automatically cause the user device to display, in the first display area, a second graphic image associated with second music content transmitted by the television system on the selected programmed linear music channel in accordance with a music content schedule for the selected programmed linear music channel; 2) automatically cause the user device to add to the displayed artist list a second artist image, wherein the second artist image either i) identifies the artist of the first scheduled music content or ii) identifies the artist of the second scheduled music content; and 3) automatically cause the user device to display a second set of graphic images, wherein each graphic image included in the second set of graphic images is associated with a different video.
  • 18. The television system of claim 17, wherein each graphic image included in the first set of graphic images is associated with a different video associated with the artist identified by the first artist image, and each graphic image included in the second set of graphic images is associated with a different video associated with the artist identified by the second artist image.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 15/677,236, filed on Aug. 15, 2017 (status pending), which is a continuation of U.S. application Ser. No. 15/616,564, filed on Jun. 7, 2017 (status abandoned), which is a continuation of U.S. application Ser. No. 14/922,597, filed on Oct. 26, 2015 (status abandoned), which claims the benefit of U.S. provisional application No. 62/068,374, filed on Oct. 24, 2014. The above identified applications are incorporated by reference.

US Referenced Citations (340)
Number Name Date Kind
4127796 Henderson Nov 1978 A
RE29997 Den Toonder May 1979 E
4336478 Quilty et al. Jun 1982 A
4338623 Asmus et al. Jul 1982 A
4360805 Andrews et al. Nov 1982 A
4677430 Falkman et al. Jun 1987 A
4722005 Ledenbach Jan 1988 A
4760455 Nagashima Jul 1988 A
4799156 Shavit et al. Jan 1989 A
4823386 Dumbauld Apr 1989 A
5027400 Reimer et al. Jun 1991 A
5130615 George Jul 1992 A
5193006 Yamazaki Mar 1993 A
5235680 Bijangte Aug 1993 A
5315448 Ryan May 1994 A
5341350 Frank et al. Aug 1994 A
5355302 Martin et al. Oct 1994 A
5365381 Scheffler Nov 1994 A
5371551 Logan et al. Dec 1994 A
5418654 Scheffler May 1995 A
5420838 Maeda et al. May 1995 A
5481296 Cragun et al. Jan 1996 A
5534911 Levitan Jul 1996 A
5550863 Yurt et al. Aug 1996 A
5557541 Schulhof et al. Sep 1996 A
5559949 Reimer Sep 1996 A
5561709 Reimer et al. Oct 1996 A
5572442 Schulhof et al. Nov 1996 A
5585866 Miller et al. Dec 1996 A
5590282 Clynes Dec 1996 A
5592511 Schoen et al. Jan 1997 A
5596373 White et al. Jan 1997 A
5616876 Cluts Apr 1997 A
5617565 Augenbraun et al. Apr 1997 A
5629867 Goldman May 1997 A
5635989 Rothmuller Jun 1997 A
5636276 Brugger Jun 1997 A
5646992 Subler Jul 1997 A
5675734 Hair Oct 1997 A
5708780 Levergood et al. Jan 1998 A
5721815 Ottesen et al. Feb 1998 A
5726909 Krikorian Mar 1998 A
5734719 Tsevdos et al. Mar 1998 A
5734853 Hendricks et al. Mar 1998 A
5734961 Castille Mar 1998 A
5751282 Girard et al. May 1998 A
5751806 Ryan May 1998 A
5752160 Dunn May 1998 A
5753844 Matsumoto May 1998 A
5754939 Herz et al. May 1998 A
5761606 Wolzien Jun 1998 A
5761607 Gudesen Jun 1998 A
5761662 Dasan Jun 1998 A
5771435 Brown Jun 1998 A
5777997 Kahn Jul 1998 A
5781889 Martin et al. Jul 1998 A
5784095 Robbins et al. Jul 1998 A
5787090 Van Niekerk et al. Jul 1998 A
5790935 Payton Aug 1998 A
5793980 Glaser et al. Aug 1998 A
5808223 Kurakake et al. Sep 1998 A
5809144 Sirbu et al. Sep 1998 A
5809246 Goldman Sep 1998 A
5815634 Reimer et al. Sep 1998 A
5818935 Maa Oct 1998 A
5819049 Reietmann Oct 1998 A
5819160 Foladare et al. Oct 1998 A
5835487 Campanella Nov 1998 A
5841979 Schulhof et al. Nov 1998 A
5848398 Martin et al. Dec 1998 A
5861906 Dunn et al. Jan 1999 A
5878141 Daly et al. Mar 1999 A
5890137 Koreeda Mar 1999 A
5890139 Suzuki et al. Mar 1999 A
5899699 Kamiya May 1999 A
5899980 Wilf et al. May 1999 A
5900830 Scheffler May 1999 A
5905865 Palmer et al. May 1999 A
5913204 Kelly Jun 1999 A
5918012 Astiz et al. Jun 1999 A
5918213 Bernard et al. Jun 1999 A
5926624 Katz et al. Jul 1999 A
5930765 Martin et al. Jul 1999 A
5930768 Hooban Jul 1999 A
5931901 Wolfe et al. Aug 1999 A
5933500 Blatter et al. Aug 1999 A
5943422 Van Wie et al. Aug 1999 A
5944608 Reed et al. Aug 1999 A
5959945 Kleiman Sep 1999 A
5960411 Hartman et al. Sep 1999 A
5968120 Guedalia Oct 1999 A
5969283 Looney et al. Oct 1999 A
5970474 LeRoy et al. Oct 1999 A
5973722 Wakai et al. Oct 1999 A
5980261 Mino et al. Nov 1999 A
5986692 Logan et al. Nov 1999 A
5991374 Hazenfield Nov 1999 A
5991737 Chen Nov 1999 A
6011761 Inoue Jan 2000 A
6011854 Van Ryzin Jan 2000 A
6020883 Herz et al. Feb 2000 A
6021432 Sizer et al. Feb 2000 A
6025868 Russo Feb 2000 A
6038591 Wolfe et al. Mar 2000 A
6055314 Spies et al. Apr 2000 A
6055560 Mills et al. Apr 2000 A
6055566 Kikinis Apr 2000 A
6069655 Seeley et al. May 2000 A
6085235 Clarke et al. Jul 2000 A
6088455 Logan et al. Jul 2000 A
6088722 Herz et al. Jul 2000 A
6105060 Rothblatt Aug 2000 A
6111882 Yamamoto Aug 2000 A
6135646 Kahn et al. Oct 2000 A
6141488 Knudson et al. Oct 2000 A
6151634 Glaser et al. Nov 2000 A
6154772 Dunn et al. Nov 2000 A
6161142 Wolfe et al. Dec 2000 A
6175840 Chen et al. Jan 2001 B1
6182126 Nathan et al. Jan 2001 B1
6188830 Mercs et al. Feb 2001 B1
6192340 Abecassis Feb 2001 B1
6223292 Dean et al. Apr 2001 B1
6226030 Harvey et al. May 2001 B1
6226618 Downs et al. May 2001 B1
6229895 Son et al. May 2001 B1
6232539 Looney et al. May 2001 B1
6233389 Barton et al. May 2001 B1
6233682 Fritsch May 2001 B1
6240553 Son et al. May 2001 B1
6243725 Hempleman et al. Jun 2001 B1
6246672 Lumelsky Jun 2001 B1
6248946 Dwek Jun 2001 B1
6249810 Kiraly Jun 2001 B1
6253235 Estes Jun 2001 B1
6253237 Story et al. Jun 2001 B1
6262772 Shen et al. Jul 2001 B1
6263505 Walker et al. Jul 2001 B1
6271455 Ishigaki et al. Aug 2001 B1
6279040 Ma et al. Aug 2001 B1
6286139 Decinque Sep 2001 B1
6289165 Abecassis Sep 2001 B1
6305020 Horaty et al. Oct 2001 B1
6317784 Mackintosh et al. Nov 2001 B1
6324217 Gordon Nov 2001 B1
6330595 Ullman et al. Dec 2001 B1
6330609 Garofalakis et al. Dec 2001 B1
6338044 Cook et al. Jan 2002 B1
6341375 Watkins Jan 2002 B1
6349339 Williams Feb 2002 B1
6351469 Otani et al. Feb 2002 B1
6360368 Chawla Mar 2002 B1
6366791 Lin et al. Apr 2002 B1
6369851 Marflak et al. Apr 2002 B1
6378129 Zetts Apr 2002 B1
6385596 Wiser et al. May 2002 B1
6389467 Eyal May 2002 B1
6393430 Van Ryzin May 2002 B1
6418421 Hurtado et al. Jul 2002 B1
6434621 Pezzillo et al. Aug 2002 B1
6434747 Khoo et al. Aug 2002 B1
6445306 Trovato Sep 2002 B1
6446080 Van Ryzin et al. Sep 2002 B1
6446130 Grapes Sep 2002 B1
6448987 Easty et al. Sep 2002 B1
6452609 Katinsky et al. Sep 2002 B1
6473792 Yavitz et al. Oct 2002 B1
6481012 Gordon et al. Nov 2002 B1
6490728 Kitazato et al. Dec 2002 B1
6502137 Peterson et al. Dec 2002 B1
6505240 Blumenau Jan 2003 B1
6507727 Henrick Jan 2003 B1
6526411 Ward Feb 2003 B1
6550011 Sims, III Apr 2003 B1
6580870 Kanazawa et al. Jun 2003 B1
6587127 Leeke et al. Jul 2003 B1
6587837 Spagna et al. Jul 2003 B1
6597891 Tantawy et al. Jul 2003 B2
6631522 Erdelyi Oct 2003 B1
6637032 Reimer et al. Oct 2003 B1
6694090 Lewis et al. Feb 2004 B1
6704491 Revis Mar 2004 B1
6718551 Swix et al. Apr 2004 B1
6748427 Drosset et al. Jun 2004 B2
6766357 Fandozzi Jul 2004 B1
6766528 Kim et al. Jul 2004 B1
6782550 Cao Aug 2004 B1
6785707 Teeple Aug 2004 B2
6789106 Eyer Sep 2004 B2
6792280 Hori et al. Sep 2004 B1
6792615 Rowe et al. Sep 2004 B1
6795711 Sivula Sep 2004 B1
6796555 Blahut Sep 2004 B1
6834308 Ikezoye et al. Dec 2004 B1
6842604 Cook Jan 2005 B1
6856550 Kato et al. Feb 2005 B2
6865550 Cok Mar 2005 B1
6868440 Gupta et al. Mar 2005 B1
6898800 Son et al. May 2005 B2
6915529 Suematsu et al. Jul 2005 B1
6925489 Curtin Aug 2005 B1
6928655 Omoigui Aug 2005 B1
6933433 Porteus et al. Aug 2005 B1
6952221 Holtz et al. Oct 2005 B1
6965770 Walsh et al. Nov 2005 B2
6978310 Rodriguez et al. Dec 2005 B1
6985694 De Bonet Jan 2006 B1
7020888 Reynolds et al. Mar 2006 B2
7024678 Gordon et al. Apr 2006 B2
7028082 Rosenberg et al. Apr 2006 B1
7062272 Grilli et al. Jun 2006 B2
7065287 Heredia et al. Jun 2006 B1
7073189 McElhatten et al. Jul 2006 B2
7076561 Rosenberg et al. Jul 2006 B1
7111099 Alexander et al. Sep 2006 B2
7133924 Rosenberg et al. Nov 2006 B1
7140032 Dew et al. Nov 2006 B2
7149471 Arisawa et al. Dec 2006 B1
7155674 Breen et al. Dec 2006 B2
7181538 Tam et al. Feb 2007 B2
7207006 Feig et al. Apr 2007 B1
7249186 Sitaraman et al. Jul 2007 B1
7281035 Ihara et al. Oct 2007 B2
7293275 Krieger et al. Nov 2007 B1
7302253 Moody et al. Nov 2007 B2
7305698 Tanigawa et al. Dec 2007 B1
7320025 Steinberg et al. Jan 2008 B1
7321923 Rosenberg et al. Jan 2008 B1
7325043 Rosenberg et al. Jan 2008 B1
7325245 Clapper Jan 2008 B1
7343179 Theis et al. Mar 2008 B1
7464394 Gordon et al. Dec 2008 B1
7555539 Rosenberg et al. Jun 2009 B1
7600686 Morris Oct 2009 B2
7617295 Farber et al. Nov 2009 B1
7668538 Rosenberg et al. Feb 2010 B2
7711838 Boulter et al. May 2010 B1
7735106 LaRocca et al. Jun 2010 B2
7869580 Tagawa et al. Jan 2011 B2
7870592 Hudson et al. Jan 2011 B2
7962572 Farber et al. Jun 2011 B1
7986977 Rosenberg et al. Jul 2011 B2
8024766 Addington Sep 2011 B2
8060055 Huang Nov 2011 B2
8098811 Singh Jan 2012 B2
8112494 Maghraby Feb 2012 B2
8166133 Steinberg et al. Apr 2012 B1
8170194 Shen et al. May 2012 B2
8245269 Schiller Aug 2012 B2
8260271 Rosenberg et al. Sep 2012 B2
8265237 Reynolds et al. Sep 2012 B2
8291452 Yong et al. Oct 2012 B1
8381252 Young Feb 2013 B2
8526579 Thomas Sep 2013 B2
8533175 Roswell Sep 2013 B2
8639228 Rosenberg et al. Jan 2014 B2
8677416 Arora Mar 2014 B2
8677430 Mitsuji et al. Mar 2014 B2
8700795 Boulter et al. Apr 2014 B2
8868481 Wei et al. Oct 2014 B2
9197937 Rosenberg Nov 2015 B1
9351045 Steinberg et al. May 2016 B1
9414121 Farber et al. Aug 2016 B1
10390092 Farber Aug 2019 B1
10390093 Rosenberg Aug 2019 B1
20010025259 Rouchon Sep 2001 A1
20010032312 Runje et al. Oct 2001 A1
20010042107 Palm Nov 2001 A1
20010044851 Rothman et al. Nov 2001 A1
20010049826 Wilf Dec 2001 A1
20020002039 Qureshey et al. Jan 2002 A1
20020021708 Ishiai Feb 2002 A1
20020023163 Frelechoux et al. Feb 2002 A1
20020023164 Lahr Feb 2002 A1
20020023166 Bar-Noy et al. Feb 2002 A1
20020032019 Marks et al. Mar 2002 A1
20020032728 Sako et al. Mar 2002 A1
20020038359 Ihara et al. Mar 2002 A1
20020042913 Ellis et al. Apr 2002 A1
20020046084 Steele et al. Apr 2002 A1
20020056117 Hasegawa et al. May 2002 A1
20020056118 Hunter et al. May 2002 A1
20020058521 Yamada et al. May 2002 A1
20020059621 Thomas et al. May 2002 A1
20020062261 Mukai May 2002 A1
20020071658 Marko et al. Jun 2002 A1
20020073425 Arai et al. Jun 2002 A1
20020078456 Hudson et al. Jun 2002 A1
20020083148 Shaw et al. Jun 2002 A1
20020087402 Zustak Jul 2002 A1
20020091762 Sohn et al. Jul 2002 A1
20020108115 Palmer Aug 2002 A1
20020124255 Reichardt et al. Sep 2002 A1
20020138630 Solomon et al. Sep 2002 A1
20020143782 Headings et al. Oct 2002 A1
20020152278 Pontenzone et al. Oct 2002 A1
20020161797 Gallo et al. Oct 2002 A1
20020161909 White Oct 2002 A1
20020174430 Ellis et al. Nov 2002 A1
20020194260 Headley et al. Dec 2002 A1
20020194607 Connelly Dec 2002 A1
20020194619 Chang et al. Dec 2002 A1
20030023975 Schrader et al. Jan 2003 A1
20030028505 O'Rourke et al. Feb 2003 A1
20030050058 Walsh et al. Mar 2003 A1
20030050837 Kim Mar 2003 A1
20030097338 Mankovich et al. May 2003 A1
20030120500 Deeds et al. Jun 2003 A1
20030126595 Sie Jul 2003 A1
20030135464 Mourad et al. Jul 2003 A1
20030153302 Lewis et al. Aug 2003 A1
20030162571 Chung Aug 2003 A1
20030182184 Strasnick et al. Sep 2003 A1
20030188313 Ellis et al. Oct 2003 A1
20030192060 Levy Oct 2003 A1
20040255336 Logan et al. Dec 2004 A1
20050060745 Riedl et al. Mar 2005 A1
20050278761 Gonder et al. Dec 2005 A1
20050289588 Kinnear Dec 2005 A1
20060026639 Potrebic et al. Feb 2006 A1
20060173974 Tang Aug 2006 A1
20060194626 Anttila Aug 2006 A1
20060199575 Moore et al. Sep 2006 A1
20060235723 Millard Oct 2006 A1
20070060112 Reimer Mar 2007 A1
20070143493 Mullig Jun 2007 A1
20070168429 Apfel et al. Jul 2007 A1
20080086742 Aldrey et al. Apr 2008 A1
20080195970 Rechsteiner Aug 2008 A1
20090002335 Chaudhri et al. Jan 2009 A1
20090028331 Millar et al. Jan 2009 A1
20090210905 Maruyama et al. Aug 2009 A1
20090327894 Rakib et al. Dec 2009 A1
20100119208 Davis et al. May 2010 A1
20120096499 Dasher et al. Apr 2012 A1
20120158524 Hintz et al. Jun 2012 A1
20130332962 Moritz et al. Dec 2013 A1
20140122593 Bachman et al. May 2014 A1
20150193516 Harb Jul 2015 A1
20150312532 Beaumier Oct 2015 A1
Foreign Referenced Citations (11)
Number Date Country
1022900 Jul 2000 EP
1997037492 Oct 1997 WO
1999010822 Mar 1999 WO
1999017230 Apr 1999 WO
1999039466 Aug 1999 WO
1999048296 Sep 1999 WO
2000007368 Feb 2000 WO
2000019662 Apr 2000 WO
2001036064 May 2001 WO
2001038993 May 2001 WO
2001079964 Oct 2001 WO
Non-Patent Literature Citations (172)
Entry
Final Office Action issued in U.S. Appl. No. 15/485,417 dated Apr. 18, 2019, 10 pages.
Non-Final Office Action issued in U.S. Appl. No. 15/485,417 dated Dec. 14, 2018, 14 pages.
Amended Invalidity Contentions case No. 2:16-cv-586-JRG-RSP (Apr. 13, 2017), 613 pages.
Dougherty, Advertising Music Channel on Cable TV, The New York Times, Jun. 1981, 2 pages.
Yarrow, Cable TV Moves to the Music, The New York Times, Jul. 1982, 2 pages.
CFT2200 User Guide, General Instrument 1999, 63 pages.
Information Disclosure Statement, Dec. 2006, 3 pages.
DR500 User Guide for the DMX Digital Audio Satellite Receiver, DMX, Rev. C (Oct. 1994), 47 pages.
Michaels, F., WBEB Philly Extends Its Reach via Site, Billboard, 88 (Sep. 30, 2000) (“Billboard”), 2 pages.
Dely, L., WBEB Live Links Web and On-air Ads, RadioWorld.com (May 15, 2000) (“RadioWorld”) (available at http://www.radioworld.com/news-and-business/0002/wbeb-live-links-web-and-onair-ads/304743), 7 pages.
Kerschbaumer, K., Philly FM creates novel Web future, BroadcastingCable.com (Jun. 4, 2000) (“Broadcasting Cable”) (available at http://www.broadcastingcable.com/news/news-articles/philly-fm-creates-novel-web-future/86828), 6 pages.
Stingray Digital Group Answer to the Third Amended Complaint (Apr. 7, 2017), 230 pages.
AudioSense Corporation, Have you Seen Radio Lately, 6 pages.
RadioWave.com,Inc., “It's on-line . . .It's Interactive . . . It's the next wave of radio!” (1998), 2 pages.
RadioWave.com,Inc., “It's on-line . . . It's Interactive . . . It's the next wave of radio!,” Install Disk (1998), 2 pages.
ClickZ, RadioWave.com and Enco Systems Establish Alliance, Oct. 19, 1998, 1 page.
Lyster, “Motorola Unit 'Fine-Tuning Internet Radio,” Investor's Business Daily, Nov. 25, 1998, 1 page.
Hiber, “Internet Radio Ratings Coming Soon From Arbitron, RadioWave.com,” Radio@Large (Dec. 1998), 1 page.
Gavin.com, “Today's Highlights,” www.gavin.com/index.shtml (Dec. 25, 1998), 1 page.
Gavin.com, “Secure Digital Music Initiative Begins Portable Device Working Group,” www.gavin.com/news/990305/sdmi.shtml (Mar. 5, 1999), 2 pages.
SEC Form S-1, Broadcast.com Inc. Part 1 (May 1998), 176 pages.
SEC Form S-1, Broadcast.com Inc. Part 2 (May 1998), 175 pages.
Prospectus, Broadcast.com (Jul. 16, 1998), 98 pages.
IPR2017-00888 Patent Owner Preliminary Response, (Jun. 2017), 48 pages.
IPR2017-00888 Institution Decision, (Sep. 2017), 24 pages.
IPR2017-00888 Patent Owner Response, (Jan. 2018), 83 pages.
IPR2017-00888 Ex. 2001 (1st Russ Declaration), (Jan. 2018), 55 pages.
IPR2017-00888 Ex. 2007 (2nd Russ Declaration), (Jan. 2018), 53 pages.
IPR2017-00888—Petitioner's Reply, (Apr. 16, 2018), 33 pages.
IPR2017-00888 Ex. 1009 Mar. 14, 2018 Deposition of Dr. Russ, 128 pages.
IPR2017-00888 Ex. 1010 Reply Declaration of Dr. Shamos, (Apr. 16, 2018), 43 pages.
IPR2017-00888 Ex. 1011 Excerpt from Websters, (1999), 4 pages.
IPR2017-00888 Ex. 1012 Excerpt from the Oxford English Dictionary (2d Ed.), (1989), 3 pages.
IPR2017-01191 Patent Owner Preliminary Response, (Jul. 2017), 42 pages.
IPR2017-01191 Institution Decision, (Oct. 2017), 25 pages.
IPR2017-01191 Patent Owner Response, (Jan. 2018), 68 pages.
IPR2017-01191 Ex. 2109 (1st Russ Declaration), (Jan. 2018), 27 pages.
IPR2017-01191 Ex. 2112 (2nd Russ Declaration), (Jan. 2018), 52 pages.
IPR2017-01191 Petitioner's Reply, (Apr. 16, 2018), 32 pages.
IPR2017-01191 Ex. 1020 Mar. 14, 2018 Deposition of Dr. Russ, 93 pages.
IPR2017-01191 Ex. 1021 Reply Declaration of Dr. Shamos, (Apr. 16, 2018), 34 pages.
IPR2017-01450 Patent Owner Preliminary Response, (Aug. 28, 2017), 37 pages.
IPR2017-01450 Ex. 2001 Claim Construction Order, (Jul. 6, 2017), 52 pages.
IPR2017-01450 Institution Decision, (Oct. 27, 2017), 35 pages.
IPR2017-01450 Patent Owner Response, (Mar. 5, 2018), 39 pages.
IPR2017-01450 Ex. 2002 Declaration of Dr. Russ, (Mar. 5, 2018), 40 pages.
IPR2017-01450 Ex. 2003 Shamos Deposition Transcript, (Feb. 14, 2018), 65 pages.
IPR2017-01450 Ex. 2004 Shamos Deposition Transcript, (Feb. 13, 2018), 141 pages.
IPR2017-01450 Ex. 2005 Illustrated Dictionary of Electronics, (1999), 6 pages.
IPR2017-01450 Ex. 2006 The Educational Technology Telecommunications Dictionary, (1991), 3 pages.
IPR2017-01450 Ex. 2007 Comprehensive Dictionary of Electrical Engineering, (1999), 5 pages.
IPR2017-01450 Ex. 2008 Dictionary of Information Technology (Third Edition), (1989), 4 pages.
IPR2017-01450 Ex. 2009 Desktop Dictionary of Information Systems Technology, (1989), 6 pages.
IPR2017-01450 Ex. 2010.
Portions of the File history of U.S. Appl. No. 11/002,181, (Dec. 2006-Aug. 2007), 61 pages.
Portions of the File history of U.S. Appl. No. 11/963,164, (Dec. 2010-Dec. 2011), 48 pages.
Portions of the File history of U.S. Appl. No. 13/453,826, (Sep. 2013), 11 pages.
Portions of the File history of U.S. Appl. No. 14/153,872, (Sep. 2015-Dec. 2015), 20 pages.
Portions of the File history of U.S. Appl. No. 14/635,483, (U.S. Pat. No. 9,351,045), (Aug. 2015-Jan. 2016), 20 pages.
Portions of the File history of U.S. Appl. No. 15/162,259, (Sep. 2016), 6 pages.
Portions of the File history of U.S. Appl. No. 11/002,205, (U.S. Pat. No. 7,617,295), (May 2008-Jun. 2009), 56 pages.
Portions of the File history of U.S. Appl. No. 12/605,580, (U.S. Pat. No. 7,962,572), (Aug. 2010-Feb. 2011), 17 pages.
Portions of the File history of U.S. Appl. No. 13/157,386, (Oct. 2013), 15 pages.
Portions of the File history of U.S. Appl. No. 14/163,554, (U.S. Pat. No. 9,414,121), (Jul. 2014-Jun. 2016), 72 pages.
Portions of the File history of U.S. Appl. No. 15/231,152, (Jul. 2017), 35 pages.
Portions of the File history of U.S. Appl. No. 15/670,613, (Aug. 2017-Dec. 2017), 42 pages.
Final Office Action issued in U.S. Appl. No. 15/231,152 dated May 15, 2018, 19 pages.
IPR2017-01450 Petitioner's Reply, (May 18, 2018), 29 pages.
IPR2017-01450 Ex. 1016 Reply Declaration of Dr. Shamos, (May 18, 2018), 26 pages.
IPR2017-01450 Ex. 1017 U.S. Pat. No. 7,783,722, (Aug. 24, 2010), 49 pages.
IPR2017-01450 Ex. 1018 U.S. Pat. No. 7,275,256, (Sep. 25, 2007), 34 pages.
IPR2017-01450 Ex. 1019 Deposition Transcript of Dr. Russ, (Apr. 20, 2018), 89 pages.
IPR2017-01450 Ex. 1020 Definition of “Analog Data”, https://www.techopedia.com/definition/24871/analog-data, Exhibit 3 to the Apr. 20, 2018 Deposition of Dr. Russ, 4 pages.
IPR2017-01450 Ex. 1021 Definition of “Analog Data”, https://study.com/academy/lesson/analog-data-vs-digital-data.html, Exhibit 4 to the Apr. 20, 2018 Deposition of Dr. Russ, 3 pages.
IPR2017-01450 Ex. 1022 U.S. Patent Publication No. 2008/0101415, (May 1, 2008), 17 pages.
IPR2017-01450 Ex. 1023 U.S. Pat. No. 7,499,822, (Mar. 3, 2009), 38 pages.
IPR2017-01450 Ex. 1024 DirecTV vs. Cable, Wayback Archive of http://www.directv.com:80/DTVAPP/get_directv/directv_vs_cable.dsp, (Mar. 4, 2005), 2 pages.
IPR2017-01450 Ex. 1025 Patent Declaration Combined with Power of Attorney of U.S. Appl. No. 11/427,745, (2006), 4 pages.
IPR2017-01450 Ex. 1026 Definition of “phonograph”, The American Heritage Desk Dictionary (2003 4th ed), 3 pages.
IPR2017-01450 Ex. 1027 Definition of “phonograph”, Merriam-Webster's Collegiate Dictionary (2003 4th ed), 3 pages.
IPR2017-01450 Ex. 1028 “Stations Turn Off Analog Signals as Digital TV Deadline Arrives,” New York Times, (Jun. 12, 2009), 16 pages.
IPR2017-01450 Ex. 1029 FCC Eleventh Annual Report, (Feb. 4, 2005), 151 pages.
Adolphe V. Bernotas, “Computers and TV: Marriage of the Future; Five Star Lift Edition”, St. Louis Post-Dispatch, Oct. 11, 1995, 1 page.
John Sweeney, “An Introduction to Interactive Television”, International Broadcasting Convention, 1994, pp. 503-508.
Pekowsky, S. and R. Jaeger, “The set-top box as ‘multi-media terminal’,” IEEE Transactions on Consumer Electronics, 1998, pp. 1-8.
AudioRequest, MP3 Home Stereo Jukebox, ReQuest, Inc.—Company Info., and NSI WHOIS Search Results. Pages from the web site for www.request.com owned by ReQuest, Inc., Jun. 22, 2004, 6 pages.
Clark, D. (2000). “Click Radio to put a DJ in your PC.” WSJ Interactive Edition.
ClickRadio granted first interactive radio license by universal music group. 3 pages. From the web site at www.clickradio.com, printed Apr. 26, 2000.
Gordon, C. (2000). “Click radio sidesteps competition with music licensing deals.” Atnewyork.com.
Press Release. (Dec. 13, 2000). “Phillips showcases click radio on digital set-top at western show 2000.” Phillips.
SonicNet: The Online Music Network, http://web.archive.org/web/19991013143923/http://sonicnet.com/, Oct. 13, 1999, 6 pages.
Trowsdale, J., “The ntl guide to digital radio for dummies,” http://www.ntl.com/locales/gb/en/guides/dummies/default.asp, Aug. 13, 2002, 1 page.
Bower (1998). “Digital Radio—A Revolution for In-Car Entertainment” Proc. NavPos Automotive '98 Conf. 2(5-8): 40-51.
Deutsche Telekom AG, “Digital Radio,” http://www.telekom.de/dtag/ipll/cda/leve13_a/0,3680,10077,00.html, Aug. 18, 2000, 1 page.
“The Eureka 147 Consortium,” http://eurekadab.org/eureka_147_consortium.htm, Aug. 14, 2000, 3 pages.
Radio Authority (1999). Digital Radio Fact Sheet No. 4, http://www.radioauthority.org.uk/Information/Fact_Sheets/fs4.htm.
ICTV (2000). Digital Broadband System Press Release: 1-11.
Loeb, S., “Architecting Personalized Delivery of Multimedia Information”, Communications of the ACM, Dec. 1992, vol. 35, No. 12, pp. 39-48.
“Blue Note Radio,” Now Playing on a Computer Screen Near You. EMI's Blue Note Records Expands New Media Initiative with RadioWave.com, Press Release Newswire Association, Inc., Apr. 4, 2000, 2 pages.
“Global Media Announces Launch of Independent Internet Radio station,” News Release, Feb. 1, 1999, 2 pages.
Olenick, Doug, “Internet Radio Listeners Unchained From Their PCs,” Oct. 25, 1999. Twice Computer Technology, 1 page.
“Platinum Entertainment and Liquid Audio Join Forces to Offer Extensive Music Catalog via Digital Downloads”, Press Release, Jul. 15, 1998, 2 pages.
“Set-top box for television that reads your mind,” Financial Times Limited, Dec. 30, 1998, 1 page.
“Sonicbox and Microsoft Bring Windows Media Internet Radio to the Home Stereo,” Dec. 7, 1999 Microsoft Press Release, 3 pages.
Partyka, Jeff , “Sonicbox brings Net radio into your living room,” Oct. 12, 1999. CNN.com, 3 pages.
“Tune into Yahoo! Radio,” Yahoo Media Relations Press Release, Yahoo! teams up with Broadcast.com and Spinner.com to Provide 10 stations of Audio Programming, May 11, 1999, 2 pages.
“WebRadio.com Signs on as Liquid Music Network Affiliate Offering Liquid Audio Digital Downloads,” Business Wire, Inc., Sep. 1, 1999, 2 pages.
http://launch.yahoo.com, “Music on Yahoo”, 2 pages, Jun. 25, 2004.
King, “Tune on, Tune in, Drop Cash” Dec. 8, 2000, Wired News, 4 pages.
LaFrance, “Thinking Globally with a web-based radio station vying for listeners around the world, homegrown internet company fastband aims to shake up the music world”, Times Picayune, Nov. 4, 1999, 2 pages.
Rajapakshe, H. et al., “Video on Demand,” Jun. 1995, pp. 1-15.
Time Warner Cable, Pegasus, “The ISA Tutorial,” Version 1.0, Sep. 13, 2003, 73 pages.
UniView Technologies Now in Yahoo!'s Multicast Affiliate Program, Press Release Newswire Association, Inc., Oct. 19, 1999, 2 pages.
Welz, Gary, Integrated Streaming Technologies, Oct. 30, 1996, www.webdeveloper.com/multimedi/multimedi_web/96/mw961030.html.
Yahoo Offers one-stop shop for e-music, Milwaukee Journal Sentinel (Wisconsin), Aug. 25, 1999, 1 page.
Petition for Inter Partes Review of U.S. Pat. No. 7,320,025, IPR Case No. IPR2017-00888, Mar. 17, 2013, 53 pages.
Declaration of Michael Shamos, Petition for Inter Partes Review of U.S. Pat. No. 7,320,025, IPR Case No. IPR2017-00888, dated Mar. 7, 2017, 56 pages.
Hallier, J. et al., “Multimedia Broadcasting to mobile, portable and fixed Receivers using the Eureka 147 Digital Audio Broadcasting System,” 5th IEEE International Symposium on Personal, Indoor and Mobile Radio Communications, Wireless Networks—Catching the Mobile Future, Sep. 18-23, 1994, 11 pages.
Petition for Inter Partes Review of U.S. Pat. No. 9,351,045, IPR Case No. IPR2017-01191, dated Mar. 30, 2017, 62 pages.
Declaration of Michael Shamos, Petition for Inter Partes Review of U.S. Pat. No. 9,351,045, IPR Case No. IPR2017-01191, dated Mar. 30, 2017, 135 pages.
Gonze, L., “A survey of playlist formats,” Nov. 17, 2003, 12 pages.
Petition for Inter Partes Review of U.S. Pat. No. 9,414,121, IPR Case No. IPR2017-01450, dated May 18, 2017, 79 pages.
Declaration of Michael Shamos, Petition for Inter Partes Review of U.S. Pat. No. 9,414,121, IPR Case No. IPR2017-01450, dated May 18, 2017, 127 pages.
U.S. Appl. No. 60/377,963 (McElhatten-189 provisional application), filed May 3, 2002, 85 pages.
Music Choice's Local Patent Rule 3-1 Cover Pleading Submission in Music Choice v. Stingray Digital Group Inc., Case No. 2:16-CV-0586-JRG-RSP (E.D. Tex.), dated Sep. 12, 2016, 5 pages.
Comaromi, J., et al. (Eds.), “DDC 20: Dewey Decimal Classification,” 20th Ed., 1989, 27 pages.
“Launch Media and iBeam Team Up to Take on Heavyweight Napster on College Campus Circuit,” digitalcoastdaily.com, Jun. 19, 2000, 10 pages.
Cosmas, J., et al., “CustomTV with MPEG-4 and MPEG-7,” Institution of Electrical Engineers (1999), 7 pages.
Bryhni et al., “On-demand Regional Television Over the Internet,” Nov. 1996, ACM Multimedia, Proceedings of the 4th ACM International Conference on Multimedia, 9 pages.
Bove et al., “Hyperlinked Television Research at the MIT Media Laboratory,” May 2000, IBM Systems Journal, vol. 39, Nos. 3 & 4, 9 pages.
Cosmas et al., “CustomTV with MPEG-4 and MPEG-7,” Dec. 6, 1999, IEE Electronics Communications: Interactive Television, Colloquium, 7 pages.
“Music Choice Europe, A Leader in Digital Music Services,” Sep. 6, 2000, Investec Henderson Crosthwaite Securities, 47 pages.
“Music Choice Prospectus 2000,” Sep. 2000, Investec Henderson Crosthwaite Securities, 95 pages.
“NDS to Showcase Interactive Applications that Transform Living Rooms into Digital Interactive Theaters at NAB 2000,” Apr. 9, 2000, NDS Group plc. Business Wire, 3 pages.
Music Choice Europe, “Music Choice Chooses NDS as its Digital TV Technology Partner,” Jul. 2000, 2 pages.
“NDS Delivers Sophisticated Interactive Application to Music Choice Europe,” May 10, 2001, NDS Group plc. Business Wire, 3 pages.
“NDS Group plc Reports Full Year Revenues Up 35% and Operating Income Growth of 69% Plus Major Contract Wins in Fourth Quarter,” Aug. 6, 2001, NDS Group plc, 15 pages.
NDS Website (http://web.archive.org/web/20000824140133/http://www.nds.com/products/broad_products/nds_broadcast/prod_value@tv.htm), Aug. 24, 2000, 7 pages.
Doherty et al., “Detail-on-Demand Hypervideo,” Nov. 2-8, 2003, FX Palo Alto Laboratory, 2 pages.
Krikke, “Streaming Video Transforms the Media Industry,” Jul.-Aug. 2004, IEEE Computer Society, 7 pages.
Atzori et al., “Multimedia Information Broadcasting Using Digital TV Channels,” Sep. 1997, IEEE Transactions on Broadcasting, vol. 43, No. 3, 10 pages.
Brunheroto et al., “Issues in Data Embedding and Synchronization for Digital Television”, Jul. 30-Aug. 2, 2000, IEEE Publication, 6 pages.
Coden et al., “Speech Transcript Analysis for Automatic Search,” Jan. 3-6, 2001, IEEE Proceedings of the 34th Hawaii International Conference on System Sciences, 11 pages.
Dakss, Jonathan, “HyperActive: An Automated Tool for Creating Hyperlinked Video,” Sep. 1999, Published thesis by the Massachusetts Institute of Technology, 100 pages.
Jacobs, Bruce, “Transport B for Broadcasters: Boon or Bane?,” Feb. 8-10, 2001, Twin Cities Public Television, Inc., 9 pages.
“Information Technology—Generic Coding of Moving Pictures and Associated Audio Information: Video,” Feb. 2000, International Telecommunication Union (ITU-T), 220 pages.
Chang et al., “Overview of the MPEG-7 Standard,” Jun. 2001, IEEE Transactions on Circuits and Systems for Video Technology, vol. 11, No. 6, 8 pages.
Yao et al., “The Development of a Video Metadata Authoring and Browsing System in XML,” Dec. 2000, Australian Computer Society, Inc. Visualisation 2000, Pan-Sydney Workshop on Visual Information Processing, 8 pages.
Bainbridge et al., “Towards a Digital Library of Popular Music,” Aug. 1, 1999, ACM, 9 pages.
Hacker, Scot, “MP3: The Definitive Guide,” Mar. 2000, O'Reilly Publishing, 378 pages.
Jacso et al., “Music to Your Ears (and Eyes),” Jun.-Jul. 1996, Database; ABI/Inform Global, 10 pages.
Jermey, Jonathan, “Locating Files on Computer Disks,” Apr. 2001, The Indexer, vol. 22, No. 3, 3 pages.
Lippman et al., “Media Banks: Entertainment and the Internet,” Apr. 4, 1996, IBM Systems Journal, vol. 35, Nos. 3&4, 20 pages.
Loudeye Website, 1999-2000, Loudeye Technologies. Archive.org, 2 pages.
Marrin et al., “Steerable Media: Interactive Television via Video Synthesis,” Feb. 19-22, 2001, ACM, 10 pages.
Packham et al., “Transport of Context-Based Information in Digital Audio Data,” Sep. 22-25, 2000, AES 109th Convention, 14 pages.
Papadakis et al., “Technical Note: Design and Architecture of a Digital Music Library on the Web,” Jan. 2001, The New Review of Hypermedia and Multimedia, 12 pages.
Vilain et al., “Use Cases and Scenarios in the Conceptual Design of Web Applications,” Feb. 2000, PUC-Rio Inf. MCC 12/00, 12 pages.
Zerod, Richard, “The Evolution: From Car Audio to Digital Mobile Multimedia,” Feb. 24-27, 1997, SAE Technical Paper Series—1997 International Congress & Exposition, 9 pages.
Letter Agreement addressed to Music Choice Europe Limited, dated Sep. 26, 2000, 8 pages.
Defendants Stingray Digital Group Inc.'s and Stingray Music USA, Inc.'s (“Stingray”) Invalidity Contentions Pursuant to Patent L.R. 3-3, Nov. 28, 2016, 25 pages.
Appendix A to Stingray's Invalidity Contentions, dated Nov. 28, 2016, 245 pages.
Appendix C to Stingray's Invalidity Contentions, dated Nov. 28, 2016, 770 pages.
Appendix E to Stingray's Invalidity Contentions, dated Nov. 28, 2016, 968 pages.
Portions of the File history of U.S. Appl. No. 14/167,509, (Jul. 2014-Jan. 2015), 28 pages.
Portions of the File history of U.S. Appl. No. 14/947,017, (Apr. 2016-Oct. 2016), 32 pages.
Final Written Decision in IPR2017-00888 dated Sep. 20, 2018, 35 pages.
Final Written Decision in IPR2017-01191 dated Oct. 11, 2018, 52 pages.
Final Written Decision in IPR2017-01450 dated Oct. 24, 2018, 47 pages.
Final Office Action in U.S. Appl. No. 15/670,613, dated Sep. 7, 2018, 18 pages.
Non-Final Office Action issued in U.S. Appl. No. 15/485,417 dated Sep. 16, 2019, 11 pages.
Final Office Action issued in U.S. Appl. No. 15/485,417 dated Apr. 2, 2020, 16 pages.
Provisional Applications (1)
Number Date Country
62068374 Oct 2014 US
Continuations (3)
Number Date Country
Parent 15677236 Aug 2017 US
Child 16281804 US
Parent 15616564 Jun 2017 US
Child 15677236 US
Parent 14922597 Oct 2015 US
Child 15616564 US