Mobile media devices may be configured in a variety of different ways to output content from a variety of different sources. For example, a mobile media device may be configured as a handheld music player that may receive content wirelessly via a plurality of networks, such as via a broadcast network (e.g., a one-way communication channel such as FM radio) and a two-way network (e.g., that provides access to the Internet). However, it may be difficult for a user to determine what was recently played by the mobile media device from these different sources.
For example, a user may listen to a variety of different channels of a broadcast network, e.g., different FM radio stations. While audio content (e.g., a song) is being played, a display device of the mobile media device may output information that describes what is currently being played, e.g., a song title and artist. Once the song has finished playing, however, the user is not able to access the information using a conventional mobile media device. Therefore, the user may be forced to rely solely on the user's own recollection to remember songs of interest.
A mobile media device user interface is described. In one or more implementations, output of a plurality of audio content is monitored by a mobile media device. Each of the audio content was received via a respective one of a plurality of broadcast channels by the mobile media device. A user interface is displayed on a display device of the mobile media device, the user interface describing each of the plurality of audio content and the respective broadcast channel from which the audio content was received.
In one or more implementations, one or more computer-readable storage media comprise instructions that are executable by a mobile media device to output a user interface that describes a plurality of content received via a broadcast and output by the mobile media device. The user interface has a plurality of options to purchase the plurality of content for download.
In one or more implementations, a mobile media device includes a display device and one or more modules. The one or more modules are configured to output a plurality of audio content that have been received wirelessly via one or more broadcast channels. A user interface is displayed on the display device that includes one or more of the audio content that is an advertisement.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
Overview
A mobile media device (e.g., a portable music player) may output audio content that was obtained from a variety of different sources, such as via a broadcast (e.g., AM, FM, or satellite radio), via a two-way network (e.g., a stream provided by a website that is accessible via the Internet), and so on. Traditional techniques that were used to describe this content, however, were limited to what was currently being output by the mobile media device. For example, a traditional radio (e.g., in compliance with FM and/or satellite standards) may display a title and artist of a song that is currently being output. Once the output of the song has completed (e.g., the song has been played), however, conventional display techniques did not permit a user to determine what songs (or other audio content) were recently played.
A user interface for a mobile media device is described. In an implementation, the user interface is configured to describe audio content that was output by the mobile media device. For example, a user may interact with the mobile media device to navigate between broadcast channels (e.g., frequencies in FM radio) and output audio content (e.g., songs) from the channels. The mobile media device in this implementation is configured to store metadata that is streamed with and describes the audio content, even if the mobile media device is sequentially tuned to a variety of different broadcast channels. The metadata may then be output in a user interface to describe the audio content that was output by the mobile media device.
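The channel-spanning retention behavior described above can be sketched as follows. This is a minimal illustration only; the class and method names are assumptions for the sketch and are not part of the described device.

```python
class RadioHistory:
    """Keeps a play history that survives retuning between broadcast channels."""

    def __init__(self):
        self.current_channel = None
        self.history = []  # (channel, metadata) pairs in playback order

    def tune(self, channel):
        # User navigates to a different broadcast channel; history is kept.
        self.current_channel = channel

    def on_metadata(self, metadata):
        # Called whenever streamed metadata arrives for the item being played.
        self.history.append((self.current_channel, metadata))

    def recently_played(self):
        # Everything output so far, regardless of how many channels were visited.
        return list(self.history)
```

Under this sketch, tuning from one station to another and back still leaves every played item available for display in the user interface.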
The user interface (and the metadata contained therein) may be leveraged in a variety of ways. For example, the user interface may be configured to provide a link to a website to purchase audio content that was recently played by the mobile media device. In another example, the user interface may be configured to include descriptions of audio content configured as advertisements that were broadcast with other audio content, e.g., songs. The description of the advertisements may also be configured to navigate to a corresponding website. Thus, in these examples information that describes broadcast content received by the mobile media device may be leveraged to obtain additional information from a two-way network, e.g., via the Internet. A variety of other examples are also contemplated, further discussion of which may be found in relation to the following sections.
In the following discussion, a mobile media device is described that may receive audio content wirelessly from a variety of different sources. However, it should be readily apparent that the following discussion is not to be limited to a mobile media device, audio content, or wireless communication. Accordingly, a variety of different devices may employ the techniques described herein without departing from the spirit and scope thereof, such as desktop PCs, netbooks, wireless phones, personal digital assistants, and so on.
Example Environment
Likewise, the broadcast content provider 102 may be configured in a variety of ways. For example, the broadcast content provider 102 may be configured as an FM radio station that broadcasts the audio content 108 using an FM signal as described above. The broadcast content provider 102 may also be configured as a satellite radio provider that broadcasts the audio content 108 via the wireless connection 106 using a satellite for direct receipt by the mobile media device 104. A variety of other examples are also contemplated, such as configuration for broadcast of audio content 108 in compliance with HD radio techniques, AM radio techniques, and so on.
The communication module 110 is further illustrated as including a user interface module 112. The user interface module 112 is representative of functionality of the mobile media device 104 to generate and maintain a user interface 114 for display on a display device 116 of the mobile media device 104. The user interface 114 may be configured in a variety of ways, such as to include a list of recently played content as illustrated in
For example, the broadcast content provider 102 may broadcast the audio content 108 via the wireless connection to the mobile media device 104, as well as other mobile media devices. The audio content 108 is illustrated as including metadata 118 that describes and is streamed with the audio content 108. For example, the metadata 118 may include a title, artist, album name, genre, links to a network site (e.g., an ecommerce website to purchase like content, a fan website, etc.), a broadcast time, and so on.
The user interface module 112 of the communication module 110 may then parse the received broadcast to extract the metadata 118 and store it, an example of which is illustrated as metadata 120 stored in storage 122 (e.g., persistent memory) of the mobile media device 104. The user interface module 112 may then leverage the metadata 120 to generate the user interface 114.
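The parse-and-store step just described can be sketched as follows. The frame layout and field names here are illustrative assumptions; actual broadcast metadata formats (e.g., for FM or HD radio) differ.

```python
from dataclasses import dataclass, field

@dataclass
class MetadataEntry:
    # Illustrative fields; real streamed metadata may carry more (genre, links, time).
    title: str
    artist: str
    channel: str

@dataclass
class MetadataStore:
    # Stands in for the persistent storage 122 of the device.
    entries: list = field(default_factory=list)

    def save(self, entry: MetadataEntry) -> None:
        self.entries.append(entry)

def parse_broadcast_frame(frame: dict, channel: str, store: MetadataStore) -> MetadataEntry:
    """Extract the metadata portion of a received frame and persist it."""
    entry = MetadataEntry(
        title=frame.get("title", "Unknown"),
        artist=frame.get("artist", "Unknown"),
        channel=channel,
    )
    store.save(entry)
    return entry
```

The stored entries can then be read back later to generate the user interface, independent of what is currently playing.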
As illustrated in the environment 100 of
The audio content 108 includes metadata 118 that is used by the mobile media device 104 to generate the user interface 114 that includes descriptions of the audio content 108. In the illustrated instance the audio content 108 is described in an order that follows an order in which the audio content 108 was output. A variety of other examples are also contemplated, such as displayed in groups arranged by broadcaster, and so on. In an implementation, at least a portion of the descriptions of the audio content 108 in the user interface 114 are selectable (e.g., via a cursor control device 206 to select a title shown in bold) to initiate an operation to purchase the audio content 108.
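The ordered, selectable listing described above can be sketched as follows; the row fields and the purchase-link key are assumptions made for illustration, not the actual user interface format.

```python
def build_recently_played_rows(history):
    """Format play history into user-interface rows in the order the
    content was output. A row is selectable to initiate a purchase only
    when its metadata carried a purchase link."""
    rows = []
    for channel, meta in history:
        rows.append({
            "label": f"{meta['title']} - {meta['artist']} ({channel})",
            "selectable": "purchase_url" in meta,
            "purchase_url": meta.get("purchase_url"),
        })
    return rows
```

A grouped-by-broadcaster view, as the text also contemplates, would simply sort or bucket the same history by its channel field before formatting.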
As shown in the second instance 204, the user interface 114 outputs an option that is selectable to cause the audio content 108 to be purchased. For example, the communication module 110 of the mobile media device 104 may include functionality to communicate via a two-way network, which is illustrated as network 208 in
Once the option is selected (e.g., by selecting “yes” in the user interface 114 using the cursor control device 206), the user interface 114 may be configured to provide navigation to a content provider 210 via the network 208 to purchase the audio content 108 that was described in the user interface 114. For example, the communication module 110 may include browser functionality to navigate to a website maintained by the content provider 210 to purchase content.
Thus, in the illustrated system 200 the mobile media device 104 outputs audio content 108 that was received via a one-way network (e.g., an FM signal, satellite signal, HD radio signal, and so on). Descriptions of the audio content 108 are then leveraged to provide an option to purchase the audio content 108 via a two-way network, e.g., when the mobile media device 104 includes telephone functionality. The audio content 108 may be purchased in a variety of ways, further discussion of which may be found in relation to the following figure.
As illustrated, the user interface 114 may be configured to accept credentials from the user, examples of which include a user name and password. The credentials may then be communicated via the network 208 to the authentication service 302 for comparison with credentials of an account 304. If authenticated, a token may be passed back to the mobile media device 104 that is usable to access a plurality of websites without reentering the credentials. Thus, the token may be used by the mobile media device 104 to access the content provider 210 (and more particularly a website maintained by the content provider 210) to purchase the audio content 108. In an implementation, the purchase may also be made in conjunction with the authentication service 302, e.g., by obtaining account information, charging the account 304, and so on.
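The credential-to-token flow described above can be sketched as follows. This is a toy illustration; a real authentication service would use a secure network protocol and signed, expiring tokens rather than the in-memory stand-ins shown here.

```python
import hashlib

class AuthenticationService:
    """Stand-in for the authentication service 302 that verifies
    credentials against an account and issues a reusable token."""

    def __init__(self, accounts):
        self.accounts = accounts          # username -> password (illustration only)
        self.valid_tokens = set()

    def authenticate(self, username, password):
        if self.accounts.get(username) == password:
            token = hashlib.sha256(f"{username}:{password}".encode()).hexdigest()
            self.valid_tokens.add(token)
            return token
        return None                       # comparison with the account failed

class ContentProvider:
    """Stand-in for the content provider 210; the token lets the device
    purchase without re-entering credentials."""

    def __init__(self, auth_service):
        self.auth = auth_service

    def purchase(self, token, item):
        if token in self.auth.valid_tokens:
            return f"purchased {item}"
        raise PermissionError("invalid token")
```

Once issued, the same token could be presented to a plurality of sites that trust the authentication service, which is the convenience the text describes.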
Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms “module,” “functionality,” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer readable memory devices. The features of the user interface techniques described below are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
Example Procedures
The following discussion describes user interface techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference will be made to the environment 100 of
Data is collected that describes the plurality of audio content (block 404). Continuing with the previous example, a user may interact with the mobile media device 104 to navigate between a plurality of broadcast channels, e.g., radio frequencies, satellite channels, and so on. Accordingly, metadata 118 that is broadcast along with the audio content 108 may be collected during this navigation. The data (e.g., metadata 118) is then stored by the mobile media device (block 406), e.g., to memory or other computer-readable medium.
The user interface is displayed on a display device of the mobile media device, the user interface describing each of the plurality of audio content and the respective broadcast channel from which the audio content was received (block 408). As shown in
A selection is received of a portion of the user interface that describes a particular one of the audio content (block 410). The user, for instance, may interact with the cursor control device 206 of the mobile media device 104 to select a title or other portion of the information that describes the audio content 108, e.g., an icon including album art. Responsive to the selection, an operation is initiated to purchase the particular audio content via a network connection (block 412). As previously described in relation to
A user interface is displayed on a display device of a mobile media device, the user interface including one or more of the audio content that is an advertisement (block 506). For example, the audio content 108 broadcast to the mobile media device 104 may also include advertisements for goods or services. A description of the advertisements may also be included in the user interface 114, e.g., that identifies the good or service advertised (e.g., a brand name).
A selection is received via the user interface to navigate to a represented advertisement (block 508) and navigation is performed to the advertisement via a two-way network (block 510). The metadata 118 that was streamed with the audio content 108 that is an advertisement, for instance, may include a network address (URL) to a site (e.g., website) that includes additional information that relates to the advertisement. A variety of different information may be included at the site, such as the advertisement itself, a way to purchase the goods or services represented in the advertisement (e.g., via an ecommerce website), additional product literature, and so on. Therefore, two-way functionality of the mobile media device 104 included in the communication module 110 may be used to access the Internet in this example to further leverage advertisements received via a broadcast. A variety of other examples are also contemplated.
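The advertisement-handling steps above can be sketched as follows; the metadata field names (`kind`, `brand`, `url`) are assumptions for illustration, not the actual streamed format.

```python
def advertisement_entries(history):
    """Pick out advertisement items from the play history and pair each
    with the network address streamed in its metadata, ready to hand to
    the device's browser functionality for two-way navigation."""
    entries = []
    for channel, meta in history:
        if meta.get("kind") == "advertisement" and "url" in meta:
            entries.append({"brand": meta.get("brand", "Unknown"), "url": meta["url"]})
    return entries
```

Selecting one of these entries would then navigate, via the two-way network, to the site that carries the additional product information.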
Even though the mobile media device is in the low-power mode, however, metadata 118 broadcast with the audio content 108 may be cached in storage 122 of the device as metadata 120 (block 606). Once an indication is received to exit the low-power mode (block 608), a user interface 114 is displayed on the display device 116 that includes at least a portion of the cached metadata (block 610). For example, an input may be received via a button of the mobile media device 104 to “wake up” the device, e.g., pressing the power button or other buttons of the device. Therefore, in this way the metadata 118 may be cached and used when appropriate yet still conserve resources of the mobile media device 104.
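The cache-while-dark behavior of blocks 606 through 610 can be sketched as follows; the class and method names are illustrative assumptions rather than part of the described device.

```python
class LowPowerCache:
    """Caches streamed metadata while the device is in a low-power mode
    (display off), then hands the cache to the user interface on wake."""

    def __init__(self):
        self.low_power = True
        self._cache = []

    def on_metadata(self, meta):
        # Metadata is cached even though nothing is drawn (block 606).
        self._cache.append(meta)

    def wake(self):
        # An indication to exit low-power mode is received (block 608);
        # return the cached entries for display (block 610).
        self.low_power = False
        return list(self._cache)
```

In this way display resources are conserved during playback, yet nothing received while the screen was off is lost to the recently played listing.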
Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.
This application claims priority under 35 U.S.C. Section 120 as a continuation of U.S. patent application Ser. No. 14/302,315, filed Jun. 11, 2014, and entitled “Mobile Media Device User Interface” which is a continuation of U.S. patent application Ser. No. 12/491,045, filed Jun. 24, 2009, and entitled “Mobile Media Device User Interface,” the entire disclosures of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
5918213 | Bernard et al. | Jun 1999 | A |
6317784 | Mackintosh et al. | Nov 2001 | B1 |
7099348 | Warwick | Aug 2006 | B1 |
7647419 | Deshpande | Jan 2010 | B2 |
7653342 | Nichols et al. | Jan 2010 | B2 |
7676203 | Chumbley et al. | Mar 2010 | B2 |
7801500 | Kraft et al. | Sep 2010 | B2 |
7835689 | Goldberg et al. | Nov 2010 | B2 |
7840178 | Hellman | Nov 2010 | B2 |
8244171 | Ingrassia et al. | Aug 2012 | B2 |
8249497 | Ingrassia et al. | Aug 2012 | B2 |
8291320 | Robin et al. | Oct 2012 | B2 |
8315950 | Conley et al. | Nov 2012 | B2 |
8423545 | Cort et al. | Apr 2013 | B2 |
8423582 | Griggs | Apr 2013 | B2 |
8443007 | Kindig et al. | May 2013 | B1 |
8515337 | Ingrassia et al. | Aug 2013 | B2 |
8527876 | Wood et al. | Sep 2013 | B2 |
8571466 | Ingrassia et al. | Oct 2013 | B2 |
8756507 | Fong et al. | Jun 2014 | B2 |
9888279 | Ishtiaq | Feb 2018 | B2 |
20050022237 | Nomura | Jan 2005 | A1 |
20050197906 | Kindig et al. | Sep 2005 | A1 |
20050240661 | Heller et al. | Oct 2005 | A1 |
20060003753 | Baxter, Jr. | Jan 2006 | A1 |
20060085751 | O'Brien et al. | Apr 2006 | A1 |
20060156343 | Jordan | Jul 2006 | A1 |
20060206582 | Finn | Sep 2006 | A1 |
20060212442 | Conrad et al. | Sep 2006 | A1 |
20070010195 | Brown et al. | Jan 2007 | A1 |
20070016654 | Bowles et al. | Jan 2007 | A1 |
20070021142 | Byeon et al. | Jan 2007 | A1 |
20070042762 | Guccione | Feb 2007 | A1 |
20070065794 | Mangum | Mar 2007 | A1 |
20070087686 | Holm et al. | Apr 2007 | A1 |
20070136446 | Rezvani et al. | Jun 2007 | A1 |
20070142090 | Rydenhag et al. | Jun 2007 | A1 |
20070204227 | Kretz | Aug 2007 | A1 |
20080086379 | Dion et al. | Apr 2008 | A1 |
20080134264 | Narendra et al. | Jun 2008 | A1 |
20080214236 | Harb | Sep 2008 | A1 |
20080218409 | Moinzadeh et al. | Sep 2008 | A1 |
20080222546 | Mudd et al. | Sep 2008 | A1 |
20080243923 | Mazor et al. | Oct 2008 | A1 |
20090023406 | Ellis et al. | Jan 2009 | A1 |
20090030537 | Hartle | Jan 2009 | A1 |
20090063292 | Cole et al. | Mar 2009 | A1 |
20090093259 | Edge et al. | Apr 2009 | A1 |
20100332988 | Fong et al. | Dec 2010 | A1 |
20160012465 | Sharp | Jan 2016 | A1 |
20160357509 | Alsina | Dec 2016 | A1 |
20170272824 | Bunner | Sep 2017 | A1 |
20170301001 | Wilkinson | Oct 2017 | A1 |
20180152976 | Baron | May 2018 | A1 |
20180183685 | Cook | Jun 2018 | A1 |
Number | Date | Country |
---|---|---|
1585512 | Feb 2005 | CN |
1747511 | Mar 2006 | CN |
09321645 | Dec 1997 | JP |
2000183835 | Jun 2000 | JP |
2000209681 | Jul 2000 | JP |
2001298430 | Oct 2001 | JP |
2001522121 | Nov 2001 | JP |
2001358679 | Dec 2001 | JP |
2002091906 | Mar 2002 | JP |
2004193726 | Jul 2004 | JP |
2008535004 | Aug 2008 | JP |
1020060073327 | Jun 2006 | KR |
1020070006689 | Jan 2007 | KR |
1020080040118 | May 2008 | KR |
2006067871 | Jun 2006 | WO |
2007101169 | Sep 2007 | WO |
2008131952 | Nov 2008 | WO |
2011005458 | Jan 2011 | WO |
Entry |
---|
“Office Action Issued in Indian Patent Application No. 10161/DELNP/2011”, dated Feb. 6, 2019, 5 Pages. |
“Client Self-Management”, Retrieved at: <<https://web.archive.org/web/20090311011255/http://www.iwayradio.com/streaming-control-panel.html>>, Apr. 28, 2009, 5 Pages. |
“Radio That Listens to Me: Y! Music Web Radio”, In Australian Journal of Emerging Technologies and Society, vol. 4, Issue 2, 2006, pp. 81-93. |
“Amendment filed in Korean Patent Application No. 10-2011-7030766”, Filed Date: Oct. 21, 2016, 15 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2010/39170”, dated Feb. 22, 2011, 8 Pages. |
“Office Action Issued in Korean Patent Application No. 10-2011-7030766”, dated Jan. 14, 2016, 11 Pages. |
“Office Action Issued in Korean Patent Application No. 10-2011-7030766”, dated Jul. 25, 2016, 4 Pages. |
“Amendment Filed in Korean Patent Application No. 10-2017-7004675”, Filed Date: Apr. 28, 2017,16 Pages. |
“Amendment/Argument Filed in Korean Patent Application No. 10-2017-7004675”, Filed Date: Oct. 26, 2017, 12 Pages. |
“Final Office Action Issued in Korean Patent Application No. 10-2017-7004675”, dated Nov. 23, 2017, 3 Pages. (W/o English Translation). |
“Final Rejection Issued in Korean Patent Application No. 10-2017-7004675”, dated Sep. 26, 2017, 4 Pages. |
“Office Action Issued in Korean Patent Application No. 10-2017-7004675”, dated Feb. 28, 2017, 8 Pages. |
“Amendment Filed in European Patent Application No. 10797531.0”, dated Aug. 25, 2016, 10 Pages. |
“Office Action Issued in European Patent Application No. 10797531.0”, dated May 14, 2014, 4 Pages. |
“Office Action Issued in European Patent Application No. 10797531.0”, dated Jul. 21, 2016, 4 Pages. |
“Response to Official Communication Filed in European Patent Application No. 10797531.0”, Filed Date: Jul. 11, 2014, 11 Pages. |
“Search Report Issued in European Patent Application No. 10797531.0”, dated Apr. 17, 2014, 3 Pages. |
“Final Office Action Issued in U.S. Appl. No. 12/491,045”, dated Dec. 19, 2013, 19 Pages. |
“Final Office Action Issued in U.S. Appl. No. 12/491,045”, dated Sep. 6, 2012, 21 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 12/491,045”, dated Jun. 21, 2013, 18 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 12/491,045”, dated Jan. 20, 2012, 21 Pages. |
“Notice of Allowance in U.S. Appl. No. 12/491,045”, dated Feb. 3, 2014, 6 Pages. |
“Final Office Action Issued in U.S. Appl. No. 14/302,315”, dated Aug. 9, 2017, 12 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 14/302,315”, dated Feb. 28, 2017, 11 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 14/302,315”, dated Oct. 19, 2017, 5 Pages. |
“Amendment Filed in Chinese Patent Application No. 201080028190.8”, Filed Date: Aug. 22, 2016, 10 Pages. |
“Notice of Allowance Issued in Chinese Patent Application No. 201080028190.8”, dated Oct. 10, 2016, 4 Pages. |
“Office Action and Search Report Issued in Chinese Patent Application No. 201080028190.8”, dated Aug. 29, 2014, 16 Pages. |
“Office Action Issued in Chinese Patent Application No. 201080028190.8”, dated Dec. 7, 2015, 10 Pages. |
“Second Office Action Issued in Chinese Patent Application No. 201080028190.8”, dated May 15, 2015, 12 Pages. |
“Third Office Action and Search Report Issued in Chinese Patent Application No. 201080028190.8”, dated Jun. 6, 2016, 10 Pages. |
“Amendment Filed in Chinese Patent Application No. 201080028190.8”, Filed Date: Jan. 13, 2015, 9 Pages. |
“Amendment Filed in Chinese Patent Application No. 201080028190.8”, Filed Date: Jul. 30, 2015, 3 Pages. |
“Decision to Refuse Filed in Japanese Patent Application No. 2012-571610”, Filed Date: Aug. 27, 2015, 6 Pages. |
“Request for Reexamination Issued in Chinese Patent Application No. 201080028190.8”, dated Mar. 21, 2016, 10 Pages. |
“Amendment Filed in Japanese Patent Application No. 2012-517610”, Filed Date: Nov. 25, 2016, 12 Pages. |
“Final Rejection Issued in Japanese Patent Application No. 2012-517610”, dated Jan. 21, 2014, 6 Pages. |
“Office Action Issued in Japanese Patent Application No. 2012-517610”, dated Aug. 30, 2016, 14 Pages. |
“Office Action Issued in Japanese Patent Application No. 2012-517610”, dated Jan. 31, 2017, 16 Pages. |
“Office Action Issued in Japanese Patent Application No. 2012-517610”, dated Dec. 2, 2014, 9 Pages. |
“Office Action Issued in Japanese Patent Application No. 2012-517610”, dated Sep. 1, 2015, 3 Pages. |
“Amendment filed in Japanese Patent Application No. 2012-571610”, Filed Date: Apr. 18, 2014, 11 Pages. |
“Amendment Filed in Japanese Patent Application No. 2012-571610”, Filed Date: Feb. 27, 2015, 13 Pages. |
“Amendment Filed in Japanese Patent Application No. 2012-571610”, Filed Date: Dec. 17, 2015, 17 Pages. |
“Notice of Allowance in Korean Patent Application No. 10-2011-7030766”, dated Nov. 18, 2016, 7 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 14/302,315”, dated Jun. 28, 2018, 4 Pages. |
Number | Date | Country | |
---|---|---|---|
20180157392 A1 | Jun 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14302315 | Jun 2014 | US |
Child | 15880607 | US | |
Parent | 12491045 | Jun 2009 | US |
Child | 14302315 | US |