VIDEO INTERACTION WITH A MOBILE DEVICE AND A VIDEO DEVICE

Abstract
Interactive material is delivered at least partly wirelessly to a mobile device in the vicinity of a video device on which video content is being displayed, the interactive material being related to the video content.
Description
BACKGROUND

This invention relates to video interaction.


One rudimentary level of television interaction, for example, is achieved with a simple remote controller, which enables local control of the channel, volume, picture quality, and other aspects of the television presentation.


A variety of schemes also have been proposed to enable a user to interact more actively with the television program content.


SUMMARY

In the invention, the user is provided a significantly enhanced interactive capability by establishing a second parallel channel of interactivity on a personal digital assistant or similar device.


In general, in one aspect, the invention features a method that includes delivering interactive material at least partly wirelessly to a mobile device in the vicinity of a video device on which video content is being displayed, the interactive material being related to the video content.


Implementations of the invention include one or more of the following features. The interactive material is presented to a user in time synchronization with the video content. The interactive material includes an element (e.g., a displayed icon) for receiving input from the user. A tag is generated representative of a user's interest in an item of interactive material. The tag may be generated by capturing a time and a channel of video content, or by sending a query from the mobile device to a server at the time when a user indicates an interest in an item of interactive material, or by a server in response to a query from the mobile device. Or the server may broadcast tags that are dynamically synchronized with the video content. The interactive material includes information that includes at least one of the following: text, images, video, audio. The interactive material is received wirelessly at the mobile device. The video device comprises a television or video player. The interactive material supplements the video content. The mobile device comprises at least one of the following: a remote controller, a personal digital assistant, or a mobile telephone. At least some of the interactive material is delivered based on preference information associated with a viewer of the video content.


In general, in another aspect, the invention features a method that includes receiving interactive material at least partly wirelessly at a mobile device in the vicinity of a video device on which video content is being displayed, the interactive material being related to the video content.


In general, in another aspect, the invention features a method that includes creating interactive material that is adapted to be delivered to or received at a mobile device in the vicinity of a video device on which video content is being displayed, the interactive material being created to be related to the video content.


In general, in another aspect, the invention features apparatus that includes a mobile device including software configured to cause the mobile device to present interactive material in the vicinity of a video device on which video content is being displayed, the interactive material being related to the video content. In implementations of the invention, the mobile device may include a display screen, a user input/output facility, and a wireless communication facility. The mobile device may be a personal digital assistant, a telephone, or a remote controller.


In general, in another aspect, the invention features apparatus that includes a server including software configured to deliver interactive material at least partly wirelessly to a mobile device in the vicinity of a video device on which video content is being displayed, the interactive material being related to the video content.


In general, in another aspect, the invention features a storage medium bearing software adapted to configure a device to receive interactive material at least partly wirelessly at a mobile device in the vicinity of a video device on which video content is being displayed, the interactive material being related to the video content.


In general, in another aspect, the invention features a storage medium bearing software adapted to configure a device to deliver interactive material to a mobile device in the vicinity of a video device on which video content is being displayed, the interactive material being related to the video content.


In general, in another aspect, the invention features a method that includes receiving at a server an identification of video content, identifying interactivity material related to the video content, and enabling a user to interact with the interactivity material in connection with viewing of the video content. In implementations of the invention, the user is enabled to indicate interactivity material of interest to him, tags are generated in response to the user's indications of material that interests him, the material of interest is provided to the user at a later time based on the tags, and the interactivity material is presented to the user synchronously with the related video content.


Among the advantages of the invention are one or more of the following:


Instead of requiring modifications to the video content broadcast stream or demanding that a user purchase additional equipment, the invention works with existing video systems and a user's handheld device, requiring only an Internet connection.


The invention incorporates an intuitive, aesthetic interface that allows a user simply to tap on the screen of his handheld when TV content of interest appears on the TV screen.


For the content provider, all that is required is to provide the information necessary for the interactive links on its own servers.


The invention allows PDA and mobile phone users to expand their TV viewing experience by letting them “grab” subjects or individual items in TV programs or commercials and gather further information on content of interest. The invention resolves the design battle for screen real estate by allowing the enhanced television experience to occur in the palm of the hand, rather than on the television set. The invention avoids interrupting the viewing experience, which is protected because the user is not required to deal immediately with the grabbed content.


The invention enhances television commercials, allowing users to acquire additional information as well as purchase goods. Advertisers can offer users targeted promotions as well as gain instant feedback about the effectiveness of their advertising.


The invention enables a user's mobile device to act also as a universal remote control for the user's television and entertainment system. The mobile device can also display functions of an Electronic Programming Guide (EPG), give specifically targeted promotions for programming, and offer easily accessed program schedules, all within the same device that provides the content enhancements.


Given user permissions, the mobile device can organize content “grabbed” by the user so that content is hierarchically displayed according to a user's pre-set interests. Advertisers can use this information to offer user-targeted promotions. The system can also allow filtering to streamline the display based on the user's preferences. For instance, a hockey fan viewing enhanced sports content from a news broadcast may not want to see further information on the day's tennis match. He can set his profile to indicate this.


Other advantages and features will become apparent from the following description and from the claims.





DESCRIPTION


FIGS. 1 through 23 show user interface screens.



FIG. 24 is a block diagram.





As shown in FIG. 24, a user's experience in viewing and using video content on a video device 10 can be enhanced by enabling the user 12 to interact with a mobile device 14 (such as an advanced remote controller) that is synchronized with the video content, for example, by indicating other content that may be of interest to the user. As the user views the video content, he may be prompted periodically, e.g., by a “hot” icon 16 on a display 18 of the mobile device or in some other way, about an opportunity for interactivity that coincides with or is related to the video content.


The viewer may indicate an interest in the available interactivity by, for example, invoking the icon, which triggers the generation of a tag 20. The tag may be stored in the local memory 22 of the mobile device 14 and/or provided in some way to a remote server 24. The tags enable the server to provide the interactivity to the user either immediately or at some later time or in some other way.


Tagging can occur in any of at least the following four ways:


1. The tag can be a timestamp that captures the time (using a system clock on the mobile device) and the channel of video content that is being viewed at that time. The time and channel identity can together be called the “coordinates”. When the coordinates 23 are sent to a remote server 24, the server can retrieve interactivity information and functions 26 corresponding to the coordinates and can return the information or provide the functions to the mobile device. (An illustrative sketch of this coordinate-based approach appears after this list.)


2. The tag can be in the form of a query 26 sent to the server at the exact time that the user indicates interest by invoking the icon. The server responds with the corresponding information or functions, which are then stored in the mobile device for use in interacting with the user.


3. The tagging can be done by the server in response to the mobile device querying the server at the exact time of the viewer's indicated interest. The server may generate the tag including embedded links 28 and return it to the mobile device, where it is stored. Later, by clicking on a displayed version of this tag and on displayed versions of the embedded links, the user can access the server to retrieve the interactivity information or functions.


4. Alternatively, the server may constantly broadcast a changing set of tags that are dynamically synchronized to the changing video content being viewed by the user. The mobile device displays the tags as they are received. When the user sees something of interest, he invokes the currently displayed tag, which corresponds to the current video content. The tag holds embedded links and the user can then access interactivity information or functions that correspond to the links.
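

By way of illustration only, the following sketch (in Python) models the first, coordinate-based tagging approach. The names used here (Tag, make_tag, lookup_interactivity, schedule) are assumptions made for this example and are not part of the method as claimed; the sketch simply captures a time and a channel as “coordinates” and uses them to look up related interaction content.

    # Illustrative sketch only; names and data structures are assumptions.
    import time
    from dataclasses import dataclass

    @dataclass
    class Tag:
        # The "coordinates": the time captured from the system clock on the
        # mobile device together with the channel of video content being viewed.
        timestamp: float
        channel: int

    def make_tag(current_channel: int) -> Tag:
        # Method 1: generate a tag when the user invokes the "hot" icon.
        return Tag(timestamp=time.time(), channel=current_channel)

    def lookup_interactivity(tag: Tag, schedule: dict) -> dict:
        # Server side: retrieve interactivity information and functions
        # corresponding to the coordinates. 'schedule' is a hypothetical
        # mapping from (channel, start_time, end_time) to interaction content.
        for (channel, start, end), content in schedule.items():
            if tag.channel == channel and start <= tag.timestamp < end:
                return content
        return {}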


The tagging methods allow the user to access and retrieve additional information and functions synchronously or asynchronously with the changing stream of video content. For example, the additional information or functions could involve (a) additional news that supplements news contained in the video content, (b) e-commerce opportunities, (c) web pages on the Internet, (d) participation in newsgroups, (e) an opportunity to ask questions, (f) setting the mobile device to perform some action such as recording future video content, or (g) any other information, action or transaction that the host of the remote server 24 chooses to make available.


The mobile device 14 may be powered by a battery 30 and include an input/output capability 32, such as a touch screen, a keypad, or a microphone to receive voice. A wireless access capability 34 may be included to provide for wireless access to the video device, both to control the video device (much as a conventional remote controller would do) and to display information on the video device. A second wireless access capability 36 provides wireless access to the Internet or other wide area network to carry the tags, queries, coordinates, and interactivity information and functions. (In some cases, wireless access 34 to the video device will be sufficient to provide the functions of wireless capability 36 if the video device has the ability to communicate through the network 40.)


The mobile device runs software 42 that is configured to perform the following functions: (1) provide secure networking capability to access the server 24 through the network 40, (2) display text and images and play sound and video, (3) tag information as explained earlier, and (4) store information as explained earlier.
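

A minimal sketch, assuming a simple HTTP exchange, of how software 42 might combine the networking, tagging, and storage functions listed above. The MobileClient class, the "/interactivity" endpoint, and the JSON payload format are hypothetical and introduced only for illustration.

    # Illustrative sketch only; class name, endpoint, and payload are assumptions.
    import json
    import urllib.request

    class MobileClient:
        # Models functions (1), (3), and (4): networking to the server,
        # tagging, and local storage of tags for later use.
        def __init__(self, server_url: str):
            self.server_url = server_url
            self.stored_tags = []          # local memory of the mobile device

        def store_tag(self, tag: dict) -> None:
            # Keep the tag locally so the user can act on it at a later time.
            self.stored_tags.append(tag)

        def fetch_interactivity(self, tag: dict) -> dict:
            # Send a tag to the server and retrieve the related interaction content.
            data = json.dumps(tag).encode("utf-8")
            req = urllib.request.Request(self.server_url + "/interactivity",
                                         data=data,
                                         headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)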


The interactivity information and functions, which can together be called the “interaction content” 46, are created by an interaction content creator using content creation software 44, and then loaded into the server for use at run time. The content creator develops the interaction content based on the video content on various channels of the video feed 60 that are known to the creator in advance of the feed being delivered to the user's video device. Each item of interaction content may be related to one or more segments of video content and may be associated with those segments as part of the content creation.


The content creation software provides the following functions: (1) a graphical user interface that enables the content creator to author information and functions that are to be synchronized to video content that will be conveyed to the user, (2) a tag generation function that enables the creator to build tags that include audio, video, text, and images, (3) a function that enables a content creator to build icons and interactivity information and functions, (4) testing software to validate links, and (5) staging software to mount finished interactivity content onto the staging server 24. The interaction content is loaded into the interactivity information and functions 26 stored at the server. At the same time, a table or map 29 that associates video content segments with related interaction segments can be loaded into the server.
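

The following sketch suggests one way, among many, that the table or map 29 could be represented; the keying of segments by channel and time window, and the field names, are assumptions for illustration only, with an example entry drawn from the banner shown in FIG. 2.

    # Illustrative sketch only; the structure of map 29 is an assumption.
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class VideoSegment:
        channel: int
        start: float    # assumed segment start time
        end: float      # assumed segment end time

    @dataclass
    class InteractionItem:
        title: str
        links: list = field(default_factory=list)   # embedded links to information or functions

    # Map 29: associates video content segments with related interaction segments.
    interaction_map = {
        VideoSegment(channel=82, start=0.0, end=600.0):
            InteractionItem(title="The Source Hip-Hop Music Awards",
                            links=["https://example.com/awards"]),   # hypothetical link
    }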


The server runs server staging software 50 that (1) manages and hosts interactivity content 46 created using the content creation software, (2) affords networking capability to manage multiple simultaneous requests from mobile devices, (3) manages preference and personalization information 52 that customizes the interactivity content to the user, (4) builds reports on usage to enable feedback for advertising and marketing purposes, and (5) enables creation and recording of e-commerce transactions. At run-time, the server can use tags generated by user interaction with the mobile device indicating the video content being viewed to identify and serve related interaction content, based on associations stored in the map 29. The server can also base the interaction content on the more general preference and personalization information 52.
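

Continuing the hypothetical structures of the preceding sketch, the fragment below illustrates how the staging software 50 might, at run time, resolve a tag to interaction content through map 29 and filter the result against preference and personalization information 52 (for example, the hockey fan who has excluded tennis items). The function and field names are assumptions, not the claimed implementation.

    # Illustrative sketch only; the serving and filtering logic is assumed.
    def serve_interaction(tag: dict, interaction_map: dict, preferences: dict = None):
        # Resolve the tag (channel and time) to interaction content via map 29,
        # then filter using the preference and personalization information 52.
        excluded = (preferences or {}).get("excluded_topics", [])
        for segment, item in interaction_map.items():
            if segment.channel == tag["channel"] and segment.start <= tag["time"] < segment.end:
                if any(topic.lower() in item.title.lower() for topic in excluded):
                    continue   # skip items the user has filtered out
                return item
        return None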


Referring in more detail to the mobile device, the wireless capability 34, 36 could use any of a wide variety of protocols including PCS, cellular, Bluetooth, 802.11, WiFi, Infrared, or alternative technology. The mobile device could be able to connect to and control the video feed 60 that drives the video device, for example, in the way that a universal remote controller controls a television, VCR, settop box, or other video device. The mobile device must also simultaneously connect to the network 40.


The mobile device could be, for example, a wireless-enabled PDA such as a Compaq iPaq or Palm Pilot, or an advanced functionality remote controller of the kind that includes a color LCD screen, an infrared link to control the television, VCR, settop box, or other video device, and an 802.11b card to connect to a local area network that provides access to the Internet.


In one example of the use of the system, a user watching a television would also see a video or still picture on the screen of his remote controller that is serving as the mobile device. The video or still picture (which may be thought of as a mobile feed 62 that is separate from the video feed) changes continually (as arranged by the content creator) to correspond to the video content carried on the video feed.


If the user changes the channel of the video feed using the remote controller, the mobile feed is automatically changed by the server to the new channel. The information and functions on the remote controller are updated in real-time over the network and are synchronized with the video feed by the server.


From time to time, the mobile feed displayed on the remote controller will display a “hot” icon to prompt the user that interactivity is available. The interactivity opportunity is predetermined by the content author of the mobile feed. If the video content on the video feed piques the user's interest at the time of the prompting, the user can tap the hot icon on the screen (or invoke the interactivity opportunity by voice command or keyboard), which tags the content. From a user interface perspective, when the user tags the content, the hot icon moves from the mobile feed window to a folder or area where it is stored for later access. The stored icon then represents the link that the user can later click to access interactivity information or functions that the author of the tag has created. The user can access, save, store and send the information or function. The author of the tag can determine whether the user can edit the information or function. The user may choose to access the tagged information on the mobile device or may send it to another device such as a computer, television or other device that has a screen and input/output capability. The interactivity content is portable and may be sent to other people and other devices.


To enable the synchronization of content between the video feed and the mobile feed, the mobile device is able to identify the channel that the video feed is on and report the channel to the server. If the mobile device is one that has remote controller functionality, then when the user changes the channel, the channel identity is automatically known to the controller and can be reported to the server.


Alternatively, the user may specifically enter the number of the channel into the mobile device at the time he changes channels, or the television, settop box, or VCR may send the current channel information to the mobile device at the time the user changes the channel. By determining the channel information, the mobile device can provide information to the server necessary to synchronize the mobile feed with the video feed.
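

As a hedged example of the channel reporting just described, the sketch below sends the newly selected channel to the server so that the mobile feed can be switched to match the video feed. The "/channel" endpoint and the report format are assumptions for illustration.

    # Illustrative sketch only; endpoint path and report format are assumptions.
    import json
    import time
    import urllib.request

    def report_channel_change(server_url: str, new_channel: int) -> dict:
        # Report the newly selected channel so the server can keep the mobile
        # feed synchronized with the video feed.
        report = json.dumps({"channel": new_channel,
                             "time": time.time()}).encode("utf-8")
        req = urllib.request.Request(server_url + "/channel",
                                     data=report,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)   # the server's updated mobile feed information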


The content creation software may be similar to a simple version of website development software.


The mobile feed controls the synchronization of the video feed with the information that the mobile device retrieves. Sometimes, the mobile feed can provide ancillary information to augment or complement the video feed. The content creation software also provides tools to embed in the mobile feed hot icons to indicate interactivity. It also has a module to create the links and to build the tags and information that are staged on the server and will be viewed on either the mobile device or other end user device.


The server stages the mobile feed and the embedded links. The content creator uses available software to create both the mobile feed and the embedded information, which the user accesses when the user tags the content. The mobile feed may be text, a picture, or even full video.


The source of the video feed and the source of the mobile feed can be located in one place or in two different places.


An example of a user interface that could be provided to a user of a mobile device is illustrated in FIGS. 1 through 23. During a television show, the PDA may display images or information that relate directly or indirectly to what is being shown on the television program.


The user can bookmark or flag (i.e., tag) pieces of information that come from the mobile feed as he sees them on the PDA. Items that are bookmarked or flagged can be retrieved later from the server at a time convenient to the user.


For example, if the user sees a short clip of a program, such as a baseball game, on the television, he can indicate through the PDA interface that he wishes to bookmark the program. Later he can retrieve additional information about the program from the server.


In general, the invention enables a user to work with a second parallel synchronized source of information and interaction while watching a television show, to identify items of interest that are displayed on the PDA, and to later retrieve or take some other action with respect to the items of interest.



FIG. 1 shows the channel and volume controls on the left side that would be used for remote control of the television. The current channel is MTV News, channel 082.



FIG. 2 shows a banner related to “The Source Hip-Hop Music Awards,” which is associated with a segment being shown on the television. If the user is interested in bookmarking this item, he presses on the place where the down arrow is shown, and his request is transmitted to the server for storage.


On FIG. 3, a different banner is shown and the user again has the opportunity to bookmark the item by pressing the down arrow.


On the next screen, FIG. 4, an icon associated with the banner that appeared on FIG. 3 is shown in the bottom half of the screen indicating that the user has bookmarked this item.


On FIG. 7, the user has pressed the nine-button icon in the upper left corner, indicating a desire to enter information on a numeric keypad, and the buttons of the icon have turned black.


In FIGS. 8, 9, and 10, the keyboard is scrolled from left to right onto the screen. In FIGS. 11, 12, and 13, the user has pressed 043 to change the channel to CNN Headline Sports. The information is sent to the Internet server, which then changes the program material that is being transmitted to the user's television. The banner for that program is then displayed on the top of the screen.


In FIG. 15, the user sees a banner for the NASCAR Brickyard 400 program and has the opportunity to press a large down arrow to bookmark it or to press a small down arrow that is labeled ticker.


In FIG. 16, the NASCAR icon has been added to the bookmarked items.


In FIG. 17, the banner has changed to the Senior Burnet Classic in accordance with a short item being shown at the same time on the television. As before, the user has the chance to add this to his bookmarked set at the bottom half of the screen.


In FIG. 18, the Major League Baseball scores are shown. In FIG. 19, the user has highlighted the score of the Brewers-Dodgers game and added it to the bookmarked items.


In FIG. 21, the user is shown information about the Brickyard 400 race, including the three leading contenders. The user is given the chance to view the race or scores or to buy stuff.


On FIG. 22, the user has pressed the item labeled “the RACE” and is shown four thumbnails of race pictures.


By pressing on the lower left-hand picture, the user is shown an enlarged image on FIG. 23 together with a text caption.


The ticker arrow allows the user to scroll through an entire sequence of different short clips, just as, for example, the television may broadcast a series of short clips of baseball games. Because the Internet server has personalization information about the user, the ticker can be altered to suit the user's tastes so that when he presses the ticker arrow, he sees a sequence of short clips that are of interest to him.


Any icon that has been generated as a result of bookmarking can be invoked at any time by the user by simply pressing that icon. Then the Internet server will serve images, video, or information related to that icon.


Other implementations are within the scope of the following claims.

Claims
  • 1-34. (canceled)
  • 35. A method comprising: while primary video content is being presented through a first communication channel on a primary video device, on a handheld wireless device, displaying information describing video content available for viewing on the primary video device, the information received through a second communication channel; and in response to user input, displaying and storing a graphical element representing the information.
  • 36. The method of claim 35 comprising displaying second information describing second video content available for viewing on the primary video device, the second information received through the second communication channel.
  • 37. The method of claim 35 comprising displaying information on the handheld wireless device related to the primary video content displayed on the primary video device.
  • 38. The method of claim 35 in which the information describing video content available for viewing is displayed in response to user input.
  • 39. The method of claim 38 in which the user input comprises an invocation of the graphical element.
  • 40. The method of claim 35 comprising receiving the information describing video content from an Internet server.
  • 41. The method of claim 35 comprising displaying a graphical element representing interactivity material.
  • 42. The method of claim 35 in which the information describing video content includes an image.
  • 43. The method of claim 42 in which the image comprises a thumbnail image.
  • 44. The method of claim 35 in which the information describing video content includes video.
  • 45. The method of claim 35 in which the information describing video content includes text.
  • 46. The method of claim 35 in which the user input comprises a selection of a channel.
  • 47. The method of claim 35 in which the information describing video content is provided by an electronic program guide.
  • 48. The method of claim 35 comprising accepting a choice by the user of displaying, on the primary video device, the video content available for viewing.
  • 49. The method of claim 35 comprising generating a tag representative of the information represented by the graphical element.
  • 50. The method of claim 49 in which the tag is generated by capturing a time and a channel of video content.
  • 51. The method of claim 49 in which the tag is generated by sending a query from the handheld wireless device to a server in response to user input.
  • 52. The method of claim 51 in which information or functions corresponding to the query are received from the server and stored in the handheld wireless device for use in interacting with the user.
  • 53. The method of claim 49 in which the tag is generated by a server in response to a query from the handheld wireless device.
  • 54. The method of claim 35 in which the information describing video content comprises a promotion.
  • 55. The method of claim 54 in which the promotion is user-targeted.
  • 56. The method of claim 35 in which the information describing video content comprises an advertisement.
  • 57. The method of claim 56 in which the advertisement is user-targeted.
  • 58. The method of claim 56 in which the information comprises an advertisement for video content.
  • 59. The method of claim 58 comprising displaying video content represented by the advertisement.
  • 60. The method of claim 58 in which the advertisement for video content is a television advertisement.
  • 61. The method of claim 35 comprising sending, to another user, a reference to the information.
  • 62. The method of claim 35 comprising sending, to another mobile device, a reference to the information.
  • 63. The method of claim 35 comprising sending, to a computer, a reference to the information.
  • 64. The method of claim 35 comprising sending, to a video device, a reference to the information.
  • 65. The method of claim 35 comprising storing a reference to the information.
  • 66. The method of claim 35 in which the information is displayed in a first portion of a display of the handheld wireless device.
  • 67. The method of claim 66 in which the graphical element is displayed in a second portion of the display of the handheld wireless device.
  • 68. The method of claim 35 in which the graphical element is an icon.
  • 69. The method of claim 35 in which the graphical element is received through the second communication channel.
  • 70. A method comprising: while primary video content is being presented through a primary communication channel on a video device, on a handheld wireless device, displaying a graphical element representing interactivity material matched with the video program, the interactivity material received through a second communication channel; and in response to user input, displaying the interactivity material.
  • 71. The method of claim 70 in which the interactivity material is received from another user.
  • 72. The method of claim 70 in which the interactivity material is an advertisement.
  • 73. The method of claim 72 in which the advertisement is user-targeted.
  • 74. The method of claim 70 in which the interactivity material involves an e-commerce opportunity.
  • 75. The method of claim 70 in which the interactivity material is a promotion.
  • 76. The method of claim 75 in which the promotion is user-targeted.
  • 77. The method of claim 70 in which the user input comprises an invocation of the graphical element.
  • 78. The method of claim 70 in which the user input comprises a voice command.
  • 79. The method of claim 70 in which the user input comprises keyboard input.
  • 80. The method of claim 70 in which the user input comprises microphone input.
  • 81. The method of claim 70 comprising, in response to user input, storing a reference to the interactivity material.
  • 82. The method of claim 70 comprising, in response to user input, retrieving a stored reference to the interactivity material.
  • 83. The method of claim 70 comprising, in response to user input, sending a reference to the interactivity material to another user.
  • 84. The method of claim 70 comprising, in response to user input, sending a reference to the interactivity material to another device.
  • 85. The method of claim 70 comprising, in response to user input, sending a reference to the interactivity material to a web site.
  • 86. The method of claim 70 comprising receiving the interactivity material from another user.
  • 87. The method of claim 70 in which the interactivity material comprises an image.
  • 88. The method of claim 70 in which the interactivity material comprises video.
  • 89. The method of claim 70 in which the interactivity material comprises text.
  • 90. The method of claim 70 in which the interactivity material comprises an embedded link.
  • 91. The method of claim 70 in which the interactivity material is displayed in a first portion of a display of the handheld wireless device.
  • 92. The method of claim 91 in which the graphical element is displayed in a second portion of the display of the handheld wireless device.
  • 93. The method of claim 70 in which the graphical element is an icon.
  • 94. A method comprising: at a mobile device separate from a video device that is presenting first primary video content, communicating, to a server separate from the source of the first primary video content, an identification of the first primary video content, automatically receiving, from the server, and displaying first secondary content related to the first video content, directing the video device to present second primary video content different from the first primary video content and selected by a user of the mobile device, communicating an identification of the second primary video content to the server, and automatically receiving, from the server, and displaying second secondary content related to the second primary video content.
  • 95. The method of claim 94 comprising, at the mobile device, while displaying the second secondary content, displaying an icon corresponding to the first secondary content.
  • 96. The method of claim 94 comprising, at the mobile device, in response to a user input, displaying the first secondary content and displaying an icon corresponding to the second secondary content.
  • 97. The method of claim 94 comprising, at the mobile device, generating an indication of user interest in the first secondary content or the second secondary content, and receiving from the server information usable to access interactivity material related to the primary video content corresponding to the indicated secondary content.
  • 98. The method of claim 97 comprising, at the mobile device, receiving the interactivity material.
  • 99. The method of claim 97 comprising, at the mobile device, in response to user input, displaying the interactivity material.
  • 100. The method of claim 99 comprising, at the mobile device, while displaying the interactivity material, displaying icons corresponding to one or more of the first or second primary video content or the first or second secondary content.
  • 101. The method of claim 94 in which the secondary content comprises an electronic programming guide.
  • 102. The method of claim 94 in which the secondary content comprises a program schedule.
  • 103. The method of claim 94 in which the secondary content is used to set the mobile device to record future video content.
  • 104. The method of claim 94 in which the secondary content relates directly or indirectly to what is being shown on a television program.
  • 105. The method of claim 104 in which the secondary content that relates to what is being shown on a television program comprises video content.
  • 106. The method of claim 94 in which the secondary content comprises ancillary information to complement the video feed.
  • 107. The method of claim 94 in which the secondary content comprises an advertisement.
  • 108. The method of claim 94 in which the secondary content comprises a targeted promotion.
  • 109. The method of claim 94 in which the secondary content allows a user to purchase goods.
  • 110. The method of claim 109 in which the goods are associated with the primary video content.
  • 111. The method of claim 94 in which the secondary content is based on a user profile.
  • 112. The method of claim 94 in which the secondary content comprises content available on a web page on the Internet.
  • 113. A method comprising: at a mobile device separate from a video device that is presenting primary video content, communicating, to a server separate from the source of the primary video content, an identification of the primary video content, and automatically receiving, from the server, and displaying an electronic programming guide related to the primary video content.
  • 114. An apparatus comprising: a mobile device having a first display screen and configured to communicate, to a server separate from a second display screen, an identification of first primary video content being presented on the second display screen, automatically receive, from the server, and display on the first display screen first secondary content related to the first video content, direct the second display screen to present second primary video content different from the first primary video content and selected by a user of the mobile device, communicate an identification of the second primary video content to the server, and automatically receive, from the server, and display on the first display screen second secondary content related to the second primary video content.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation (and claims the benefit of priority under 35 USC 120) of U.S. application Ser. No. 09/950,321, filed on Sep. 10, 2001, which claims the benefit of U.S. provisional application No. 60/231,285, filed on Sep. 8, 2000, both of which are incorporated herein by reference.

Provisional Applications (1)
Number Date Country
60231285 Sep 2000 US
Continuations (1)
Number Date Country
Parent 09950321 Sep 2001 US
Child 12771367 US