This invention relates to the field of content playback and more particularly to using gestures to move content from a first device to a second device.
Devices such as televisions, media players, cellular phones, etc., are all capable of reproducing various content such as movies, video clips, music, etc. Often, multiple devices in the same general area are capable of reproducing the same content, often with better or worse quality. For example, a portable music player may be used in the same room as a stereo system, or a cellular phone showing a video may be used in the same room as a high-definition television. Many situations occur in which a user is enjoying particular content on one device and desires to continue enjoying that content on a different device. For example, a user listening to a playlist of songs while away from home may want to continue with the current song, and the rest of the playlist, on their home stereo when they return home. In another example, the user is watching a movie on a portable movie player or cellular phone and relocates to their living room, where there is a large-screen, high-definition television. The user may desire to continue watching the movie on that television.
In the past, the user was able to listen to the music on their stereo or watch the same movie on their television, but the particular content and/or playlist needed to also be available on the destination device. Even so, the user needed to start the content (e.g., play the movie or start the playlist) and then maneuver to the location in the content at which they left off on the other device. For example, if watching a movie, the user would have to fast forward to find the location at which they left off on the first device. This takes time and effort to find the correct location at which to continue the presentation.
What is needed is a device cooperation system that transfers content from a first device to a second device.
The present invention comprises a system that enables cooperating devices to transfer presentation of content from one device to another by sending either the content itself or an identification of the content from a source device to a destination device. In some embodiments, the actual content is transferred, while in other embodiments, an identification of the content and a position within the content are transferred from the source device to the destination device.
In one embodiment, a system for transferring presentation of content from a source device to a destination device is disclosed. The source device has a display and a mechanism for detecting a gesture. The destination device also has a display and is communicatively coupled to the source device. Content is currently being presented on the display of the source device. Upon detecting the gesture, it is determined whether the gesture indicates a request to transfer presentation of the content from the source device to the destination device and, if so, the presentation of the content is transferred from the source device to the destination device.
In another embodiment, a method of transferring the presentation of content from a source device to a destination device is disclosed. The source device has a display and a means for detecting a gesture, and the destination device has a second display. The source device is communicatively coupled to the destination device. Some form of content is currently presented at the source device, and a gesture is received by the means for detecting a gesture. If the gesture indicates moving the presentation of the content from the source device to the destination device, a current index into the content is determined and the destination device is commanded to initiate presentation of the content at the current index. If the gesture indicates moving the presentation of the content from the destination device to the source device, the destination device is commanded to initiate a transfer of the presentation of the content from the destination device to the source device; the source device receives the transfer of the presentation of the content from the destination device and presents the content.
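The following is a minimal sketch, not part of the specification, of the source-side logic described in this embodiment. The gesture names, the link object, and the player methods are hypothetical assumptions used only to illustrate the sequence of steps.

```python
# Hypothetical sketch of the source-side transfer logic; gesture names,
# link API, and message fields are illustrative assumptions, not part of
# the specification.

def on_gesture(gesture, player, link):
    if gesture == "move_to_destination":
        # Determine how far into the content the user has progressed.
        index = player.current_index()
        # Command the destination device to begin presentation at that index.
        link.send({"command": "present",
                   "content_id": player.content_id(),
                   "index": index})
        player.stop()
    elif gesture == "move_to_source":
        # Ask the destination device to transfer presentation back.
        link.send({"command": "move_from"})
```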
In another embodiment, a system for transferring presentation of content from a source device to a destination device is disclosed. The source device and the destination device each have a display. A wireless interface communicatively couples the destination device and the source device. Some form of content is currently being presented on the source device. The source device has a mechanism for detecting a gesture. If the gesture indicates a request to transfer presentation of the content from the source device to the destination device, presentation of the content is transferred from the source device to the destination device.
The invention can be best understood by those having ordinary skill in the art by reference to the following detailed description when considered in conjunction with the accompanying drawings in which:
Reference will now be made in detail to the presently preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Throughout the following detailed description, the same reference numerals refer to the same elements in all figures. The examples described in this document show portable display devices such as media players (e.g., video players, music players, etc.) and television devices. Any device that has any type of display (e.g., LCD, LED, plasma, CRT, OLED, e-paper, etc.) and/or audio output device (e.g., speakers, headphones, etc.) is anticipated. Devices include, but are not limited to, televisions, portable televisions, cellular phones, media players, video players, music players, monitors, computer systems, notebook computer systems, electronic books, tablet computers, etc. Although the examples show video content, any type of content is anticipated, such as television programs, movies, songs, albums, CDs, playlists, etc.
Referring to
After communication is established, in one embodiment, the source device 20 transfers the presentation of the content by sending information regarding the content to the destination device 10 over the wireless path 25. For example, if the user is watching a broadcast movie on the source device 20, sufficient information is transferred from the source device 20 to the destination device 10 such that the destination device 10 knows how to tune to the particular broadcast, cable, or satellite channel. In this embodiment, unless the source device 20 includes a personal video recorder, the destination device 10 tunes to the program (e.g., show, movie, music) and, because both devices receive the same live signal, the program is at the same point as was being viewed on the source device 20.
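As one illustration of this broadcast case, the information sent over the wireless path 25 might be as simple as the delivery medium and channel. The message format and field names below are assumptions for illustration only and are not defined by the specification.

```python
import json

# Hypothetical tuning message for broadcast/cable/satellite content; the source
# device sends enough information for the destination device to tune to the
# same live program, which is therefore already at the same point in time.
tune_message = {
    "command": "tune",
    "medium": "cable",   # e.g. "broadcast", "cable", or "satellite"
    "channel": 503,      # channel the source device is currently showing
}
payload = json.dumps(tune_message).encode("utf-8")  # sent over wireless path 25
```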
In another embodiment, in which the same content is available to the destination device 10 as to the source device 20, the source device 20 sends the identity of the content and an index into the content to the destination device 10. The destination device 10 finds and opens the content, then seeks to the index position within the content to begin presentation at the same position as was being viewed or listened to on the source device 20. For example, the identity is a file name or content title, and the index is a time offset in the example of a movie, an integer indicating a position within a playlist, etc.
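A sketch of the identity-and-index message for this embodiment is shown below. The field names are hypothetical; as described above, the index is a time offset for a movie or an integer position for a playlist.

```python
import json

# Hypothetical "present" message: identifies the content and where to resume.
# A movie uses a time offset in seconds; a playlist uses a track position.
present_movie = {"command": "present",
                 "content": "agaig.mov",   # file name or content title
                 "index": 3725.0}          # seconds into the movie

present_playlist = {"command": "present",
                    "content": "road trip playlist",
                    "index": 7}            # seventh item in the playlist

payload = json.dumps(present_movie).encode("utf-8")  # sent over wireless path 25
```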
In another embodiment, in which the content is accessed by the source device 20 over a network and the same access is available to the destination device 10, the source device 20 sends the address of the content to the destination device 10 (e.g., a URL or network address such as X:\movies\agaig.mov). In some embodiments, the source device 20 also sends an index to the destination device 10. Upon reception, the destination device 10 accesses the content using the address transferred from the source device 20 and begins presentation of the content at the location indicated by the index.
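On the destination side, this embodiment amounts to opening the transferred address and seeking to the transferred index. The MediaPlayer object below is a hypothetical stand-in for whatever playback engine the destination device 10 uses; it is shown only to illustrate the sequence of steps.

```python
# Hypothetical destination-side handler for an address-based transfer; the
# media_player API is an assumption used only to illustrate the sequence of
# steps: open the shared address, seek to the index, begin presentation.

def handle_present_by_address(message, media_player):
    address = message["address"]      # e.g. a URL or network path
    index = message.get("index", 0)   # position within the content, if sent
    media_player.open(address)        # access the same content the source used
    media_player.seek(index)          # continue from where the source left off
    media_player.play()
```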
In yet another embodiment, the remaining portion of the content is transferred from the source device 20 to the destination device 10. In such an embodiment, it is expected that the bandwidth of the wireless connection 25 is sufficient to transfer the content and/or that the content is buffered at the destination device 10 until sufficient content is available to present the content in a continuous fashion.
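A sketch of the buffering behavior for this embodiment follows. The chunked link API and the buffering threshold are illustrative assumptions, not values taken from the specification.

```python
# Hypothetical buffering loop: the destination device 10 accumulates enough of
# the transferred content before starting playback so that presentation is
# continuous even if the wireless connection 25 delivers data unevenly.

BUFFER_BEFORE_PLAY = 5 * 1024 * 1024  # assumed threshold (5 MB) before playback

def receive_and_present(link, media_player):
    buffered = bytearray()
    started = False
    while True:
        chunk = link.receive_chunk()   # next piece of content over connection 25
        if chunk is None:              # transfer complete
            break
        if started:
            media_player.feed(chunk)   # keep the playback buffer filled
        else:
            buffered.extend(chunk)
            if len(buffered) >= BUFFER_BEFORE_PLAY:
                media_player.play_stream(bytes(buffered))  # begin presentation
                started = True
```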
Referring to
In
It is anticipated that, in this direction, the destination device 10 need only send the content identification and index, since it is known that the content is available to the source device 20 because the source device 20 was originally presenting the content. The index indicates the location in the content up to which the viewer has already viewed on the destination device 10. In this way, the viewer, through use of two different gestures 24/26, transfers presentation of the content back and forth between the source device 20 and the destination device 10.
Referring to
Although the examples use a gesture, it is anticipated that in some embodiments a keyboard command such as “enter” invokes the transfer of the recipe 32, or part of the recipe, to the stove/oven 40.
Referring to
Once connected, either by a command from the destination device 10 (e.g., a remote control command, etc.) or by the gesture 24, presentation of the content 27 is transferred from the source device 20 to the destination device 10 as previously described.
Referring to
If the destination device 10 receives a move-from command 224 (e.g., a packet, etc.) 216, the destination device 10 determines the current index 232 into the content and sends the identification of the content and the index 236 to the source device 20 and, in some embodiments, stops 240 displaying the content on the display 7.
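A sketch of this destination-side handling of the move-from command 224 is shown below, reusing the same hypothetical link and player APIs as the earlier sketches.

```python
# Hypothetical handler on the destination device 10 for a "move_from" command:
# report the current content and index back to the source device 20 (steps 232
# and 236) and, in some embodiments, stop presenting locally (step 240).

def handle_move_from(link, media_player, stop_local=True):
    reply = {"command": "present",
             "content": media_player.content_id(),   # identification of the content
             "index": media_player.current_index()}  # how far the viewer has watched
    link.send(reply)
    if stop_local:           # some embodiments keep displaying on the destination
        media_player.stop()
```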
Referring to
If the gesture 24/26 is a move-from gesture 320, a request is made 324 to the destination device 10 to retrieve the identity of the current content and/or the index. Once a response is received 328, presentation of the content is started 332 on the source device 20 at the same position as was being watched on the destination device 10.
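The corresponding source-side steps 320 through 332 might look like the following, again using hypothetical link and player APIs that are not part of the specification.

```python
# Hypothetical source-side handling of the move-from gesture: request the
# current content and index from the destination device (step 324), wait for
# the reply (step 328), then resume presentation at the same position (step 332).

def move_presentation_back(link, media_player):
    link.send({"command": "move_from"})   # request content/index from destination
    reply = link.receive()                # e.g. {"content": ..., "index": ...}
    media_player.open(reply["content"])   # content is known to be available here
    media_player.seek(reply["index"])     # continue from where viewing stopped
    media_player.play()
```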
Equivalent elements can be substituted for the ones set forth above such that they perform in substantially the same manner to achieve substantially the same result.
It is believed that the system and method of the present invention and many of its attendant advantages will be understood from the foregoing description. It is also believed that it will be apparent that various changes may be made in the form, construction, and arrangement of the components thereof without departing from the scope and spirit of the invention or without sacrificing all of its material advantages. The form hereinbefore described is merely an exemplary and explanatory embodiment thereof, and it is the intention of the following claims to encompass and include such changes.