System and method of displaying a video stream

Information

  • Patent Grant
  • Patent Number
    9,571,702
  • Date Filed
    Tuesday, February 5, 2013
  • Date Issued
    Tuesday, February 14, 2017
Abstract
A particular method includes receiving a video stream to be displayed on a display device. The method also includes, during a first time period, sending a low resolution version of the video stream to the display device while recovering a full resolution version of the video stream. The method further includes synchronizing the low resolution version of the video stream with the full resolution version of the video stream. The method also includes, during a second time period after the first time period, switching from sending the low resolution version of the video stream to the display device to sending the full resolution version of the video stream to the display device.
Description
FIELD OF THE DISCLOSURE

The present disclosure relates to a system and method of displaying a video stream.


BACKGROUND

In a variety of situations in which television content is transmitted digitally, a significant delay can occur when switching from one source of video to another. Typically, the delay occurs while changing a television channel, but it may also occur when starting a new video stream in other situations. The length of the delay depends on factors such as the type of video compression, the network bandwidth, and the decompression hardware. Further, the delay can occur when a new network stream is initiated, as well as when accessing video that is stored locally at a Digital Video Recorder (DVR), a Personal Video Recorder (PVR), or another local video storage device. Customers and users find this delay disconcerting. Moreover, the delay is of particular concern because analog televisions do not exhibit this type of delay.


Proposed solutions to the delay generally fall into two classes. The simplest approach is to provide access to videos through a user interface that effectively hides the delay. This is the approach used by a variety of products in which the customer is encouraged to navigate through an Electronic Program Guide (EPG) rather than directly switching channels. This method does not actually solve the problem; it simply encourages the user to behave in a manner that makes the delay less noticeable. The second approach is to overpower the delay by bursting a very large amount of video data in a short amount of time. This technique requires the transmission of up to 10 times the normal amount of data to provide sufficient video data to begin rendering the video shortly after a channel is changed (e.g., less than 100 msec).


Both of these approaches attempt to address the problem of video delays, but the first requires customers to change their behavior and the second places significant and expensive requirements on the local network and the rendering device.


Accordingly, there is a need for an improved system and method of displaying a video stream.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a video processing system;



FIG. 2 is a flow chart that illustrates a first embodiment of a method of displaying a video stream;



FIG. 3 is a flow chart that illustrates a second embodiment of a method of displaying a video stream; and



FIG. 4 is a flow chart that illustrates a third embodiment of a method of displaying a video stream.





DETAILED DESCRIPTION

The present disclosure is generally directed to a video stream processing system and to a method of displaying a video stream. In a particular embodiment, the method includes, during a first time period, displaying a first version of a received video stream while recovering a second version of the received video stream, the first version of the received video stream having a lower video display quality than the second version of the received video stream. The first time period begins no more than approximately 100 milliseconds after a detected channel change. The method also includes switching from display of the first version of the received video stream to display of the second version of the received video stream during a second time period.


In a particular embodiment, a video stream processing system includes a video generation module responsive to a video detection module. The video generation module is to communicate to a display device a first version of a received video stream while the video detection module recovers a second version of the received video stream. The first version of the received video stream has a lower video display quality than the second version of the received video stream. The video generation module is to switch from communicating the first version of the received video stream to communicating the second version of the received video stream to the display device. The video generation module is to communicate to the display device the first version of the received video stream during a first time period, and an audio signal at full resolution is provided to the display device during the first time period.


In a particular embodiment, a method of displaying a video stream includes displaying a first portion of a video stream according to first display characteristics while recovering a second portion of the video stream. The first display characteristics include a first resolution, and the first portion of the video stream is displayed during a first time period. An audio signal is output with full resolution during the first time period. The method includes switching from displaying the first portion of the video stream to displaying the second portion of the video stream. The second portion of the video stream is displayed according to second display characteristics, and the second display characteristics include a second resolution.


Referring to FIG. 1, a video stream processing system is illustrated and is designated 100. The video stream processing system 100 includes a video processing system 102 coupled via a communication link 120 to a video display device 130, such as a television. An example of an appropriate communication link is a coaxial cable. The video processing system 102 includes a video input 110 and includes a remote control interface 108. The remote control interface 108 receives signals, such as infrared signals 114, from a remote control device 112. The video processing system 102 further includes a video stream detection module 106 and a video generation module 104. The video stream detection module 106 receives the video input 110 and is responsive to signals received at the remote control interface 108.


The video stream detection module 106 forwards detected and processed video streams to the video generation module 104. The video generation module 104 provides a video signal to be communicated over the communication link 120, such as a coaxial cable, for display at the video display device 130.


In a particular embodiment, the remote control device 112 provides a channel request signal to request a channel change. The channel request signal may be communicated using the infrared or other wireless communication signal 114 to the remote control interface 108. The remote control interface 108 then communicates the received channel change request to the video stream detection module 106. The video stream detection module 106, in turn, tunes to the requested channel within the video input 110. Further, the video stream detection module 106 provides instructions and processed signals to the video generation module 104. Then, the video generation module 104 provides a video image signal that corresponds to the newly selected channel.
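The channel-change signal flow described above, from the remote control interface 108 through the video stream detection module 106 to the video generation module 104, can be sketched as follows. This is an illustrative sketch only; the class and method names are assumptions, not identifiers from the patent.

```python
class VideoStreamDetectionModule:
    """Tunes to a requested channel within the video input (hypothetical names)."""

    def tune(self, channel):
        # Tune to the requested channel and return a processed stream descriptor.
        return {"channel": channel, "stream": "stream-%d" % channel}


class VideoGenerationModule:
    """Produces the video image signal for the newly selected channel."""

    def render(self, processed):
        return "video-signal:%d" % processed["channel"]


class RemoteControlInterface:
    """Forwards a received channel change request through the modules of FIG. 1."""

    def __init__(self, detector, generator):
        self.detector = detector
        self.generator = generator

    def on_channel_request(self, channel):
        processed = self.detector.tune(channel)   # detection module tunes and processes
        return self.generator.render(processed)   # generation module renders the image signal
```

In this sketch, a channel request flows in one direction through the three components, mirroring the interface-to-detection-to-generation path of FIG. 1.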


Referring to FIG. 2, a particular embodiment of a method of displaying a video stream is illustrated. The method includes detecting a channel change request, at 202. The method further includes receiving a video stream to be displayed on a display device, at 204. Further, the method includes displaying a low resolution version of the video stream on the display device during a first time period while recovering a full resolution version of the video stream, as shown at 206. Typically, the first time period is in the range of 1-3 seconds, depending on the delay. As illustrated in FIG. 2, the method further includes synchronizing the low resolution version of the video stream with the full resolution version of the video stream, as shown at 208. During a second time period, at 210, the method includes switching from display of the low resolution version of the video stream to display of the full resolution version of the video stream. The second time period occurs after the first time period. Optionally, the method includes temporarily displaying a blurred image on the display device while switching between the low resolution version of the video stream and the full resolution version of the video stream, as shown at 212.
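The display selection implied by steps 206-212 of FIG. 2 can be sketched as a small decision function. The 0.2-second blur window used here is an illustrative assumption; the patent leaves the transition duration unspecified.

```python
def select_display(time_since_full_res_ready, blur_window_s=0.2):
    """Pick what to send to the display after a channel change (FIG. 2 sketch).

    `time_since_full_res_ready` is None while the full resolution version is
    still being recovered (steps 204-206); once recovery completes, it is the
    elapsed time in seconds since the full resolution version became available.
    """
    if time_since_full_res_ready is None:
        return "low_res"    # first time period: low resolution version (206)
    if time_since_full_res_ready < blur_window_s:
        return "blurred"    # optional transition image while switching (212)
    return "full_res"       # second time period: full resolution version (210)
```

The function is stateless: a caller would invoke it each frame interval with the current recovery status, so the switch at 210 happens as soon as the full resolution version is synchronized and ready.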


Referring to FIG. 3, another embodiment of a method of displaying a video stream is illustrated. The method includes detecting a channel change request, at 302, and receiving a video stream to be displayed on a display device, at 304. The method further includes, during a first time period, displaying a first version of the video stream on the display device while recovering a second version of the video stream, at 306. Typically, the first time period is in the range of 1-3 seconds, depending on the delay. In a particular exemplary embodiment, the video generation module 104 provides a low resolution version of the video stream of the newly selected channel while the video stream detection module 106 is processing the video stream to recover a full resolution version.


In a particular illustrative embodiment, an audio signal is provided to the display during the first time period and the audio signal is provided with full resolution such that a video display device user hears full audio while the first version of the video stream is being displayed. In a particular embodiment, the first version of the video stream has a lower video display quality than the second version of the video stream. For example, the first version of the video stream may have a reduced color set when compared to the second version of the video stream. As another example, the first version may have a reduced spectral frequency or other visual parameter that is degraded when compared to the second version of the video stream. Thus, the first version of the video stream consumes less bandwidth and may be displayed on the display device more quickly in response to the channel change request.
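One way the reduced color set mentioned above could be produced is by quantizing each color channel to a handful of levels, shrinking the data needed per frame. The quantization scheme below is an illustrative assumption, not a method specified by the patent.

```python
def reduce_color_set(frame, levels=4):
    """Quantize each 8-bit RGB channel to `levels` values.

    `frame` is a list of (r, g, b) tuples with components in 0-255. Reducing
    the number of distinct channel values yields the kind of reduced color
    set the first version of the video stream might use, consuming less
    bandwidth than the full resolution version.
    """
    step = 256 // levels

    def q(c):
        # Snap the channel value down to the nearest quantization step.
        return min((c // step) * step, 255)

    return [(q(r), q(g), q(b)) for (r, g, b) in frame]
```

With `levels=4`, each channel can take only four values, so a frame carries far fewer distinct colors and compresses more readily, which is the bandwidth trade-off the passage describes.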


The method further includes synchronizing the first version of the video stream with the second version of the video stream, as shown at 308. During a second time period after the first time period, the method includes switching from display of the first version of the video stream to display of the second version of the video stream, as shown at 310. Optionally, during a third time period, the method includes displaying a third version of the video stream, where the third version has a higher video display quality than the second version of the video stream. In a particular example, the first, second, and third versions of the video stream together comprise a portion of a progressive video stream. The progressive video stream initially presents a low quality image and then presents added resolution and higher quality images over a sequence of successive time periods.
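The tier selection of a progressive stream over successive time periods can be sketched as a simple lookup. The tier names are illustrative; the patent describes only first, second, and optional third versions of increasing quality.

```python
def version_for_period(period_index, versions=("low", "full", "enhanced")):
    """Return the video version shown during a given time period.

    Quality rises over successive periods of the progressive stream; once the
    highest available version is reached, it continues to be shown.
    """
    return versions[min(period_index, len(versions) - 1)]
```

Usage: period 0 yields the low quality version, period 1 the full quality version, and period 2 onward the optional higher quality third version.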


Referring to FIG. 4, another exemplary embodiment of a method of displaying a video stream is illustrated. The method includes detecting a channel change, at 402, and receiving a video stream to be displayed on the display device, at 404. During a first time period, a still image is displayed, as shown at 406. The still image is associated with the video stream. The still image is displayed while recovering the full motion video stream after the request for the channel change. During a second time period after the first time period, the method includes switching between display of the still image and display of the video stream as a full motion video, as shown at 408. In a particular illustrative embodiment, an audio signal is provided to the display during the first time period and the audio signal is provided with full resolution such that the video display device user hears full audio while the still image is being displayed. In another embodiment, the still image may include a title screen associated with the video stream or may include a frame derived from the video stream. The title screen may include information regarding the video stream such as the title of an episode, the title of a series, or the title of a movie that is to be displayed on the screen.
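The still-image variant of FIG. 4 amounts to repeating one image, such as a title screen or a frame derived from the stream, until full motion video is recovered, then switching. The tick-based scheduling below is an illustrative assumption; in this method, audio would play at full resolution throughout the first time period.

```python
def display_sequence(still_image, video_frames, ready_tick):
    """Build the sequence sent to the display for the FIG. 4 method.

    The still image fills the first time period (ticks before `ready_tick`,
    while the full motion video stream is being recovered), after which the
    full motion video frames are displayed.
    """
    return [still_image] * ready_tick + list(video_frames)
```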


While particular illustrative methods of displaying a video stream have been described, it should be understood that many alternate methods may also be used with the system illustrated in FIG. 1. In a particular illustrative embodiment, the first period of time, when a low resolution video stream is displayed, may be less than three seconds. In another embodiment, a full resolution audio signal is provided to the display device during the first period of time while displaying the low resolution version of the video stream, such as when displaying a still image or a degraded or reduced-bandwidth video stream as illustrated. In a particular exemplary embodiment, the first time period in which a low resolution version of a video stream is displayed occurs within 100 milliseconds after detecting a channel change request, such as a channel change request initiated by a user of the remote control device 112 shown in FIG. 1. Thus, a user surfing through channels of a display device may quickly determine the particular content of individual channels.


The methods described provide a user experience with reduced delay and allow the user to quickly determine whether to continue watching a particular channel. Thus, the disclosed method and system offer an improved video user experience.


The above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present invention. Thus, to the maximum extent allowed by law, the scope of the present invention is to be determined by the broadest permissible interpretation of the following claims and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

Claims
  • 1. A method comprising: receiving, at a media device, a video stream request to initiate a channel change operation, the video stream request indicating a requested video stream, and the requested video stream comprising a sequence of frames; and in response to the video stream request: receiving a first subset of frames of the sequence of frames; generating a first low resolution video stream comprising a first sequence of first low resolution frames by processing the first subset of frames of the sequence of frames; after receiving the first subset of frames, receiving a second subset of frames of the sequence of frames, wherein the second subset of frames is subsequent to the first subset of frames in the sequence of frames; generating a second low resolution video stream comprising a second sequence of second low resolution frames by processing the second subset of frames of the sequence of frames; after receiving the second subset of frames, receiving a third subset of frames of the sequence of frames, wherein the third subset of frames is subsequent to the second subset of frames in the sequence of frames; generating a full resolution video stream comprising a third sequence of full resolution video frames by processing the third subset of frames of the sequence of frames; during a first time period, sending the first low resolution video stream to a display device; during a second time period after the first time period, sending the second low resolution video stream to the display device; synchronizing the second low resolution video stream with the full resolution video stream; and during a third time period after the second time period, sending the full resolution video stream to the display device, wherein a first resolution of the first low resolution video stream is lower than a second resolution of the second low resolution video stream, and wherein the second resolution is lower than a third resolution of the full resolution video stream.
  • 2. The method of claim 1, wherein generating the second low resolution video stream occurs during the first time period, and wherein generating the full resolution video stream occurs during the first time period and the second time period.
  • 3. The method of claim 1, wherein generating the first low resolution video stream does not occur during the second time period and does not occur during the third time period, and wherein generating the second low resolution video stream does not occur during the third time period.
  • 4. The method of claim 1, wherein a full resolution audio signal corresponding to the requested video stream is sent to the display device during the first time period and during the second time period.
  • 5. The method of claim 1, wherein a video quality of the full resolution video stream is higher than a second video quality associated with the second low resolution video stream, and wherein the second video quality is higher than a first video quality associated with the first low resolution video stream.
  • 6. The method of claim 1, further comprising temporarily sending a blurred image to the display device while switching from the second low resolution video stream to the full resolution video stream.
  • 7. The method of claim 1, wherein the first low resolution video stream has a reduced color set with respect to the full resolution video stream.
  • 8. A system comprising: a processor; a display interface; and memory storing instructions that, when executed by the processor, cause the processor to perform operations comprising: receiving a video stream request to initiate a channel change operation, the video stream request indicating a requested video stream, and the requested video stream comprising a sequence of frames; receiving a first subset of frames of the sequence of frames; generating a first low resolution video stream comprising a first sequence of first low resolution frames by processing the first subset of frames of the sequence of frames; after receiving the first subset of frames, receiving a second subset of frames of the sequence of frames, wherein the second subset of frames is subsequent to the first subset of frames in the sequence of frames; generating a second low resolution video stream comprising a second sequence of second low resolution frames by processing the second subset of frames of the sequence of frames; after receiving the second subset of frames, receiving a third subset of frames of the sequence of frames, wherein the third subset of frames is subsequent to the second subset of frames in the sequence of frames; generating a full resolution video stream comprising a third sequence of full resolution video frames by processing the third subset of frames of the sequence of frames; during a first time period, sending the first low resolution video stream via the display interface to a display device; during a second time period after the first time period, sending the second low resolution video stream via the display interface to the display device; synchronizing the second low resolution video stream with the full resolution video stream; and during a third time period after the second time period, sending the full resolution video stream via the display interface to the display device, wherein a first resolution of the first low resolution video stream is lower than a second resolution of the second low resolution video stream, and wherein the second resolution is lower than a third resolution of the full resolution video stream.
  • 9. The system of claim 8, further comprising a user interface configured to detect the video stream request.
  • 10. The system of claim 8, wherein the first time period begins within one hundred milliseconds after receiving the video stream request.
  • 11. The system of claim 8, wherein the video stream request is received via an infrared signal from a remote control device.
  • 12. The system of claim 8, wherein the first low resolution video stream has a reduced spectral frequency with respect to the full resolution video stream.
  • 13. The system of claim 8, wherein the operations further comprise sending a blurred image to the display device during an interim period while switching from the second low resolution video stream to the full resolution video stream.
  • 14. The system of claim 8, wherein the first low resolution video stream has a reduced color set with respect to the full resolution video stream.
  • 15. The system of claim 8, wherein the operations further comprise sending a full resolution audio signal to the display device during the first time period and during the second time period.
  • 16. The system of claim 8, wherein the display device comprises a television.
  • 17. A method comprising: receiving, at a media device, a channel change request indicating a requested video stream, the requested video stream comprising a sequence of frames; receiving, at the media device, a first subset of frames of the sequence of frames; generating, at the media device, a first low resolution video stream comprising a first sequence of first low resolution frames by processing the first subset of frames of the sequence of frames; after receiving the first subset of frames, receiving a second subset of frames of the sequence of frames, wherein the second subset of frames is subsequent to the first subset of frames in the sequence of frames; generating a second low resolution video stream comprising a second sequence of second low resolution frames by processing the second subset of frames of the sequence of frames; after receiving the second subset of frames, receiving a third subset of frames of the sequence of frames, wherein the third subset of frames is subsequent to the second subset of frames in the sequence of frames; generating a full resolution video stream comprising a third sequence of full resolution video frames by processing the third subset of frames of the sequence of frames; during a first time period, sending the first low resolution video stream to a display device; during a second time period after the first time period, sending the second low resolution video stream to the display device; synchronizing the second low resolution video stream with the full resolution video stream; and during a third time period after the second time period, switching from sending the second low resolution video stream to the display device to sending the full resolution video stream, wherein a first resolution of the first low resolution video stream is lower than a second resolution of the second low resolution video stream, and wherein the second resolution is lower than a third resolution of the full resolution video stream.
  • 18. The method of claim 17, wherein a blurred image is sent to the display device during an interim period while switching from the second low resolution video stream to the full resolution video stream.
  • 19. The method of claim 17, wherein the first low resolution video stream has a reduced spectral frequency with respect to the full resolution video stream.
  • 20. The method of claim 17, wherein a full resolution audio signal is sent to an audio device coupled to the display device during the first time period and during the second time period.
CLAIM OF PRIORITY

This application is a continuation application of, and claims priority from, U.S. patent application Ser. No. 12/349,352, filed on Jan. 6, 2009, which is a continuation of U.S. patent application Ser. No. 11/005,496, filed on Dec. 6, 2004, now issued as U.S. Pat. No. 7,474,359, each of which is hereby incorporated herein by reference in its entirety.

US Referenced Citations (394)
Number Name Date Kind
4243147 Twitchell et al. Jan 1981 A
4356509 Skerlos et al. Oct 1982 A
4768926 Gilbert, Jr. Sep 1988 A
4907079 Tumer et al. Mar 1990 A
5126731 Cromer, Jr. et al. Jun 1992 A
5163340 Bender Nov 1992 A
5452023 Kim Sep 1995 A
5475835 Hickey Dec 1995 A
5493329 Ohguchi Feb 1996 A
5532748 Naimpally Jul 1996 A
5541917 Farris Jul 1996 A
5589892 Knee et al. Dec 1996 A
5592477 Farris et al. Jan 1997 A
5610916 Kostreski et al. Mar 1997 A
5613012 Hoffman et al. Mar 1997 A
5650831 Farwell Jul 1997 A
5651332 Moore et al. Jul 1997 A
5656898 Kalina Aug 1997 A
5675390 Schindler et al. Oct 1997 A
5708961 Hylton et al. Jan 1998 A
5722041 Freadman Feb 1998 A
5724106 Autry et al. Mar 1998 A
5729825 Kostreski et al. Mar 1998 A
5734853 Hendricks et al. Mar 1998 A
5774357 Hoffberg et al. Jun 1998 A
5786845 Tsuria Jul 1998 A
5793438 Bedard Aug 1998 A
5805719 Pare, Jr. Sep 1998 A
5818438 Howe et al. Oct 1998 A
5838384 Schindler et al. Nov 1998 A
5838812 Pare, Jr. et al. Nov 1998 A
5864757 Parker Jan 1999 A
5867223 Schindler et al. Feb 1999 A
5892508 Howe et al. Apr 1999 A
5900867 Schindler et al. May 1999 A
5910970 Lu Jun 1999 A
5933498 Schneck et al. Aug 1999 A
5953318 Nattkemper et al. Sep 1999 A
5956024 Strickland et al. Sep 1999 A
5956716 Kenner et al. Sep 1999 A
5970088 Chen Oct 1999 A
5987061 Chen Nov 1999 A
5990927 Hendricks et al. Nov 1999 A
5995155 Schindler et al. Nov 1999 A
5999518 Nattkemper et al. Dec 1999 A
5999563 Polley Dec 1999 A
6002722 Wu Dec 1999 A
6014184 Knee et al. Jan 2000 A
6021158 Schurr et al. Feb 2000 A
6021167 Wu Feb 2000 A
6028600 Rosin et al. Feb 2000 A
6029045 Picco et al. Feb 2000 A
6038251 Chen Mar 2000 A
6038257 Brusewitz et al. Mar 2000 A
6044107 Gatherer et al. Mar 2000 A
6052120 Nahi et al. Apr 2000 A
6055268 Timm et al. Apr 2000 A
6072483 Rosin et al. Jun 2000 A
6084584 Nahi et al. Jul 2000 A
6111582 Jenkins Aug 2000 A
6118498 Reitmeier Sep 2000 A
6122660 Baransky et al. Sep 2000 A
6124799 Parker Sep 2000 A
6137839 Mannering et al. Oct 2000 A
6166734 Nahi et al. Dec 2000 A
6181335 Hendricks et al. Jan 2001 B1
6192282 Smith et al. Feb 2001 B1
6195692 Hsu Feb 2001 B1
6215483 Zigmond Apr 2001 B1
6237022 Bruck et al. May 2001 B1
6243366 Bradley et al. Jun 2001 B1
6252588 Dawson Jun 2001 B1
6252989 Geisler et al. Jun 2001 B1
6260192 Rosin et al. Jul 2001 B1
6269394 Kenner et al. Jul 2001 B1
6275268 Ellis et al. Aug 2001 B1
6275989 Broadwin et al. Aug 2001 B1
6281813 Vierthaler et al. Aug 2001 B1
6286142 Ehreth Sep 2001 B1
6295057 Rosin et al. Sep 2001 B1
6311214 Rhoads Oct 2001 B1
6314409 Schneck et al. Nov 2001 B2
6344882 Shim et al. Feb 2002 B1
6357043 Ellis et al. Mar 2002 B1
6359636 Schindler et al. Mar 2002 B1
6363149 Candelore Mar 2002 B1
6385693 Gerszberg et al. May 2002 B1
6396480 Schindler et al. May 2002 B1
6396531 Gerszberg et al. May 2002 B1
6396544 Schindler et al. May 2002 B1
6397387 Rosin et al. May 2002 B1
6400407 Zigmond et al. Jun 2002 B1
6411307 Rosin et al. Jun 2002 B1
6414725 Clarin et al. Jul 2002 B1
6442285 Rhoads et al. Aug 2002 B2
6442549 Schneider Aug 2002 B1
6449601 Freidland et al. Sep 2002 B1
6450407 Freeman et al. Sep 2002 B1
6460075 Krueger et al. Oct 2002 B2
6463585 Hendricks et al. Oct 2002 B1
6481011 Lemmons Nov 2002 B1
6486892 Stern Nov 2002 B1
6492913 Vierthaler et al. Dec 2002 B2
6496983 Schindler et al. Dec 2002 B1
6502242 Howe et al. Dec 2002 B1
6505348 Knowles et al. Jan 2003 B1
6510519 Wasilewski et al. Jan 2003 B2
6515680 Hendricks et al. Feb 2003 B1
6516467 Schindler et al. Feb 2003 B1
6519011 Shendar Feb 2003 B1
6522769 Rhoads et al. Feb 2003 B1
6526577 Knudson et al. Feb 2003 B1
6529949 Getsin et al. Mar 2003 B1
6535590 Tidwell et al. Mar 2003 B2
6538704 Grabb et al. Mar 2003 B1
6542740 Olgaard et al. Apr 2003 B1
6557030 Hoang Apr 2003 B1
6567982 Howe et al. May 2003 B1
6587873 Nobakht et al. Jul 2003 B1
6593973 Sullivan et al. Jul 2003 B1
6598231 Basawapatna et al. Jul 2003 B1
6599199 Hapshie Jul 2003 B1
6607136 Atsmon et al. Aug 2003 B1
6609253 Swix et al. Aug 2003 B1
6611537 Edens et al. Aug 2003 B1
6614987 Ismail et al. Sep 2003 B1
6622148 Noble et al. Sep 2003 B1
6622307 Ho Sep 2003 B1
6631523 Matthews, III Oct 2003 B1
6640239 Gidwani Oct 2003 B1
6643495 Gallery et al. Nov 2003 B1
6643684 Malkin et al. Nov 2003 B1
6650761 Rodriguez et al. Nov 2003 B1
6658568 Ginter et al. Dec 2003 B1
6665453 Scheurich Dec 2003 B2
6678215 Treyz et al. Jan 2004 B1
6678733 Brown et al. Jan 2004 B1
6690392 Wugoski Feb 2004 B1
6693236 Gould et al. Feb 2004 B1
6701523 Hancock et al. Mar 2004 B1
6704931 Schaffer et al. Mar 2004 B1
6710816 Minami Mar 2004 B1
6714264 Kempisty Mar 2004 B1
6725281 Zintel et al. Apr 2004 B1
6731393 Currans et al. May 2004 B1
6732179 Brown et al. May 2004 B1
6745223 Nobakht et al. Jun 2004 B1
6745392 Basawapatna et al. Jun 2004 B1
6754206 Nattkemper et al. Jun 2004 B1
6756997 Ward, III et al. Jun 2004 B1
6760918 Rodriguez et al. Jul 2004 B2
6763226 McZeal, Jr. Jul 2004 B1
6765557 Segal et al. Jul 2004 B1
6766305 Fucarile et al. Jul 2004 B1
6769128 Knee et al. Jul 2004 B1
6771317 Ellis et al. Aug 2004 B2
6773344 Gabai et al. Aug 2004 B1
6778559 Hyakutake Aug 2004 B2
6779004 Zintel Aug 2004 B1
6781518 Hayes et al. Aug 2004 B1
6784804 Hayes et al. Aug 2004 B1
6785716 Nobakht Aug 2004 B1
6788709 Hyakutake Sep 2004 B1
6804824 Potrebic et al. Oct 2004 B1
6826775 Howe et al. Nov 2004 B1
6828993 Hendricks et al. Dec 2004 B1
6909874 Holtz et al. Jun 2005 B2
6938021 Shear et al. Aug 2005 B2
7110025 Loui et al. Sep 2006 B1
7237251 Oz et al. Jun 2007 B1
7307574 Kortum Dec 2007 B2
7310807 Pearson Dec 2007 B2
7401351 Boreczky et al. Jul 2008 B2
7436346 Walter Oct 2008 B2
7474359 Sullivan et al. Jan 2009 B2
7716714 Kortum May 2010 B2
7873102 Vleck Jan 2011 B2
7908627 Ansari Mar 2011 B2
8054849 Nadarajah Nov 2011 B2
8086261 Radpour Dec 2011 B2
8190688 Kortum May 2012 B2
8214859 Kortum Jul 2012 B2
8282476 Walter Oct 2012 B2
8390744 Sullivan et al. Mar 2013 B2
20010011261 Mullen-Schultz Aug 2001 A1
20010016945 Inoue Aug 2001 A1
20010016946 Inoue Aug 2001 A1
20010034664 Brunson Oct 2001 A1
20010044794 Nasr et al. Nov 2001 A1
20010048677 Boys Dec 2001 A1
20010049826 Wilf Dec 2001 A1
20010054008 Miller et al. Dec 2001 A1
20010054009 Miller et al. Dec 2001 A1
20010054067 Miller et al. Dec 2001 A1
20010056350 Calderone et al. Dec 2001 A1
20020001303 Boys Jan 2002 A1
20020001310 Mai et al. Jan 2002 A1
20020002496 Miller et al. Jan 2002 A1
20020003166 Miller et al. Jan 2002 A1
20020007307 Miller et al. Jan 2002 A1
20020007313 Mai et al. Jan 2002 A1
20020007485 Rodriguez et al. Jan 2002 A1
20020010639 Howey et al. Jan 2002 A1
20020010745 Schneider Jan 2002 A1
20020010935 Sitnik Jan 2002 A1
20020016736 Cannon et al. Feb 2002 A1
20020022963 Miller et al. Feb 2002 A1
20020022970 Noll et al. Feb 2002 A1
20020022992 Miller et al. Feb 2002 A1
20020022993 Miller et al. Feb 2002 A1
20020022994 Miller et al. Feb 2002 A1
20020022995 Miller et al. Feb 2002 A1
20020023959 Miller et al. Feb 2002 A1
20020026357 Miller et al. Feb 2002 A1
20020026358 Miller et al. Feb 2002 A1
20020026369 Miller et al. Feb 2002 A1
20020026475 Marmor Feb 2002 A1
20020027541 Cairns et al. Mar 2002 A1
20020029181 Miller et al. Mar 2002 A1
20020030105 Miller et al. Mar 2002 A1
20020032603 Yeiser Mar 2002 A1
20020035404 Ficco et al. Mar 2002 A1
20020040475 Yap et al. Apr 2002 A1
20020042915 Kubischta et al. Apr 2002 A1
20020046093 Miller et al. Apr 2002 A1
20020049635 Mai et al. Apr 2002 A1
20020054087 Noll et al. May 2002 A1
20020054750 Ficco et al. May 2002 A1
20020059163 Smith May 2002 A1
20020059425 Belfiore May 2002 A1
20020059599 Schein et al. May 2002 A1
20020065717 Miller et al. May 2002 A1
20020067438 Baldock Jun 2002 A1
20020069220 Tran Jun 2002 A1
20020069282 Reisman Jun 2002 A1
20020069294 Herkersdorf et al. Jun 2002 A1
20020072970 Miller et al. Jun 2002 A1
20020078442 Reyes et al. Jun 2002 A1
20020097261 Gottfurcht et al. Jul 2002 A1
20020106119 Foran et al. Aug 2002 A1
20020112239 Goldman Aug 2002 A1
20020116392 McGrath et al. Aug 2002 A1
20020120931 Huber et al. Aug 2002 A1
20020124055 Reisman Sep 2002 A1
20020128061 Blanco Sep 2002 A1
20020129094 Reisman Sep 2002 A1
20020133402 Faber et al. Sep 2002 A1
20020138840 Schein et al. Sep 2002 A1
20020152264 Yamasaki Oct 2002 A1
20020169611 Guerra et al. Nov 2002 A1
20020170063 Ansari et al. Nov 2002 A1
20020173344 Cupps et al. Nov 2002 A1
20020188955 Thompson et al. Dec 2002 A1
20020191116 Kessler et al. Dec 2002 A1
20020193997 Fitzpatrick et al. Dec 2002 A1
20020194601 Perkes et al. Dec 2002 A1
20020198874 Nasr et al. Dec 2002 A1
20030005445 Schein et al. Jan 2003 A1
20030009771 Chang Jan 2003 A1
20030012365 Goodman Jan 2003 A1
20030014750 Kamen et al. Jan 2003 A1
20030016304 Norsworthy et al. Jan 2003 A1
20030018975 Stone Jan 2003 A1
20030023435 Josephson Jan 2003 A1
20030023440 Chu Jan 2003 A1
20030028890 Swart et al. Feb 2003 A1
20030033416 Schwartz Feb 2003 A1
20030043915 Costa et al. Mar 2003 A1
20030046091 Arneson et al. Mar 2003 A1
20030046689 Gaos Mar 2003 A1
20030056223 Costa et al. Mar 2003 A1
20030058277 Bowman-Amuah Mar 2003 A1
20030061611 Pendakur et al. Mar 2003 A1
20030071792 Safadi Apr 2003 A1
20030093793 Gutta May 2003 A1
20030100340 Cupps et al. May 2003 A1
20030110161 Schneider Jun 2003 A1
20030110503 Perkes Jun 2003 A1
20030126136 Omoigui Jul 2003 A1
20030135771 Cupps et al. Jul 2003 A1
20030141987 Hayes Jul 2003 A1
20030145321 Bates et al. Jul 2003 A1
20030149989 Hunter et al. Aug 2003 A1
20030153353 Cupps et al. Aug 2003 A1
20030153354 Cupps et al. Aug 2003 A1
20030159026 Cupps et al. Aug 2003 A1
20030160830 DeGross Aug 2003 A1
20030163601 Cupps et al. Aug 2003 A1
20030163666 Cupps et al. Aug 2003 A1
20030172380 Kikinis Sep 2003 A1
20030182237 Costa et al. Sep 2003 A1
20030182420 Jones et al. Sep 2003 A1
20030185232 Moore Oct 2003 A1
20030187641 Moore et al. Oct 2003 A1
20030187646 Smyers et al. Oct 2003 A1
20030187800 Moore et al. Oct 2003 A1
20030189509 Hayes et al. Oct 2003 A1
20030189589 LeBlanc et al. Oct 2003 A1
20030194141 Kortum et al. Oct 2003 A1
20030194142 Kortum et al. Oct 2003 A1
20030208396 Miller Nov 2003 A1
20030208758 Schein et al. Nov 2003 A1
20030226044 Cupps et al. Dec 2003 A1
20030226145 Marsh Dec 2003 A1
20030229900 Reisman Dec 2003 A1
20040003041 Moore et al. Jan 2004 A1
20040003403 Marsh Jan 2004 A1
20040006769 Ansari et al. Jan 2004 A1
20040006772 Ansari et al. Jan 2004 A1
20040010602 Van Vleck et al. Jan 2004 A1
20040015997 Ansari et al. Jan 2004 A1
20040030750 Moore et al. Feb 2004 A1
20040031058 Reisman Feb 2004 A1
20040031856 Atsmon et al. Feb 2004 A1
20040034877 Nogues Feb 2004 A1
20040049728 Langford Mar 2004 A1
20040064351 Mikurak Apr 2004 A1
20040068740 Fukuda et al. Apr 2004 A1
20040068753 Robertson et al. Apr 2004 A1
20040070491 Huang et al. Apr 2004 A1
20040073918 Ferman et al. Apr 2004 A1
20040098571 Falcon May 2004 A1
20040107125 Guheen et al. Jun 2004 A1
20040107439 Hassell Jun 2004 A1
20040111745 Schein et al. Jun 2004 A1
20040111756 Stuckman et al. Jun 2004 A1
20040117813 Karaoguz et al. Jun 2004 A1
20040117824 Karaoguz Jun 2004 A1
20040128342 Maes et al. Jul 2004 A1
20040139173 Karaoguz et al. Jul 2004 A1
20040143600 Musgrove et al. Jul 2004 A1
20040143652 Grannan et al. Jul 2004 A1
20040148408 Nadarajah Jul 2004 A1
20040150676 Gottfurcht et al. Aug 2004 A1
20040183839 Gottfurcht et al. Sep 2004 A1
20040194136 Finseth et al. Sep 2004 A1
20040198386 Dupray Oct 2004 A1
20040201600 Kakivaya et al. Oct 2004 A1
20040210633 Brown et al. Oct 2004 A1
20040210935 Schein et al. Oct 2004 A1
20040213271 Lovy et al. Oct 2004 A1
20040221302 Ansari et al. Nov 2004 A1
20040223485 Arellano et al. Nov 2004 A1
20040226035 Hauser, Jr. Nov 2004 A1
20040226045 Nadarajah Nov 2004 A1
20040239624 Ramian Dec 2004 A1
20040252119 Hunleth Dec 2004 A1
20040252120 Hunleth et al. Dec 2004 A1
20040252769 Costa et al. Dec 2004 A1
20040252770 Costa et al. Dec 2004 A1
20040260407 Wimsatt Dec 2004 A1
20040261116 McKeown et al. Dec 2004 A1
20040267729 Swaminathan et al. Dec 2004 A1
20040268393 Hunleth et al. Dec 2004 A1
20050027851 McKeown et al. Feb 2005 A1
20050038814 Iyengar et al. Feb 2005 A1
20050044280 Reisman Feb 2005 A1
20050097612 Pearson et al. May 2005 A1
20050132295 Noll et al. Jun 2005 A1
20050149988 Grannan Jul 2005 A1
20050195961 Pasquale et al. Sep 2005 A1
20060026663 Kortum Feb 2006 A1
20060037043 Kortum Feb 2006 A1
20060037083 Kortum Feb 2006 A1
20060048178 Kortum Mar 2006 A1
20060077921 Radpour Apr 2006 A1
20060114360 Kortum Jun 2006 A1
20060117374 Kortum Jun 2006 A1
20060156372 Cansler, Jr. Jul 2006 A1
20060161953 Walter Jul 2006 A1
20060168610 Williams Jul 2006 A1
20060174279 Sullivan Aug 2006 A1
20060174309 Pearson Aug 2006 A1
20060179466 Pearson Aug 2006 A1
20060179468 Pearson Aug 2006 A1
20060184991 Schlamp Aug 2006 A1
20060184992 Kortum Aug 2006 A1
20060190402 Patron Aug 2006 A1
20060218590 White Sep 2006 A1
20060230421 Pierce Oct 2006 A1
20060236343 Chang Oct 2006 A1
20060268917 Nadarajah Nov 2006 A1
20060282785 McCarthy Dec 2006 A1
20060290814 Walter Dec 2006 A1
20060294559 Ansari Dec 2006 A1
20060294561 Grannan Dec 2006 A1
20060294568 Walter Dec 2006 A1
20070011133 Chang Jan 2007 A1
20070011250 Kortum Jan 2007 A1
20070021211 Walter Jan 2007 A1
20070025449 Vleck Feb 2007 A1
20070098079 Boyce et al. May 2007 A1
20070211800 Shi et al. Sep 2007 A1
20090073321 Sullivan et al. Mar 2009 A1
Foreign Referenced Citations (15)
Number Date Country
1176831 Jan 2002 EP
WO9963759 Dec 1999 WO
WO0028689 May 2000 WO
WO0160066 Aug 2001 WO
WO0217627 Feb 2002 WO
WO02058382 Jul 2002 WO
WO03003710 Jan 2003 WO
WO03025726 Mar 2003 WO
WO03063507 Jul 2003 WO
WO2004018060 Mar 2004 WO
WO2004032514 Apr 2004 WO
WO2004062279 Jul 2004 WO
WO2004066706 Aug 2004 WO
WO2005045554 May 2005 WO
WO2006062708 Jun 2006 WO
Non-Patent Literature Citations (20)
Entry
“Final Office Action for U.S. Appl. No. 11/005,496”, United States Patent and Trademark Office (USPTO), mail date Oct. 31, 2007, 26 pages.
“Final Office Action for U.S. Appl. No. 12/275,384”, United States Patent and Trademark Office (USPTO), mail date Oct. 13, 2011, 16 pages.
“Final Office Action for U.S. Appl. No. 12/275,384”, United States Patent and Trademark Office (USPTO), mail date Dec. 9, 2010, 17 pages.
“Final Office Action for U.S. Appl. No. 12/349,352”, United States Patent and Trademark Office (USPTO), mail date Sep. 16, 2010, 19 pages.
“Final Office Action for U.S. Appl. No. 12/349,352”, United States Patent and Trademark Office (USPTO), mail date Apr. 19, 2012, 10 pages.
“International Search Report and Written Opinion for International Application No. PCT/US2005/41477”, International Searching Authority, mail date Sep. 12, 2007.
“Non-Final Office Action for U.S. Appl. No. 11/005,496”, United States Patent and Trademark Office (USPTO), mail date Apr. 14, 2008, 9 pages.
“Non-Final Office Action for U.S. Appl. No. 11/005,496”, United States Patent and Trademark Office (USPTO), mail date May 14, 2007, 9 pages.
“Non-Final Office Action for U.S. Appl. No. 12/275,384”, United States Patent and Trademark Office (USPTO), mail date Apr. 14, 2011, 14 pages.
“Non-Final Office Action for U.S. Appl. No. 12/275,384”, United States Patent and Trademark Office (USPTO), mail date Jul. 2, 2010, 11 pages.
“Non-Final Office Action for U.S. Appl. No. 12/275,384”, United States Patent and Trademark Office (USPTO), mail date Nov. 2, 2012, 15 pages.
“Non-Final Office Action for U.S. Appl. No. 12/349,352”, United States Patent and Trademark Office (USPTO), mail date Apr. 7, 2010, 30 pages.
“Non-Final Office Action for U.S. Appl. No. 12/349,352”, United States Patent and Trademark Office (USPTO), mail date Nov. 14, 2011, 24 pages.
“Non-Final Office Action for U.S. Appl. No. 12/349,352”, United States Patent and Trademark Office (USPTO), mail date Jan. 5, 2011, 10 pages.
“Notice of Allowance and Fee(s) Due for U.S. Appl. No. 12/349,352”, United States Patent and Trademark Office (USPTO), mail date Nov. 6, 2012, 5 pages.
“Notice of Allowance and Fee(s) Due for U.S. Appl. No. 12/349,352”, United States Patent and Trademark Office (USPTO), mail date May 11, 2011, 10 pages.
“Notice of Allowance and Fees Due for U.S. Appl. No. 11/005,496”, United States Patent and Trademark Office (USPTO), mail date Oct. 30, 2008, 10 pages.
“Restriction Requirement for U.S. Appl. No. 12/275,384”, United States Patent and Trademark Office (USPTO), mail date May 21, 2010, 6 pages.
“Supplemental European Search Report for corresponding EP Application No. 05825578.7”, European Patent Office, mail date Aug. 12, 2009, 8 pages.
Kapinos, S., “Accenda Universal Remote Control Targets Needs of Elderly, Visually Impaired, Physically Challenged . . . and the Rest of Us”, Innotech Systems, Inc., Press Release, Port Jefferson, NY, Dec. 15, 2002.
Related Publications (1)
Number Date Country
20130148023 A1 Jun 2013 US
Continuations (2)
Number Date Country
Parent 12349352 Jan 2009 US
Child 13759187 US
Parent 11005496 Dec 2004 US
Child 12349352 US