Multi-video receiving method and apparatus

Abstract
A method and apparatus are disclosed for receiving a plurality of video signals using a single video signal transmitted over a point-to-multipoint connection; the plurality of video signals originate from various locations of an event. Information received using a reception unit comprises at least one of the plurality of video signals, at least one of a plurality of sound track signals related to the at least one of the plurality of video signals, data transmitted using the vertical blanking interval (VBI) of the single video signal, and graphics.
Description
FIELD OF THE INVENTION

This invention relates to a multi-video program receiving method and apparatus, and more precisely, to the display of and access to more than one video program related to a live event.


BACKGROUND OF THE INVENTION

Attending a live event is usually an enjoyable experience, as all our senses are engaged by various sensations. Attending is therefore more enjoyable than following said event remotely via a television program. Unfortunately, it is usually difficult to have more than one view of the event when attending it. This is particularly true during Formula 1 Grand Prix competitions, where a viewer is usually seated at a particular point of the circuit. In that case, the viewer cannot have access to other parts of the circuit unless, for instance, a large screen is available. This may prevent the viewer from enjoying an action that is taking place at another location of the circuit unavailable to him.


Various solutions have been implemented in order to enable the viewer at the live event to access more information about the event.


For instance, at some major car racing events, such as CART/NASCAR races, radio scanners are available. Such radio scanners enable the live event viewer to scan and listen to conversations between a driver and his crew. Unfortunately, the device lacks excitement and entertainment because it offers only sound, making it difficult for the user to understand who he is listening to; it may therefore be more disruptive than enhancing.


Mini portable television sets have been developed and are already available on the market. Via such mini portable television sets, the live event user can watch the race to the extent that it is broadcast; in most cases, however, live events are broadcast on cable TV and are therefore inaccessible to these devices. These mini portable television sets thus have limited applications and do not offer the excitement or added value that would motivate users to adopt them widely.


Wireless handheld devices based on web technology have also been developed. These devices run on PDAs and are linked to the web. Such devices provide VIP guests and the general audience with live timing and scoring, radio feeds and background information on team cars, car drivers, golf players, etc. They integrate rich multimedia and interactive interfaces. Unfortunately, these devices suffer from many drawbacks. A first drawback is the fact that the technology is usually implemented using a point-to-point connection; such a point-to-point connection does not allow massive use of the device at a live event, as a large frequency bandwidth would be necessary to serve a large number of users. Such devices have limited effectiveness when delivering video content due to the limited bandwidth available. Moreover, data compression schemes are mandatory in order to cope with the limited bandwidth. It is difficult, in these conditions, to achieve real-time transmission of video data streams. These devices are also very expensive because they are based on an expensive hardware architecture.


There is a need for a method and apparatus to overcome the above-mentioned drawbacks.


SUMMARY OF THE INVENTION

It is an object of the invention to provide a method for broadcasting more than one video program and data to a plurality of reception units;


It is another object of the invention to provide an apparatus for broadcasting more than one video program and data to a plurality of reception units using a point-to-multipoint connection;


It is another object of the invention to provide a method for receiving more than one video program and data transmitted using a wireless point-to-multipoint connection;


It is another object of the invention to provide an apparatus for receiving more than one video program and data transmitted using a wireless point-to-multipoint connection;


According to the above objects, from a broad aspect, the present invention provides a method for providing a plurality of video programs to a plurality of video program receivers having a first resolution using a single frequency band enabling transmission of a video program having a second resolution substantially higher than the first resolution, the method comprising the steps of receiving a plurality of video programs from a plurality of video program sources, the plurality of video programs having a resolution substantially equal to the second resolution; formatting each of the plurality of video programs received from the plurality of video program sources to have a resolution substantially equal to the first resolution; inserting each of the formatted video programs at a specific location in a main video signal having a resolution substantially equal to the second resolution and transmitting the main video signal to each of the plurality of video program receivers.


According to another broad aspect of the invention, there is provided an apparatus for broadcasting at least one of a plurality of video program signals inserted in a main video signal to a plurality of receiving units, each receiving unit having a first resolution, the main signal having a second resolution substantially higher than the first resolution, the apparatus comprising a formatting unit receiving the at least one of a plurality of video programs signals, the formatting unit formatting the at least one of a plurality of video program signals to have a resolution substantially equal to a resolution of a receiving unit, the resolution of the receiving unit being substantially lower than the resolution of the main video signal, a combining unit, receiving the formatted at least one of a plurality of video signals and incorporating each of the formatted at least one of a plurality of video signals into a specific location in the main video signal and a transmitter, receiving the main video signal and transmitting the main video signal to the plurality of receiving units.


According to another broad aspect of the invention, there is provided an apparatus for displaying at least one of a plurality of video program signals inserted in a main video signal, the at least one of a plurality of video program signals having a first resolution, the main signal having a second resolution substantially higher than the first resolution, the apparatus comprising a receiving unit receiving the main video signal, a user interface for enabling a user to select at least one of the plurality of video programs contained in said main video signal for a display, the user interface providing a command signal representative of the selected at least one of the plurality of video programs to display, a display screen, receiving the selected video signal for display and a processing unit receiving the main video signal and selecting at least one part of the main video signal provided by the receiving unit according to said command signal provided by the user interface, the selected one of the plurality of video programs to display being displayed on said display screen.





BRIEF DESCRIPTION OF THE DRAWINGS

These and other features, aspects and advantages of the present invention will become more clearly understood with the following description and accompanying drawings, wherein:



FIG. 1 is a diagram that shows a transmitting unit and a plurality of reception units;



FIG. 2 is a diagram that shows a perspective view of a reception unit; the reception unit comprises a plurality of buttons for enabling a user to control it;



FIG. 3 is a diagram that shows one image of a main video signal; the image comprises four different images originating from different video signals;



FIG. 4 is a diagram that shows how four sound track signals are embedded in a single main video signal;



FIG. 5 is a flow chart that shows how various video program signals are transmitted to a plurality of reception units;



FIG. 6 shows a block diagram of the preferred embodiments of a reception unit, the reception unit comprises a tunable receiver, a video digitizer, a Field Programmable Gate Array (FPGA), a frame buffer, a graphics storage unit, a microprocessor, a user interface, an audio unit, a display unit and an input/output interface;



FIG. 7 is a flow chart that shows how the reception unit operates in the preferred embodiment of the invention;





DETAILED DESCRIPTION OF DIFFERENT EMBODIMENTS

Now referring to FIG. 1, there is shown the preferred embodiment of the invention. The system comprises a broadcasting unit 17, a plurality of sound sources, a plurality of video sources, a data source 24, and a plurality of reception units 28.


The broadcasting unit 17 comprises a sound formatting unit 18, a video formatting unit 20, a combining unit 22, and a transmitting unit 26.


The sound formatting unit 18 receives a plurality of sound track signals from the plurality of sound sources. The sound formatting unit 18 provides a plurality of formatted soundtrack signals. In the preferred embodiment of the invention, the formatted soundtrack signals have a specific bandwidth as explained below.


The video formatting unit 20 receives a plurality of video signals from a plurality of video sources. The video formatting unit 20 provides a plurality of formatted video signals. In the preferred embodiment of the invention, the formatted video signals have a pixel resolution of 360 by 240.


The combining unit 22 receives the plurality of formatted soundtrack signals, the plurality of formatted video signals, and a data signal provided by the data source 24. The combining unit 22 combines the plurality of formatted soundtrack signals, the plurality of formatted video signals, and the data signal into a single video signal. In the preferred embodiment of the invention, the combined video signal is an NTSC-compatible video signal. Still in the preferred embodiment of the invention, four sound sources 10 and four video sources are used in order to create the combined video signal. In another embodiment of the invention, more than one video source and more than one sound source may be used. The transmitting unit 26 receives the combined video signal and, in the preferred embodiment of the invention, transmits it. The transmitting unit 26 operates under the North American NTSC broadcast standard, which includes the luminance carrier, the chrominance carrier, and the audio carrier. The total bandwidth of the transmitted signal does not exceed 6 MHz. Still in the preferred embodiment of the invention, the transmitting unit 26 is capable of covering a range of 3 km, which is sufficient to suit the needs of the application. In the case of an event requiring a larger coverage, a more powerful transmitting unit 26 may be used.


A plurality of reception units 28 receive a transmitted signal transmitted using the transmitting unit 26. As explained below, each of the reception units 28 is able to receive at least one of the plurality of video sources and at least one of the corresponding sound sources.


Now referring to FIG. 2, there is shown a reception unit 28. The reception unit 28 comprises a display screen, which is an LCD display screen 96. Still in the preferred embodiment of the invention, the display screen 96 is visible in daylight and at night using front-light TFT active matrix technology. The display measures 3.8 inches in diagonal. The display screen 96 has a pixel resolution of 320 by 240. The aspect ratio of the display screen is the North American TV standard 4:3. It will be appreciated by someone skilled in the art that other television standards may be used.


The reception unit 28 comprises a user interface having a plurality of buttons. In the preferred embodiment of the invention, the user interface comprises a power on/off button 30, a back button 32, a graphics menu button 34, a volume up button 36, a volume down button 38, a left button 40, a right button 42, an up button 46, a down button 44 and an enter button 48.


The power on/off button 30 enables the user to switch the reception unit 28 on and off. The volume up button 36 enables the user to increase the output volume of the reception unit 28. The volume down button 38 enables the user to decrease the output volume of the reception unit 28. The menu button 34 enables the user to access graphics. The back button 32 enables the user to return to the screen displayed prior to the last command. The left button 40, the right button 42, the down button 44, and the up button 46 enable the user to navigate in a plurality of menus displayed on the display screen 96. The enter button 48 enables the user to confirm a selection in a menu displayed on the display screen of the reception unit 28.


Now referring to FIG. 3, there is shown a diagram which presents a single image of the combined video signal generated by the combining unit 22. The single image shown in this figure comprises four sub-images, respectively sub-image 80, sub-image 82, sub-image 84 and sub-image 86. Sub-image 80 refers to an image originating from the first video source. Sub-image 82 refers to an image originating from the second video source. Sub-image 84 refers to an image originating from the third video source. Sub-image 86 refers to an image originating from the fourth video source.


In the preferred embodiment of the invention, the pixel resolution of the sub-image 80, the sub-image 82, the sub-image 84 and the sub-image 86 is 320 by 240.
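The tiling of the sub-images into a single combined image can be sketched as follows. This is an illustrative sketch only: the combining unit 22 is a signal-processing circuit, not software, and the quadrant placement and grayscale list-of-rows representation used here are assumptions for illustration. The sub-image dimensions are taken from the sub-images themselves, so the sketch applies regardless of the exact resolution used.

```python
def combine_frames(sub_images):
    """Tile four equally sized sub-images into one combined frame.

    sub_images: list of four images, each a list of pixel rows, given in
    quadrant order (top-left, top-right, bottom-left, bottom-right).
    Returns a frame twice as tall and twice as wide as one sub-image.
    """
    h = len(sub_images[0])        # sub-image height in rows
    w = len(sub_images[0][0])     # sub-image width in pixels
    frame = [[0] * (2 * w) for _ in range(2 * h)]
    # Top-left corner of each quadrant in the combined frame.
    positions = [(0, 0), (0, w), (h, 0), (h, w)]
    for img, (r0, c0) in zip(sub_images, positions):
        for r in range(h):
            for c in range(w):
                frame[r0 + r][c0 + c] = img[r][c]
    return frame
```

A reception unit then needs only to crop the quadrant corresponding to the selected video program out of the combined frame.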


Now referring to FIG. 4, there is shown how audio is formatted in the preferred embodiment of the invention.


In the preferred embodiment of the invention, the first sound track signal is stored in the main audio channel portion of the combined video signal, which is referred to as the L+R portion of the combined video signal. Still in the preferred embodiment of the invention, the second sound track signal is stored in the L−R portion of the combined video signal; the third sound track signal is stored in the SAP portion of the combined video signal and the fourth sound track signal is stored in the cue channel portion of the combined video signal. It will be appreciated by someone skilled in the art that alternative storing schemes may be used depending on the transmission standard used. Furthermore, it will be appreciated that the first sound track signal, the second sound track signal, the third sound track signal and the fourth sound track signal may be compressed using a CODEC prior to being inserted in the combined video signal. Someone skilled in the art will appreciate that compression may provide a higher sound quality.
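The storing scheme above can be expressed compactly as a lookup from sound track number to the audio subcarrier portion that carries it. The table and function names below are hypothetical, introduced only to summarize the mapping described in FIG. 4.

```python
# Mapping of the four sound track signals onto the audio portions of the
# combined NTSC signal, per the storing scheme described above.
AUDIO_CHANNEL_MAP = {
    1: "L+R",  # main audio channel portion
    2: "L-R",  # stereo difference portion
    3: "SAP",  # second audio program portion
    4: "cue",  # cue channel portion
}

def channel_for_track(track_number):
    """Return the audio portion of the combined signal carrying a track."""
    return AUDIO_CHANNEL_MAP[track_number]
```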


Now referring to FIG. 5, there is shown a flow chart which shows the operation of the broadcasting unit 17. According to step 50, a first video signal is provided by a first video source. According to step 52, a first soundtrack signal is provided by a first sound source. According to step 54, a second video signal is provided by a second video source. According to step 56, a second soundtrack signal is provided by a second sound source. According to step 58, a third video signal is provided by a third video source. According to step 60, a third soundtrack signal is provided by a third sound source. According to step 62, a fourth video signal is provided by a fourth video source. According to step 64, a fourth soundtrack signal is provided by a fourth sound source. According to step 66, the first video signal provided in step 50, the second video signal provided in step 54, the third video signal provided in step 58 and the fourth video signal provided in step 62 are formatted into a suitable format by the video formatting unit 20 as explained below. According to step 68, the first soundtrack signal provided in step 52, the second soundtrack signal provided in step 56, the third soundtrack signal provided in step 60, and the fourth soundtrack signal provided in step 64 are formatted by the sound formatting unit 18 into a suitable format as explained below. According to step 70, the combining is performed using the plurality of formatted soundtrack signals generated according to step 68, the plurality of formatted video signals generated according to step 66, and a data signal provided according to step 48. According to step 72, the combined signal generated at step 70 is transmitted using the transmitting unit 26.


In an alternative embodiment of the invention, at least one graphics source may be used instead of at least one video source.


Now referring to FIG. 6, there is shown a block diagram of the reception unit 28 in the preferred embodiment of the invention. In the preferred embodiment of the invention, the reception unit 28 comprises a tunable receiver 80, a video digitizer 82, a Field Programmable Gate Array (FPGA) 84, a frame buffer 86, a graphics storage unit 88, a microprocessor 90, an audio unit 92, a display unit 102, a user interface 98, and an input/output interface 100. The display unit 102 comprises a LCD driving circuit 94 and a LCD screen 96.


In the preferred embodiment of the invention, the LCD screen 96 is a Sanyo TM038QV-67A02A; the FPGA 84 is a Xilinx Spartan II. Still in the preferred embodiment of the invention, the microprocessor 90 is a Microchip PIC16F877. The tunable receiver 80 is a Samsung TCPN9081DA10C. The video digitizer 82 is a Philips SAA7111. The frame buffer 86 is an ISSI IS62LV12816L. The graphics storage unit 88 is an AMD 29LV640D.


The combined video signal transmitted using the transmitting unit 26 is received by an antenna of the tunable receiver 80. The microprocessor 90 sends a selected channel signal to the tunable receiver 80. The tunable receiver 80 extracts a video signal according to the selected channel signal. The video digitizer 82 digitizes the video signal received from the tunable receiver 80. The FPGA 84 receives the digitized video signal provided by the video digitizer 82 and stores at least one frame of the digitized video signal into the frame buffer 86; the digitized video signal is stored into the frame buffer 86 at a rate of 30 frames per second in the preferred embodiment of the invention. The FPGA 84 further extracts the embedded data comprised in the Vertical Blanking Interval (VBI) portion of the digitized video signal and stores the data in the FPGA 84.
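The FPGA's separation of VBI data from picture content can be sketched as follows. This is a minimal sketch, assuming a digitized field represented as a list of scan lines in which the first lines form the vertical blanking interval and the remainder carry active video; the line count used is illustrative, not a specification of the FPGA 84.

```python
# Number of scan lines assumed to belong to the vertical blanking
# interval (VBI) at the top of a digitized field (illustrative value).
VBI_LINES = 21

def split_field(field_lines):
    """Separate embedded VBI data lines from active video lines.

    field_lines: list of scan lines of one digitized field.
    Returns (vbi, active): the data-carrying lines retained for menus
    and the picture lines destined for the frame buffer.
    """
    vbi = field_lines[:VBI_LINES]
    active = field_lines[VBI_LINES:]
    return vbi, active
```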


The microprocessor 90 controls the audio unit 92. In the preferred embodiment of the invention, the audio unit 92 comprises an audio amplifier receiving an audio signal from the tunable receiver 80 according to a selected portion of the video signal transmitted in accordance with the storing scheme presented in FIG. 4. The user may provide information to the microprocessor 90 using the user interface 98. An input/output interface 100 is connected to said microprocessor 90 in order to upload/download information from a remote device. Information downloaded from a remote device may comprise graphics data which, after being collected by said microprocessor 90 via said input/output interface 100, are provided to the graphics storage unit 88 via the FPGA 84. The FPGA 84 provides a video signal to display to the LCD driving circuit 94 of the display unit 102. The LCD driving circuit 94 provides a final signal to the LCD screen 96.


Now referring to FIG. 7, there is shown the operation of the reception unit 28. According to step 110, a configuration set up is performed. The configuration set up 110 comprises the providing of a selected channel signal to the tunable receiver 80. In the preferred embodiment of the invention, the reception unit 28 receives a single channel signal at a time; in an alternative embodiment, the reception unit 28 may receive, simultaneously, a plurality of channel signals using a plurality of tunable receivers 80. Still according to the configuration set up, the display unit 102 is carefully tuned by the microprocessor 90 via the FPGA 84. According to step 112, the user performs a mode selection. If the user selects the video mode, a first channel is selected according to step 114. According to step 116, video is processed by the FPGA 84. More precisely, the FPGA 84 retrieves a frame from the frame buffer 86. The FPGA 84 descrambles the retrieved frame and provides the video signal to display to the LCD driving circuit 94. The tunable receiver 80 extracts the audio signal corresponding to the selected video signal from the corresponding portion of the transmitted video signal. The audio unit 92 provides an amplified audio signal, which may be listened to by the user using a headset, for instance. The amplified audio signal and the displayed video signal are provided to the user according to step 118.


If the user selects the graphics mode using the menu button 34, according to step 120, a menu is provided to the user. The menu is created using data extracted from the Vertical Blanking Interval (VBI) portion of the digitized video signal and stored in the FPGA 84, and/or using graphics data stored in the graphics storage unit 88. In the preferred embodiment, graphics are stored using the BMP file format.


The microprocessor 90 controls the FPGA 84 to generate a video signal to be displayed, which is then provided to the LCD driving circuit 94; the video signal to be displayed comprises the above-mentioned data provided by the FPGA 84 combined with graphics data stored in the graphics storage unit 88. According to step 122, the user selects, via the user interface 98, new data to obtain. In one application of the invention, the data may be the ranking of a participant in the live event or a summary of a part of the event. According to step 124, the microprocessor 90 controls the FPGA 84 to retrieve the data, which is stored in the FPGA 84. According to step 126, the data are provided to the user as explained previously. In the preferred embodiment of the invention, the video signal to be displayed comprises the data as well as graphics data retrieved from the graphics storage unit 88 by the FPGA 84. The video signal to be displayed is therefore created by the FPGA 84.


Now referring back to FIG. 1, the sound formatting unit 18 converts the first soundtrack signal, the second soundtrack signal, the third soundtrack signal and the fourth soundtrack signal into formatted soundtrack signals that will be combined and modulated to create a combined video signal having the first soundtrack signal, the second soundtrack signal, the third soundtrack signal and the fourth soundtrack signal modulated according to the scheme displayed in FIG. 4. Still referring to FIG. 1, the video formatting unit 20 comprises four digitizers receiving respectively the first video signal, the second video signal, the third video signal and the fourth video signal and providing a first digitized video signal, a second digitized video signal, a third digitized video signal and a fourth digitized video signal. The video formatting unit 20 further comprises four interpolating units respectively connected to the first digitizer, the second digitizer, the third digitizer and the fourth digitizer. The first interpolating unit receives the first digitized video signal, the second interpolating unit receives the second digitized video signal, the third interpolating unit receives the third digitized video signal and the fourth interpolating unit receives the fourth digitized video signal. Each interpolating unit discards one pixel out of two horizontally and one pixel out of two vertically. The first interpolating unit therefore provides an interpolated image having one quarter of the original pixel count, the original being the first digitized image provided to the first interpolating unit. The second interpolating unit, the third interpolating unit and the fourth interpolating unit operate in the same manner as the first interpolating unit. The combining unit 22 receives the formatted video signals and the formatted soundtrack signals and provides a combined video signal.
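The discard step performed by each interpolating unit can be sketched as a 2:1 decimation in each direction. This is an illustrative sketch of the subsampling only; a practical formatting unit may also filter before subsampling to limit aliasing, which is not shown here.

```python
def decimate(image):
    """Keep one pixel out of two horizontally and one out of two
    vertically, yielding an image with one quarter of the original
    pixel count.

    image: list of pixel rows.
    """
    # Take every second row, and every second pixel within each row.
    return [row[::2] for row in image[::2]]
```

Applied to a 720-by-480 digitized image, this yields the 360-by-240 format used for the sub-images of the combined signal.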


In order to protect the network against eavesdropping and uncontrolled use, the combining unit 22 further comprises a scrambling unit which scrambles the combined video signal prior to providing it to the transmitting unit 26; this scrambling unit is not shown in FIG. 1.


In the preferred embodiment of the invention, the scrambling is performed by switching vertical lines two by two. The FPGA comprised in the reception unit is used to descramble the transmitted video signal. Someone skilled in the art will appreciate that other scrambling schemes may be used. A code related to said scrambling is stored in a code storing unit connected to said FPGA.


Alternatively, a single sound track signal is provided by a single sound source to the sound formatting unit 18, and a single video signal is provided by a single video source to the video formatting unit 20. In such an embodiment, the transmitted video signal comprises the single sound track signal and the single video signal. However, in such an embodiment, an encoding is performed prior to the transmitting in order to avoid unauthorized reception of the transmitted signal. The signal may be transmitted using analog or digital techniques. It will be appreciated that such an alternative embodiment is particularly well suited for an event requiring a small geographic coverage.

Claims
  • 1. A method for providing a plurality of video programs to a plurality of portable video program receivers at a venue hosting a live event, comprising: receiving a plurality of video streams derived from video cameras capturing images of the live event;reducing image resolution of the plurality of video streams to produce multiple resolution reduced video streams;accessing data content that includes menu data and computer generated graphics related to the live event, the menu data for use by the portable program receivers to create and display a menu, the menu includes a list of options to make selections, one or more of the options is to access and display the computer generated graphics related to the live event, the computer generated graphics are not video data captured from a video camera, the data content is independent of the video streams;combining the multiple resolution reduced video streams and data content into a combined signal that includes the multiple resolution reduced video streams, the menu data and the computer generated graphics combined such that the plurality of portable video program receivers can extract one of the video streams and the plurality of portable video program receivers can extract the data content; andgenerating a wireless RF signal to wirelessly convey the combined signal to the plurality of portable video program receivers.
  • 2. A method as defined in claim 1, including receiving a plurality of sound tracks associated with respective video streams and wirelessly conveying audio information derived from the sound tracks to the plurality of portable video program receivers.
  • 3. A method as defined in claim 2, including receiving a data signal from a data source, the data content is derived from the data signal.
  • 4. A method as defined in claim 3, wherein the live event is a sporting event, the data content conveying information about ranking of a participant in the live event.
  • 5. A method as defined in claim 4, wherein the live event is a motor sports event.
  • 6. A method as defined in claim 1, wherein each video stream has an image resolution of 720 pixels by 480 pixels before said reducing.
  • 7. A method as defined in claim 6, wherein each resolution reduced video stream has a resolution of 360 pixels by 240 pixels.
  • 8. A method as defined in claim 3, wherein the wireless RF signal is digital.
  • 9. A method as defined in claim 8, wherein the wireless RF signal is contained within a single frequency band.
  • 10. A method as defined in claim 9, wherein the single frequency band has a bandwidth of about 6 MHz.
  • 11. An apparatus for broadcasting multiple video programs to a plurality of portable video program receivers at a venue hosting a live event, said apparatus comprising: a first input interface that receives a plurality of video streams derived from video cameras capturing images of the live event;a second input interface that receives a data signal from a data source, the data signal includes menu data and computer generated graphics related to the live event, the menu data for use by the portable program receivers to create and display a menu, the menu includes a list of options to make selections, the computer generated graphics are not video data captured from a video camera, the data content is independent of the video streams; anda transmitter that generates and transmits a wireless RF signal to the plurality of portable video program receivers, the wireless RF signal includes the plurality of video streams, the menu data and the computer generated graphics in a common wireless RF signal such that the plurality of portable video program receivers can extract one of the video streams to display and the plurality of portable video program receivers can extract the menu data to create a menu based on the menu data.
  • 12. An apparatus as defined in claim 11, wherein the wireless RF signal is digital.
  • 13. An apparatus as defined in claim 12, wherein the wireless RF signal is contained within a single frequency band.
  • 14. An apparatus as defined in claim 13, wherein the single frequency band has a bandwidth of about 6 MHz.
  • 15. A method as recited in claim 1, further comprising: receiving the wireless RF signal at a portable video program receiver of the plurality of portable video program receivers;automatically identifying the data content in the RF signal; andstoring the data content in a buffer to be used when in a graphics mode of the portable video program receiver.
  • 16. A method for providing a plurality of video programs to a plurality of portable video program receivers at a venue hosting a live event, comprising: receiving a plurality of video streams derived from video cameras capturing images of the live event;reducing image resolution of the plurality of video streams to produce multiple resolution reduced video streams;accessing data content that includes menu data, the menu data for use by the portable video program receivers to create a menu, the menu includes a list of options that can be selected on the portable video program receivers;combining the multiple resolution reduced video streams and data content into a combined signal that includes the multiple resolution reduced video streams and the menu data combined such that the plurality of portable video program receivers can extract one of the multiple resolution reduced video streams and the plurality of portable video program receivers can extract the menu data; andgenerating a wireless RF signal to wirelessly convey the combined signal to the plurality of portable video program receivers using a point-to-multipoint transmission.
  • 17. A method as defined in claim 16, wherein the wireless RF signal is contained within a single frequency band.
  • 18. A method for providing a plurality of video programs of a live event, comprising: receiving a wireless signal at a portable receiver, the wireless signal comprises a plurality of video streams from video cameras capturing images of the live event combined with data content, the data content includes menu data, the menu data provides for the portable receivers to display a menu, the menu includes a list of options for selection;receiving a selection of one of the video streams;extracting the selected video stream from the combined video streams and data content;displaying the selected video stream on the portable receiver;extracting the data content from the combined video streams and data content;creating a menu on the portable receiver using the data content extracted from the combined video streams and data content;receiving a selection in the menu to access data;retrieving the data in response to the selection; andproviding the retrieved data to the user.
  • 19. The method as defined in claim 18, wherein the wireless signal is received at a single frequency band.
  • 20. The method as defined in claim 18, wherein: the data content includes computer generated graphics related to the live event; and the menu includes a list of options to make selections, one or more of the options is to access and display the computer generated graphics related to the live event, the computer generated graphics are not video data captured from a video camera, the data content is independent of the video streams.
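The receiver-side method of claims 18-20 is the mirror image: from the combined signal, pull out the stream the viewer selected and, separately, recover the menu data used to build the on-screen menu. The `(kind, stream_id, payload)` tuple layout below is a hypothetical stand-in for the combined-signal format, chosen only for illustration.

```python
# Receiver-side sketch of claims 18-20: extract one selected video stream
# and the menu data from the combined signal.

def extract_stream(combined, selected_id):
    """Pull the selected video stream out of the combined signal."""
    for kind, sid, payload in combined:
        if kind == "video" and sid == selected_id:
            return payload
    return None

def build_menu(combined):
    """Recover the menu options carried alongside the video streams."""
    for kind, _sid, payload in combined:
        if kind == "menu":
            return list(payload)
    return []

# A toy combined signal: two streams plus menu data, as claim 18 describes.
combined = [
    ("video", "turn-1", "frames-for-turn-1"),
    ("video", "pit-lane", "frames-for-pit-lane"),
    ("menu", None, ["Turn 1 camera", "Pit lane camera", "Standings"]),
]
picked = extract_stream(combined, "pit-lane")
options = build_menu(combined)
```

Note that, as claim 20 emphasizes, the menu payload (and any computer-generated graphics) travels independently of the camera video: here it is simply a differently tagged packet in the same signal.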
  • 21. A portable video program receiver for use at an event, comprising: a receiver for wirelessly receiving a plurality of video streams from video cameras capturing images of the event and data content in a combined wireless signal, the data content includes menu data; a display; data storage; and a processing system in communication with the display, the data storage and the receiver; the processing system receives a selection of one of the video streams, extracts the selected video stream from the combined wireless signal, causes the selected video stream to be displayed, extracts the data content from the combined wireless signal, creates a menu on the display using the data content extracted from the combined wireless signal, receives a selection in the menu to access data, retrieves the data in response to the selection and provides the retrieved data to the user.
  • 22. The portable video program receiver as defined in claim 21, wherein: the processing system comprises a microprocessor and an FPGA.
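The apparatus of claims 21-22 ties these operations to a processing system coupled to a display, data storage, and the RF receiver. A minimal state-based sketch of that processing system follows; the `Receiver` class, its method names, and the packet layout are all illustrative assumptions (in claim 22 the processing would be split between a microprocessor and an FPGA, a partition this software sketch does not model).

```python
# Sketch of the claim-21 processing system: holds the demodulated combined
# signal, serves video selections to the display, and answers menu
# selections from local data storage.

class Receiver:
    def __init__(self, combined_signal):
        self.signal = combined_signal   # demodulated combined wireless signal
        self.storage = {}               # the "data storage" of claim 21
        self.display = None             # what is currently shown

    def on_video_select(self, stream_id):
        """Extract the chosen stream from the combined signal and show it."""
        for kind, sid, payload in self.signal:
            if kind == "video" and sid == stream_id:
                self.display = payload
                return payload
        return None

    def on_menu_select(self, key):
        """Retrieve stored data in response to a menu selection."""
        return self.storage.get(key)

rx = Receiver([("video", "main", "main-feed"), ("menu", None, ["Standings"])])
rx.storage["Standings"] = {"P1": "Driver A"}
shown = rx.on_video_select("main")
standings = rx.on_menu_select("Standings")
```

In a hardware realization the stream extraction loop would typically sit in the FPGA's datapath while the menu and storage logic run on the microprocessor, but that split is an assumption beyond what the claim recites.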
Priority Claims (1)
Number Date Country Kind
2348353 May 2001 CA national
PCT Information
Filing Document Filing Date Country Kind 371c Date
PCT/CA02/00757 5/22/2002 WO 00 5/6/2004
Publishing Document Publishing Date Country Kind
WO02/096097 11/28/2002 WO A
Related Publications (1)
Number Date Country
20050050575 A1 Mar 2005 US