Video/audio system and method enabling a user to select different views and sounds associated with an event

Information

  • Patent Grant
  • Patent Number
    9,374,548
  • Date Filed
    Friday, April 4, 2014
  • Date Issued
    Tuesday, June 21, 2016
Abstract
A video/audio system includes an interface device that receives a plurality of audio and video signals from a plurality of sources. The interface device combines these signals into various combinations and transmits the combinations to a receiver. The receiver is configured to interface one of the combinations of signals with a user. In this regard, the receiver allows the user to select one of the combinations, and in response, the receiver separates the video signal(s) of the selected combination from the audio signal(s) of the selected combination. Then, the receiver renders the video signal(s) via a display device and produces a sound defined by the audio signal(s) via a speaker. Accordingly, the user is able to control which set of audio and video signals are interfaced with the user.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


The present invention generally relates to video and audio signal processing techniques and, in particular, to a system and method for receiving video and audio signals from a plurality of sources and for providing a user with multiple combinations of these signals to select from.


2. Related Art


Audio and video signals are generated from a plurality of sources during many events. For example, at an auto race, television crews usually position cameras at various locations within view of a race track. These cameras generate video signals defining views of the race track from various perspectives. In addition, microphones positioned at various locations generate audio signals defining different sounds at the auto race. For example, microphones may be located close to the race track to receive sounds produced by the vehicles participating in the race, and microphones may be located close to television commentators to receive the comments of the commentators as they observe and comment on the race.


One of the video signals and one or more of the audio signals are usually selected and combined together at a television station to form a combined video/audio signal. This signal is then modulated and transmitted so that users having a television can receive the combined signal via the television. The television demodulates the combined signal and displays an image defined by the video signal on a display screen and reproduces the sounds defined by the audio signals via speakers. Therefore, the sights and sounds of the race can be viewed and heard via the television.


In addition, one or more of the audio signals, such as audio signals defining the comments of radio commentators, are usually selected and modulated at a radio station to form a radio signal. This radio signal is then transmitted as a wireless signal so that users having radios can receive the signal via a radio. The radio demodulates the signal and reproduces the sounds defined by the radio signal via speakers.


However, users viewing and/or hearing the sights and sounds of the race via televisions and/or radios are not usually given the opportunity to select which video and/or audio signals are modulated and transmitted to the television and/or radio. Therefore, the user is only able to receive the signals modulated and transmitted to the television and/or radio, even though the user may prefer to receive the other audio and/or video signals that are generated at the auto race.


Spectators who actually attend the auto race are usually given more options to view and/or hear the sights and/or sounds of the race from different perspectives. In this regard, a plurality of monitors are usually located at a particular location in the stadium. As used herein, “stadium” shall be defined to mean any non-movable structure having a large number (i.e., thousands) of seats, wherein an event occurs at (i.e., within close proximity of) the seats such that spectators sitting in the seats can view the event. An “event” is any occurrence viewed by a spectator.


Each monitor within the stadium receives one of the aforementioned video signals and displays an image defined by the received video signal. Therefore, a spectator can view the monitor displaying the image that has a perspective desirable to the spectator. However, the monitor having the desired perspective is often not located in a convenient location for the spectator. In this regard, the spectator usually must leave his seat (or other location) in the stadium and go to a location where the spectator, along with other spectators, can view the monitor displaying the desired perspective.


Thus a heretofore unaddressed need exists in the industry for providing a system and method that enables a spectator to conveniently view an event from different perspectives.


SUMMARY OF THE INVENTION

The present invention overcomes the inadequacies and deficiencies of the prior art as discussed hereinbefore. Generally, the present invention provides a video/audio system and method for receiving video and audio signals from a plurality of sources and for providing a user with multiple combinations of these signals to select from.


The present invention includes an interface device that receives a plurality of audio and video signals from a plurality of sources. The interface device combines these signals into various combinations and transmits the combinations to a receiver. The receiver is configured to interface one of the combinations of signals with a user. In this regard, the receiver allows the user to select one of the combinations, and in response, the receiver separates the video signal(s) of the selected combination from the audio signal(s) of the selected combination. Then, the receiver renders the video signal(s) via a display device and produces a sound defined by the audio signal(s) via a speaker. Accordingly, the user is able to control which set of audio and video signals are interfaced with the user.


Other features and advantages of the present invention will become apparent to one skilled in the art upon examination of the following detailed description, when read in conjunction with the accompanying drawings. It is intended that all such features and advantages be included herein within the scope of the present invention and protected by the claims.





BRIEF DESCRIPTION OF THE DRAWINGS

The invention can be better understood with reference to the following drawings. The elements of the drawings are not necessarily to scale relative to each other, emphasis instead being placed upon clearly illustrating the principles of the invention. Furthermore, like reference numerals designate corresponding parts throughout the several views.



FIG. 1 is a block diagram illustrating a video/audio system in accordance with the present invention.



FIG. 2 is a block diagram illustrating a detailed view of an interface device depicted in FIG. 1.



FIG. 3 is a block diagram illustrating a detailed view of a receiver depicted in FIG. 1.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

The preferred embodiment of the present invention will be described hereafter in the context of auto racing applications. However, the scope of the present invention should not be so limited, and it should be apparent to one skilled in the art that the principles of the present invention may be employed in the context of other applications, particularly in the context of other sporting events (e.g., football games, basketball games, baseball games, hockey matches, etc.).



FIG. 1 depicts a video/audio system 20 implementing the principles of the present invention. At least one video signal 22 and at least one audio signal 25 are received by an interface device 28. Each of the received video signals 22 defines a view of the race from a different perspective. For example, the video signals 22 may be generated by different video cameras located at different locations around the stadium, including inside at least some of the vehicles participating in the race.


Furthermore, each of the audio signals 25 defines different sounds associated with the race. For example, at least one of the audio signals 25 may be generated from a microphone located close to the track or in one of the vehicles such that the audio signal 25 defines noise from the vehicles participating in the race. Alternatively, at least one of the audio signals 25 may define the comments of television commentators, and at least one of the audio signals 25 may define the comments of radio commentators. Furthermore, at least one of the audio signals 25 may define the comments between one of the drivers participating in the race and the driver's pit crew.


Some of the video and audio signals 22 and 25 can be unmodulated when transmitted to the interface device 28 and, therefore, do not need to be demodulated by the system 20. However, some of the video and audio signals 22 and 25 may need to be demodulated by the system 20. For example, at least one of the audio signals 25 defining the comments of the radio commentators may be modulated as a radio signal for transmission to radios located at or away from the stadium, and at least one of the video signals 22 may be modulated as a television signal for transmission to televisions located at or away from the stadium. In addition, the comments between a driver and the driver's pit crew are usually transmitted via ultra high frequency (UHF) radio waves, which are known to be modulated signals. Therefore, as shown by FIG. 1, the system 20 preferably includes demodulators 32 configured to receive and demodulate the video and/or audio signals 22 and 25.


It is possible for some of the video and audio signals 22 and 25 to be received from a combined signal 35, which is comprised of at least one video signal 22 combined with at least one audio signal 25. For example, the combined signal 35 may be a television signal modulated for transmission to televisions located at or away from the stadium. To facilitate the combination of different audio signals 25 with the video signal(s) 22 defined by the combined signal 35, a separator 37 preferably separates the combined signal 35 into its respective video signal 22 and audio signal 25, as shown by FIG. 1.


Various configurations of the separator 37 may exist without departing from the principles of the present invention. FIG. 1 depicts a possible implementation of the separator 37. In this regard, the separator 37 includes an audio signal filter 41 designed to filter out any audio signals 25 from the combined signal 35 and to transmit the resulting video signal(s) 22 to interface device 28. Furthermore, the separator 37 also includes a video signal filter 43 designed to filter out any video signals 22 from the combined signal 35 and to transmit the resulting audio signal(s) 25 to interface device 28. If more than one video signal 22 or more than one audio signal 25 is included in the combined signal 35, then the separator 37 may include additional filters (not shown) to separate the multiple video and/or audio signals 22 and 25 into individual signals before transmitting the signals 22 and 25 to the interface device 28.
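At the block-diagram level, the role of the separator 37 can be sketched in Python. This is an illustrative model only: the tagged-record representation and the `separate` function are assumptions standing in for the audio signal filter 41 and video signal filter 43, not actual filtering circuitry.

```python
def separate(combined):
    """Model of separator 37: split a combined signal into its video and
    audio components, mirroring the audio signal filter 41 (which removes
    audio, leaving video) and the video signal filter 43 (which removes
    video, leaving audio)."""
    video_signals = [c for c in combined["components"] if c["kind"] == "video"]
    audio_signals = [c for c in combined["components"] if c["kind"] == "audio"]
    return video_signals, audio_signals

# Hypothetical combined signal 35: one video and one audio component.
combined_35 = {
    "components": [
        {"kind": "video", "source": "television feed"},
        {"kind": "audio", "source": "television commentator"},
    ]
}

video_22, audio_25 = separate(combined_35)
```

With more than one video or audio component in the combined signal, the same filtering step would yield lists of individual signals, matching the additional filters described above.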



FIG. 2 depicts a more detailed view of the interface device 28. The interface device 28 includes audio combiners 52 configured to receive audio signals 25 and to combine the received audio signals 25 into a single combined audio signal 55. As shown by FIG. 2, each audio combiner 52 preferably receives a different combination of audio signals 25, although it is possible for any one of the combined signals 55 to include the same combination of audio signals 25 as any other combined signal 55. Note that when an audio combiner 52 receives only one audio signal 25, the combined signal 55 output by the combiner 52 matches the one signal 25 received by the combiner 52.


As an example, one of the combined signals 55 may include an audio signal 25 defining comments between a driver and the driver's pit crew and also an audio signal 25 defining sounds (i.e., vehicular noises) received by a microphone located in the driver's vehicle. Another of the combined signals 55 may include the aforementioned audio signals 25 as well as an audio signal 25 defining a radio commentator's comments. Another combined signal 55 may only include an audio signal 25 defining a television commentator's comments. Accordingly, the combined signals 55 preferably define different combinations of sounds. It should be noted that combinations of audio signals 25 other than those described hereinabove are possible.
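As a rough sketch, an audio combiner 52 can be modeled as mixing equal-length sample streams by summing corresponding samples; a combiner given a single input passes it through unchanged, as noted above. The integer sample lists and the `combine_audio` function are illustrative assumptions, not actual mixing hardware.

```python
def combine_audio(*signals):
    """Model of audio combiner 52: mix one or more equal-length audio
    sample streams into a single combined stream by summing corresponding
    samples. With a single input, the output matches the input."""
    if len(signals) == 1:
        return list(signals[0])
    return [sum(samples) for samples in zip(*signals)]

# Hypothetical sample streams for three audio signals 25.
pit_crew = [1, 2, 3]      # driver/pit-crew dialogue
in_car = [0, 1, 0]        # in-car microphone
commentary = [2, 2, 2]    # radio commentator

combined_55a = combine_audio(pit_crew, in_car)              # two-source mix
combined_55b = combine_audio(pit_crew, in_car, commentary)  # three-source mix
combined_55c = combine_audio(commentary)                    # single source passes through
```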


As shown by FIG. 2, each combined signal 55 is transmitted to a respective signal modulator 61. Each signal modulator 61 is also configured to receive a respective one of the video signals 22 received by the interface device 28. Each signal modulator 61 is configured to combine the received combined signal 55 and video signal 22 and to modulate the received signals 55 and 22 on a unique frequency range. The signal modulator 61 is then designed to transmit the modulated signal 64, which comprises the combined signal 55 and the video signal 22 received by the signal modulator 61, to a combiner 67. The combiner 67 is configured to combine each of the modulated signals 64 transmitted from each of the signal modulators 61 into a single combined (i.e., multiplexed) signal 71. This combined signal 71 is then transmitted to a plurality of receivers 75.
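The modulate-and-combine stage can be sketched abstractly as frequency-division multiplexing: each signal modulator 61 binds its video and combined audio signals to a unique frequency range, and the combiner 67 merges the results into one multiplexed signal. Here frequency ranges are modeled as dictionary keys; the `modulate` and `combine` functions and the example carrier frequencies are illustrative assumptions, not RF code.

```python
def modulate(channel_freq, video, combined_audio):
    """Model of signal modulator 61: bind a video signal 22 and a combined
    audio signal 55 to a unique frequency range (here, just a key)."""
    return {"freq": channel_freq, "video": video, "audio": combined_audio}

def combine(modulated_signals):
    """Model of combiner 67: multiplex the modulated signals 64 into a
    single combined signal 71, keyed by each modulator's unique frequency."""
    return {m["freq"]: m for m in modulated_signals}

# Two hypothetical channels, each with its own video/audio combination.
mod_64 = [
    modulate(55.25, "in-car view", "pit-crew dialogue + in-car audio"),
    modulate(61.25, "trackside view", "television commentary"),
]
combined_71 = combine(mod_64)
```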


Various techniques exist for transmitting combined signal 71 to receivers 75. For example, a coaxial cable may be used to transmit the combined signal 71 to each of the receivers 75. In another example, the system 20 may include a wireless transmitter (not shown) that transmits the combined signal 71 to the receivers 75. Any technique for transmitting the combined signal 71 to the receivers 75 should be suitable for implementing the present invention.


A more detailed view of receiver 75 is shown by FIG. 3. Receiver 75 preferably includes a demodulator 82. The demodulator 82 is configured to demodulate the combined signal 71 and to separate (i.e., demultiplex) the combined signal 71 into signals 84 based on frequency, such that each signal 84 respectively corresponds with one of the modulated signals 64. In this regard, the demodulator 82 recovers the individual signals 64 as signals 84, and each signal 84 is, therefore, defined by the same video and audio signals 22 and 25 that define its corresponding modulated signal 64. Therefore, like modulated signals 64, each signal 84 is preferably comprised of a unique combination of video and audio signals 22 and 25.
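The demodulator 82 can be sketched as the inverse operation: splitting the multiplexed signal 71 back into individual signals 84, one per frequency range. The dictionary representation and the `demodulate` function are illustrative assumptions consistent with the block diagram, not demodulation circuitry.

```python
def demodulate(combined_71):
    """Model of demodulator 82: separate (demultiplex) the combined
    signal 71 into individual signals 84 based on frequency, recovering
    each modulated signal 64 intact."""
    return [combined_71[freq] for freq in sorted(combined_71)]

# Hypothetical multiplexed input: two channels on unique frequencies.
combined_71 = {
    55.25: {"freq": 55.25, "video": "in-car view", "audio": "pit-crew dialogue"},
    61.25: {"freq": 61.25, "video": "trackside view", "audio": "TV commentary"},
}

signals_84 = demodulate(combined_71)
```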


Signals 84 are transmitted from demodulator 82 to a multiplexer 88, which also receives control signals 92 from a user interface 94. The user interface 94 preferably includes buttons or other types of switches that enable a spectator to select one of the signals 84 via control signals 92. In this regard, the multiplexer 88, through techniques well known in the art, selects one of the signals 84 based on control signals 92 and outputs the selected signal 84 as output signal 97, as shown by FIG. 3.
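The selection performed by the multiplexer 88 can be sketched as a simple lookup driven by the control signal 92 from the user interface 94. Representing the control signal as an index, and the `select_signal` function itself, are illustrative assumptions.

```python
def select_signal(signals_84, control_92):
    """Model of multiplexer 88: pass only the signal 84 chosen by the
    spectator's control signal 92 (here, an index supplied by the
    user interface 94), producing output signal 97."""
    return signals_84[control_92]

# Hypothetical demultiplexed signals 84 available to the spectator.
signals_84 = [
    {"video": "in-car view", "audio": "pit-crew dialogue"},
    {"video": "trackside view", "audio": "TV commentary"},
]

output_97 = select_signal(signals_84, control_92=0)
```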


The receiver 75 includes an audio signal filter 41 configured to filter the audio signal(s) 25 out of signal 97. Therefore, only the video signal(s) 22 within signal 97 are transmitted to a display screen 101, which is configured to render the received video signal(s) 22 (i.e., display an image defined by the received video signal(s) 22) to the spectator.


The receiver 75 also includes a video signal filter 43 configured to filter the video signal(s) 22 out of signal 97. Therefore, only the audio signal(s) 25 within signal 97 are transmitted to a speaker 103, which is configured to produce sounds defined by the received audio signal(s) 25, through techniques well known in the art.


In the preferred embodiment, the display screen 101 and speaker 103 are included within a head mounted display (HMD), which is a well known device of the prior art. An example of a head mounted display suitable for implementing the present invention is fully described in U.S. Pat. No. 5,844,656, entitled “Head Mounted Display with Adjustment Components” and filed on Nov. 7, 1996, by Ronzani et al., which is incorporated herein by reference. Furthermore, when the combined signal 71 is transmitted via a coaxial cable, the receiver 75 may be located at a spectator's stadium seat or other convenient location. When the combined signal 71 is transmitted via a wireless transmitter, the receiver 75 is portable, and a spectator may carry the receiver 75 with him and choose where he would like to view the images and hear the sounds produced by the receiver 75.


Accordingly, the spectator may remain in his seat (or other convenient location) and control, by manipulating buttons or other types of switches in the user interface 94, which combination of video and audio signals 22 and 25 are respectively transmitted to display screen 101 and speaker 103. Therefore, the system 20 gives the spectator more flexibility in how the spectator views the race and, as a result, makes the race a more enjoyable experience.


Operation


The preferred use and operation of the video/audio system 20 and associated methodology are described hereafter.


Assume for illustrative purposes only that a spectator would like to attend an auto race and would like to have access to an in-car view from a camera within his favorite driver's car. In addition, the spectator would also like to continuously hear the dialogue between the aforementioned driver and the driver's pit crew, as well as the comments provided by his favorite radio commentator. It should be apparent that other views and/or sounds may be desirable in other examples.


In the past, the spectator would have to attend the race and acquire (as well as tune) a radio to receive the commentator's comments and a radio to receive the radio signals transmitted between the driver and the driver's pit crew. Then, the spectator would have to locate a monitor at the stadium displaying the in-car view that he desires to see. The spectator would then remain within sight of the monitor and listen to the two radios. If the monitor is not located in a desirable location for viewing the race, the spectator would have to choose between viewing the monitor and viewing the race at a desirable location. Furthermore, the handling of multiple radios is generally cumbersome and distracting.


However, in accordance with the present invention, the user attends the race and is provided a receiver 75 for his individual use. In the preferred embodiment, the receiver 75 is located at the spectator's seat within the stadium. However, the receiver 75 may be located at other convenient locations, and when the combined signal 71 is transmitted via a wireless transmitter, the spectator may carry the receiver 75 around with him to any desirable location in or around the stadium.


The spectator then manipulates buttons or other types of switches at user interface 94 to control which signal 84 is output by multiplexer 88 and, therefore, which signals 22 and 25 are respectively received by display 101 and speaker 103. Accordingly, the spectator may use the receiver 75 to see the desired view of the race (i.e., the in-car view) and to hear the desired sounds of the race (i.e., the sounds received by the microphone in his favorite driver's car, the dialogue between the driver and the driver's pit crew, and the comments from the radio commentator).


In this regard, the interface device 28 preferably receives at least a video signal 22 defining the in-car view of his favorite driver and a plurality of audio signals 25 defining the sounds received by the microphone in his favorite driver's car, the dialogue between the driver and the driver's pit crew, and the comments from the radio commentator. At least one of the audio combiners 52 combines these audio signals 25 into a combined signal 55. One of the signal modulators 61 receives this combined signal 55 and the video signal 22 defining the desired in-car view. This video signal 22 is modulated and combined with the foregoing combined signal 55 by one of the signal modulators 61 to create a modulated signal 64. This modulated signal 64 is combined with other modulated signals 64 and transmitted to the spectator's receiver 75 via combiner 67.


The demodulator 82 in the spectator's receiver 75 demodulates and separates the received signal 71 into separate signals 84. Based on the control signals 92 received from user interface 94, the multiplexer 88 allows only the signal 84 defined by the aforementioned video and audio signals 22 and 25 to pass. Therefore, these video and audio signals 22 and 25 are respectively transmitted to the display 101 and speaker 103 and the spectator may enjoy the view and sounds that he selected.


It should be noted that it is not necessary for the spectator to keep the receiver 75 within the stadium. In this regard, the signal 71 may be transmitted via satellites and/or communication networks to various locations around the world, and the spectator may select the view and sounds he prefers the most from just about any location capable of receiving signal 71.


It should also be noted that the receiver 75 may be retrieved from the spectator after the spectator is finished viewing the event so that the receiver can be provided to another spectator for another event at the stadium. Each spectator is preferably charged a usage fee for the spectator's use of the receiver 75.


Furthermore, the present invention has been described herein in the context of auto racing. However, the system 20 may be useful in other applications as well. The system 20 would be useful in any application where it is desirable for the user to control the types of views and sounds of an event that are presented to the user. For example, the present invention could be particularly useful in any type of sporting event or other type of event attended by a large number of people.


It should be emphasized that the above-described embodiments of the present invention, particularly, any “preferred” embodiments, are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiment(s) of the invention without departing substantially from the spirit and principles of the invention. All such modifications and variations are intended to be included herein within the scope of the present invention and protected by the claims.

Claims
  • 1. A content delivery network for delivering content to portable wireless handheld devices used by users while attending a live event, each of the portable wireless handheld devices having a wireless receiver, a display and a user interface, the network comprising: an interface configured to receive video signals comprising at least first and second video signals captured by at least first and second video cameras from a plurality of video cameras located at the live event, the plurality of video cameras being located at different locations around the live event, the plurality of video cameras configured to produce video signals containing live video content of the live event from different viewing perspectives, wherein the first and second video signals containing corresponding first and second video content of the live event from different first and second viewing perspectives; and a transmitter configured to wirelessly transmit said first video signal locally, in an area encompassing the live event, in order for reception by the wireless receiver of a first portable wireless handheld device, at said live event and in order for selective display of the first video content; and wherein, in response to a first indication from a user indicating a desire to have access to the second video content, the transmitter is configured to transmit said second video signal to the wireless receiver of the first portable wireless handheld device in a manner that enables continuous display of the second video content on the display of the first portable wireless handheld device until a second indication from the user indicating a desire to view another video content captured by another camera from said plurality of video cameras.
  • 2. The content delivery network of claim 1, wherein said live event is a sporting event of one of a football game, a basketball game, a baseball game, a hockey game, golf, and an automobile race.
  • 3. The content delivery network of claim 1, wherein one of said cameras is located in a vehicle participating in an automobile race.
  • 4. The content delivery network of claim 1, wherein said at least first and second video signals include a video signal from one of said cameras and a television transmission signal.
  • 5. The content delivery network of claim 1, wherein said transmitter is configured to transmit said at least first and second video signals continuously over an RF link.
  • 6. The content delivery network of claim 1, wherein said interface is configured to receive a plurality of audio signals associated with said event, said transmitter configured to locally transmit, for reception by the portable wireless handheld devices at said live event, corresponding said video signals and corresponding said plurality of audio signals.
  • 7. The content delivery network of claim 6, wherein said video and audio signals define different views and different sounds of said live event and said live event is an automobile race, with said different views including in-car views, said different sounds including audio between a driver of a car and a pit crew for said driver.
  • 8. The content delivery network of claim 1, further comprising a plurality of portable wireless handheld devices including wireless receivers configured to receive at least a corresponding one of said first and second video signals, and user interfaces configured to permit a user to select and view video content from a single video camera uninterrupted until the user chooses to select another one of said video cameras.
  • 9. The content delivery network of claim 8, wherein said plurality of portable wireless handheld devices each comprises a display configured to display video defined by a selected one of said video signals continuously under the control of the user through the user interface.
  • 10. The content delivery network of claim 1, wherein said transmitter is configured to transmit both of said at least first and second video signals to the first portable wireless handheld device.
  • 11. The content delivery network of claim 1, further comprising a communication network configured to transmit said video signals within a predetermined area adjacent said live event.
  • 12. The content delivery network of claim 1, wherein said interface comprises a plurality of audio combiners configured to combine a plurality of audio signals.
  • 13. The content delivery network of claim 1, wherein said interface comprises a plurality of signal modulators configured to combine an audio signal with at least one of said first and second video signals.
  • 14. The content delivery network of claim 1, wherein said interface comprises a signal combiner configured to combine said video signals to form at least one combined signal for transmission.
  • 15. The content delivery network of claim 1, further comprising a multiplexer configured to combine the video signals into a plurality of multiplexed video signals, wherein said transmitter is configured to transmit the plurality of multiplexed video signals carried over a carrier frequency.
  • 16. A method providing video and audio for a live event to portable wireless handheld devices used by users while attending the live event, the method comprising: receiving, at an interface, first and second video signals captured by first and second video recording devices from a plurality of recording devices at the live event, the first and second video signals containing corresponding first and second video content of the live event from different first and second viewing perspectives; wirelessly transmitting, from a transmitter, said first video signal in an area encompassing the live event, to a wireless receiver of a first portable wireless handheld device at said live event and for selective display of the first video content; wherein, in response to a first indication from a user indicating a desire to have access to the second video content, i) transmitting said second video signal to the wireless receiver of the first portable wireless handheld device in a manner that enables continuous display of the second video content on a display of the first portable wireless handheld device; and ii) enabling continuous display of the second video content on the display of the first portable wireless handheld device until receiving a second indication from the user indicating a desire to view another video content captured by another recording device from said plurality of recording devices.
  • 17. The method of claim 16, further providing portable wireless handheld devices in combination with seats for users at said live event.
  • 18. The method of claim 16, further configuring a plurality of video recording devices to provide a plurality of views of different locations within said live event and configuring a plurality of audio devices to provide a plurality of sounds from different locations within said event, and configuring said portable wireless handheld devices to display at least one of said plurality of views and produce audio of at least one of said plurality of sounds.
  • 19. The method of claim 18, wherein each of said portable wireless handheld devices comprises a user interface to select any one of said received video and audio signals based upon a user input.
  • 20. The method of claim 19, wherein each of said portable wireless handheld devices comprises at least one signal conditioner configured to condition at least one of said received video and audio signals.
  • 21. The method of claim 20, wherein said at least one signal conditioner comprises at least one signal demodulator.
  • 22. The method of claim 20, wherein the transmitting further comprises transmitting both of said first and second video signals to the first portable wireless handheld device.
  • 23. The method of claim 20, wherein the transmitting further comprises transmitting only one of said first and second video signals to the first portable wireless handheld device.
CROSS REFERENCE TO RELATED APPLICATION

This document is a continuation of and claims priority to copending non-provisional U.S. Patent Application entitled “VIDEO/AUDIO SYSTEM AND METHOD ENABLING A USER TO SELECT DIFFERENT VIEWS AND SOUNDS ASSOCIATED WITH AN EVENT,” assigned Ser. No. 13/554,347, filed Jul. 20, 2012, which is a continuation of non-provisional U.S. Patent Application entitled “VIDEO/AUDIO SYSTEM AND METHOD ENABLING A USER TO SELECT DIFFERENT VIEWS AND SOUNDS ASSOCIATED WITH AN EVENT,” assigned Ser. No. 11/932,449, filed Oct. 31, 2007, now issued as U.S. Pat. No. 8,239,910, which is a continuation of U.S. patent application Ser. No. 10/453,385, filed Jun. 3, 2003, now abandoned, which is a continuation of non-provisional U.S. Patent Application entitled “VIDEO/AUDIO SYSTEM AND METHOD ENABLING A USER TO SELECT DIFFERENT VIEWS AND SOUNDS ASSOCIATED WITH AN EVENT,” assigned Ser. No. 09/322,411, filed May 28, 1999, now issued as U.S. Pat. No. 6,578,203, all of which are incorporated herein by reference in their entirety. This document also claims priority to and the benefit of the filing date of provisional application entitled “Audio/Video Signal Distribution System for Head Mounted Displays,” assigned Ser. No. 60/123,341, and filed Mar. 8, 1999, which is hereby incorporated by reference in its entirety.

US Referenced Citations (229)
Number Name Date Kind
4472830 Nagai Sep 1984 A
4479150 Ilmer et al. Oct 1984 A
4486897 Nagai Dec 1984 A
4504861 Dougherty Mar 1985 A
4572323 Randall Feb 1986 A
4580174 Tokunaka Apr 1986 A
4605950 Goldberg et al. Aug 1986 A
4615050 Lonnstedt Oct 1986 A
4620068 Wieder Oct 1986 A
4665438 Miron May 1987 A
4727585 Flygstad Feb 1988 A
4764817 Blazek et al. Aug 1988 A
4802243 Griffiths Feb 1989 A
4809079 Blazek et al. Feb 1989 A
4855827 Best Aug 1989 A
4856118 Sapiejewski Aug 1989 A
4864425 Blazek et al. Sep 1989 A
4866515 Tagawa et al. Sep 1989 A
4887152 Matsuzaki et al. Dec 1989 A
4965825 Harvey et al. Oct 1990 A
4982278 Dahl et al. Jan 1991 A
5023707 Briggs Jun 1991 A
5023955 Murphy, II et al. Jun 1991 A
5109414 Harvey et al. Apr 1992 A
5119442 Brown Jun 1992 A
5128765 Dingwall et al. Jul 1992 A
5133081 Mayo Jul 1992 A
5138440 Radice Aug 1992 A
5138722 Urella et al. Aug 1992 A
5173721 Green Dec 1992 A
5179736 Scanlon Jan 1993 A
5189630 Barstow et al. Feb 1993 A
5237648 Mills et al. Aug 1993 A
5243415 Vance Sep 1993 A
5252069 Lamb et al. Oct 1993 A
5289272 Rabowsky et al. Feb 1994 A
5289288 Silverman et al. Feb 1994 A
5297037 Ifuku Mar 1994 A
5321416 Bassett et al. Jun 1994 A
5359463 Shirochi et al. Oct 1994 A
5392158 Tosaki Feb 1995 A
5408686 Mankovitz Apr 1995 A
5414544 Aoyagi et al. May 1995 A
5440197 Gleckman Aug 1995 A
5448291 Wickline Sep 1995 A
5463428 Lipton et al. Oct 1995 A
5481478 Palmieri et al. Jan 1996 A
5485504 Ohnsorge Jan 1996 A
5506705 Yamamoto et al. Apr 1996 A
5510828 Lutterbach Apr 1996 A
5513384 Brennan et al. Apr 1996 A
5524195 Clanton et al. Jun 1996 A
5546099 Quint et al. Aug 1996 A
5583562 Birch et al. Dec 1996 A
5585850 Schwaller Dec 1996 A
5585858 Harper et al. Dec 1996 A
5594551 Monta Jan 1997 A
5598208 McClintock Jan 1997 A
5600365 Kondo et al. Feb 1997 A
5600368 Matthews, III Feb 1997 A
5613191 Hylton et al. Mar 1997 A
5617331 Wakai Apr 1997 A
5627915 Rosser et al. May 1997 A
5631693 Wunderlich et al. May 1997 A
5642221 Fischer et al. Jun 1997 A
5663717 DeLuca Sep 1997 A
5668339 Shin Sep 1997 A
5671320 Cookson et al. Sep 1997 A
5682172 Travers et al. Oct 1997 A
5696521 Robinson et al. Dec 1997 A
5708961 Hylton et al. Jan 1998 A
5712950 Cookson et al. Jan 1998 A
5719588 Johnson Feb 1998 A
5729471 Jain et al. Mar 1998 A
5729549 Kostreski et al. Mar 1998 A
5742263 Wang et al. Apr 1998 A
5742521 Ellenby et al. Apr 1998 A
5754254 Kobayashi et al. May 1998 A
5760819 Sklar et al. Jun 1998 A
5760824 Hicks, III Jun 1998 A
5760848 Cho Jun 1998 A
5767820 Bassett et al. Jun 1998 A
5793416 Rostoker et al. Aug 1998 A
5806005 Hull et al. Sep 1998 A
5808695 Rosser et al. Sep 1998 A
5812224 Maeda et al. Sep 1998 A
5815126 Fan et al. Sep 1998 A
5841122 Kirchhoff Nov 1998 A
5844656 Ronzani et al. Dec 1998 A
5847612 Birleson Dec 1998 A
5847762 Canfield et al. Dec 1998 A
5867223 Schindler et al. Feb 1999 A
5867579 Saito Feb 1999 A
5878324 Borth et al. Mar 1999 A
5880773 Suzuki Mar 1999 A
5892554 DiCicco et al. Apr 1999 A
5894320 Vancelette Apr 1999 A
5900849 Gallery May 1999 A
5903395 Rallison et al. May 1999 A
5920827 Baer et al. Jul 1999 A
5946635 Dominguez Aug 1999 A
D413881 Ida et al. Sep 1999 S
5953076 Astle et al. Sep 1999 A
5982445 Eyer et al. Nov 1999 A
5986803 Kelly Nov 1999 A
5990958 Bheda et al. Nov 1999 A
5999808 LaDue Dec 1999 A
6002720 Yurt et al. Dec 1999 A
6002995 Suzuki et al. Dec 1999 A
6009336 Harris et al. Dec 1999 A
6016348 Blatter et al. Jan 2000 A
6020851 Busack Feb 2000 A
6034716 Whiting et al. Mar 2000 A
6035349 Ha et al. Mar 2000 A
6043837 Driscoll, Jr. et al. Mar 2000 A
6052239 Matsui et al. Apr 2000 A
6060995 Wicks et al. May 2000 A
6064860 Ogden May 2000 A
6069668 Woodham, Jr. et al. May 2000 A
D426527 Sakaguchi Jun 2000 S
6078954 Lakey et al. Jun 2000 A
6080063 Khosla Jun 2000 A
6084584 Nahi et al. Jul 2000 A
6088045 Lumelsky et al. Jul 2000 A
6095423 Houdeau et al. Aug 2000 A
6097441 Allport Aug 2000 A
6100925 Rosser et al. Aug 2000 A
6104414 Odryna et al. Aug 2000 A
6112074 Pinder Aug 2000 A
6121966 Teodosio et al. Sep 2000 A
6124862 Boyken et al. Sep 2000 A
6125259 Perlman Sep 2000 A
6128143 Nalwa Oct 2000 A
6131025 Riley et al. Oct 2000 A
6133946 Cavallaro et al. Oct 2000 A
6137525 Lee et al. Oct 2000 A
6144375 Jain et al. Nov 2000 A
6166734 Nahi et al. Dec 2000 A
6192257 Ray Feb 2001 B1
6195090 Riggins et al. Feb 2001 B1
6209028 Walker et al. Mar 2001 B1
6215475 Meyerson et al. Apr 2001 B1
6327570 Stevens Dec 2001 B1
6330021 Devaux Dec 2001 B1
6347301 Bearden, III et al. Feb 2002 B1
6351252 Atsumi et al. Feb 2002 B1
6356905 Gershman et al. Mar 2002 B1
6380978 Adams et al. Apr 2002 B1
6401085 Gershman et al. Jun 2002 B1
6417853 Squires et al. Jul 2002 B1
6421031 Ronzani et al. Jul 2002 B1
6424369 Adair et al. Jul 2002 B1
6434403 Ausems et al. Aug 2002 B1
6434530 Sloane et al. Aug 2002 B1
6463299 Macor Oct 2002 B1
6466202 Suso et al. Oct 2002 B1
6505055 Kahn et al. Jan 2003 B1
6522352 Strandwitz et al. Feb 2003 B1
6525762 Mileski et al. Feb 2003 B1
6526580 Shimomura et al. Feb 2003 B2
6532152 White et al. Mar 2003 B1
6535254 Olsson et al. Mar 2003 B1
6535493 Lee et al. Mar 2003 B1
6549229 Kirby et al. Apr 2003 B1
6564070 Nagamine et al. May 2003 B1
6567079 Smailagic et al. May 2003 B1
6570889 Stirling-Gallacher et al. May 2003 B1
6574672 Mitchell et al. Jun 2003 B1
6578203 Anderson, Jr. et al. Jun 2003 B1
6597346 Havey et al. Jul 2003 B1
6624846 Lassiter Sep 2003 B1
6669346 Metcalf Dec 2003 B2
6681398 Verna Jan 2004 B1
6781635 Takeda Aug 2004 B1
6782238 Burg et al. Aug 2004 B2
6785814 Usami et al. Aug 2004 B1
6912517 Agnihotri et al. Jun 2005 B2
6931290 Forest Aug 2005 B2
6934510 Katayama Aug 2005 B2
6961430 Gaske et al. Nov 2005 B1
7006164 Morris Feb 2006 B1
7149549 Ortiz Dec 2006 B1
7210160 Anderson, Jr. et al. Apr 2007 B2
7227952 Qawami et al. Jun 2007 B2
7268810 Yoshida Sep 2007 B2
7448063 Freeman et al. Nov 2008 B2
8732781 Anderson, Jr. May 2014 B2
20010016486 Ko Aug 2001 A1
20010030612 Kerber et al. Oct 2001 A1
20010034734 Whitley et al. Oct 2001 A1
20010039180 Sibley et al. Nov 2001 A1
20010039663 Sibley Nov 2001 A1
20010042105 Koehler et al. Nov 2001 A1
20010047516 Swain et al. Nov 2001 A1
20020007490 Jeffery Jan 2002 A1
20020014275 Blatt et al. Feb 2002 A1
20020046405 Lahr Apr 2002 A1
20020052965 Dowling May 2002 A1
20020057365 Brown May 2002 A1
20020063799 Ortiz et al. May 2002 A1
20020069416 Stiles Jun 2002 A1
20020069419 Raverdy et al. Jun 2002 A1
20020090217 Limor et al. Jul 2002 A1
20020091723 Traner et al. Jul 2002 A1
20020095682 Ledbetter Jul 2002 A1
20020104092 Arai et al. Aug 2002 A1
20020108125 Joao Aug 2002 A1
20020115454 Hardacker Aug 2002 A1
20020122137 Chen et al. Sep 2002 A1
20020130967 Sweetser Sep 2002 A1
20020138582 Chandra et al. Sep 2002 A1
20020138587 Koehler Sep 2002 A1
20020152476 Anderson et al. Oct 2002 A1
20030004793 Feuer et al. Jan 2003 A1
20030005052 Feuer et al. Jan 2003 A1
20030005437 Feuer et al. Jan 2003 A1
20030005457 Faibish et al. Jan 2003 A1
20030014275 Bearden, III et al. Jan 2003 A1
20030023974 Dagtas et al. Jan 2003 A1
20030204630 Ng Oct 2003 A1
20040034617 Kaku Feb 2004 A1
20040073437 Halgas et al. Apr 2004 A1
20040073915 Dureau Apr 2004 A1
20040203630 Wang Oct 2004 A1
20040207719 Tervo et al. Oct 2004 A1
20040243922 Sirota et al. Dec 2004 A1
20050076387 Feldmeier Apr 2005 A1
20060174297 Anderson et al. Aug 2006 A1
20070107028 Monroe May 2007 A1
Foreign Referenced Citations (11)
Number Date Country
1241860 Apr 1999 EP
2372892 Sep 2002 GB
10136277 May 1998 JP
20010275101 Oct 2001 JP
9411855 May 1994 WO
WO 9411855 May 1994 WO
WO 9966670 Dec 1999 WO
WO 0054554 Sep 2000 WO
03001772 Jan 2003 WO
WO-2004002130 Dec 2003 WO
2004034617 Apr 2004 WO
Non-Patent Literature Citations (73)
Entry
Ron Glover; “Armchair Baseball From the Web—Or Your Stadium Seat”; copyright 1998; The McGraw-Hill Companies, Inc.; 2 pgs.
CHOICESEAT™ Fact Sheet; Jun. 13, 2007; 4 pgs.
ChoiceSeat—Events Operations Manual for Madison Square Garden; Dec. 15, 1999; Intel Corporation; 91 pgs.
ChoiceSeat™; www.choiceseat.net; 1999 Williams Communications; 71 pgs.
ChoiceSeat—System Administrator's Binder for Madison Square Garden; Dec. 17, 1999; 80 pgs.
ChoiceSeat—In Your Face Interactive Experience—1998 Superbowl; Broncos v. Packers; 15 pgs.
In-Seat Interactive Advertising Device Debuts; Nov. 19, 1999; Williams; 2 pgs.
Reality Check Studios Goes Broadband with Production for Choiceseat at Madison Square Garden; Dec. 1, 1999; 3 pgs.
Press Release: Vela Research LP to Supply Encoding for ChoiceSeat at SuperBowl XXXII; Jan. 13, 1998; 2 pgs.
Ruel's Report: ChoiceSeat; ChoiceSeat makes Worldwide Debut at the 1998 Super Bowl in San Diego California; Sep. 1, 1997; 9 pgs.
San Diego Metropolitan; Jan. 1998; 29 pgs.
Stadium fans touch the future—Internet Explorer and touch screens add interactivity to Super Bowl XXXII; Jan. 26, 1998; 2 pgs.
Telephony online Intelligence for the Broadband Economy; Fans take to ChoiceSeats: Interactive technology, e-commerce expand to sporting events; Jan. 10, 2000; 2 pgs.
Williams ChoiceSeat interactive network launches inaugural season with Tampa Bay Devil Rays; expands features for second season; Mar. 30, 1998; 2 pgs.
Williams Communications; ChoiceSeat™ demonstrates the interactive evolution of sports at Super Bowl™ XXXIII; Jan. 20, 1999; 2 pgs.
ChoiceSeat The Premiere Provider of Interactive Event Entertainment; 18 pgs.
Choice Seat Specification; Version 2.2; Ethernet Model; Williams Communications Group; Oct. 10, 1997; 13 pgs.
ChoiceSeat Intellectual Property List; 3 pgs.
CSI Incorporated Draft; Schedule A-IP; Schedule of Patents; 2 pgs.
HK-388P/PW Color Camera Operation Manual; vol. 2; Ikegami; 280 pgs.
Eric Breier; Computer age comes to ballpark; Qualcomm is test site for ChoiceSeat's sports television network; Aug. 1997; 2 pgs.
Robert Carter; Web Technology: It's in the Game; SiteBuilder network; Dec. 15, 1997; 1 pg.
ChoiceSeat™ Fact Sheet; Project: Super Bowl XXXII; Qualcomm Stadium, San Diego, Calif., USA; Jan. 25, 1998; 1 pg.
Screen Shot Super Bowl XXXII; Jan. 25, 1998; 1 pg.
Vyvx® ChoiceSeat Cover; 1 pg.
Welcome to the Interactive Evolution of Sports. ChoiceSeat™; Jan. 1998; 1 pg.
The Ultimate Super Bowl Experience! Williams ChoiceSeat™ Jan. 1998; 1 pg.
Bradley J. Fikes; Super Bowl XXXII; It's just business; For lucky 600 fans, there'll be TV sets at the seats; San Diego North County Times; Jan. 1998; 1 pg.
D.R. Stewart; Williams Interactive Video Gives Football Fans Choice; Tulsa World; Jan. 1998; tulsaworld.com; 2 pgs.
ChoiceSeat Handout; Welcome to the Interactive Evolution of Sports. www.choiceseat.net; 1 pg.
Cyberscope; Just Call It Wired Bowl; Jan. 28, 1998; 1 pg.
Ruel.Net Set-Top p. Interactive TV Top.Box.News; Ruel's Report: ChoiceSeat; Fall 1998; 7 pgs.
Williams ChoiceSeat interactive network launches inaugural season with Tampa Bay Devil Rays; expands features for second season with San Diego Padres; www.williams.com/newsroom/news—releases; Mar. 30, 1998; 2 pgs.
The Herald: Super Bowl Turns Techno Bowl; Jan. 24, 1999; 1 pg.
Williams Communications' ChoiceSeat™ demonstrates the interactive evolution of sports at Super Bowl™ XXXIII; http://www.williams.com/newsroom/news—releases; Jan. 20, 1999; 3 pgs.
NTN Interactive games available on ChoiceSeat™ during Super Bowl XXXIII; Jan. 1999; 1 pg.
Williams Fact Sheet; Super Bowl™ XXXIII; Pro Player Stadium, Miami, Florida, USA; Jan. 31, 1999; 1 pg.
Super Bowl XXXIII Game Recap; http://www.nfl.com/superbowl/history/recap/sbxxxiii; 8 pgs.
ChoiceSeat™ User Guide; New York Knicks; The Garden Fanlink; 8 pgs.
ChoiceSeat™ User Guide; New York Rangers; The Garden Fanlink; 8 pgs.
ChoiceSeat™ Flow Chart; New York Knicks; The Garden Fanlink; 1 pg.
ChoiceSeat™ Presentation Document; The “Be There” Experience; 15 pgs.
In-Seat Interactive Advertising Device Debuts; http://www.williams.com/newsroom/news—releases; Nov. 19, 1999; 2 pgs.
Intel and ChoiceSeat™ collaborate to advance interactive sports technology; http://www.williams.com/newsroom/news—releases; Nov. 29, 1999; 3 pgs.
Media Coverage; ChoiceSeat The Interactive Evolution of Sports; Good News Travels Fast; 1 pg.
Screen Shot: ChoiceSeat The Interactive Evolution of Sports; 1 pg.
Digital Video; ChoiceSeat Coverage; www.dv.com; Apr. 2000; 11 pgs.
Wall Street Journal; With Wired Seats, Fans Get Replays, Rules, Snacks; May 21, 2000; 1 pg.
Wireless History; www.jhsph.edu/wireless/story; 5 pgs.
Wikipedia; Wireless LAN; 4 pgs.
Proposed ChoiceSeat Client Specification Summary; Initial Draft Aug. 29, 1997; Updated Sep. 30, 1997; 2 pgs.
Proposed ChoiceSeat Network Specification Summary; Initial Draft Aug. 25, 1997; 2 pgs.
Proposed ChoiceSeat Network Specification Summary; Updated Draft Sep. 30, 1997; 4 pgs.
Qualcomm Stadium ChoiceSeat Network Diagram; May 11, 1998; 5 pgs.
Schedule of Personal Property; Patents; Software and Trademarks etc Draft; 3 pgs.
PCT International Search Report dated Feb. 5, 2004; In re International Application No. PCT/US03/31696.
Written Opinion cited document in International Application No. PCT/US03/31696.
Office Action dated Aug. 10, 2007; U.S. Appl. No. 10/630,069, filed Jul. 30, 2003; Applicant: Tazwell L. Anderson, Jr.; 11 pages.
Office Action dated Aug. 23, 2007; U.S. Appl. No. 09/837,128, filed Apr. 18, 2001; Applicant: Tazwell L. Anderson, Jr.; 13 pages.
Office Action dated Sep. 7, 2007; U.S. Appl. No. 10/453,385, filed Jul. 30, 2003; Applicant: Tazwell L. Anderson, Jr.; 14 pages.
Wu, Dapeng; et al.; “On End-to-End Architecture for Transporting MPEG-4 Video Over the Internet”; IEEE Transactions, vol. 10, No. 6, Sep. 2000, 19 pgs.
Capin, Tolga K., Petajen, Eric and Ostermann, Joern; “Efficient Modeling of Virtual Humans in MPEG-4” IEEE 2000, 4 pgs.
Battista, Stefano; Casalino, Franco and Lande, Claudio; “MPEG-4: A Multimedia Standard for the Third Millennium, Part 1”; IEEE 1999, 10 pgs.
Wireless Dimensions Corporation Adds to Mobile-Venue Suite™; Press Release, Wireless Dimensions; Allen, Texas; Jul. 26, 2000; www.wirelessdimensions.net/news.html, 2 pgs.
Seeing is Believing—Motorola and PacketVideoDemonstrate MPEG4 Video Over GPRS; Publication: Business Wire Date: Wednesday, May 10, 2000; www.allbusiness.com; 4 pgs.
Adamson, W.A.; Antonelli, C.J.; Coffman, K.W.; McDaniel, P.; Rees, J.; Secure Distributed Virtual Conferencing Multicast or Bust; CITI Technical Report 99-1; Jan. 25, 1999; 8 pgs.
SGI and the Pepsi Center; 2 pgs.
Office Action dated Sep. 10, 2007; U.S. Appl. No. 10/680,612, filed Oct. 7, 2003; Applicant: Tazwell L. Anderson, Jr.; 19 pages.
Sony GV S50 Video Walkman Operating Instructions; 1993; pp. 3.
Spanberg, Erik; “Techies Hit the Fast Track”; The Business Journal; Charlotte: Jul. 30, 1999; vol. 14; Iss. 17; pp. 3.
Hiestand, Michael; Up Next: Rent Wireless Video Devices at Games: [Final Edition]; USA Today; McLean, VA; Jan. 29, 2002; pp. 2.
PR Newswire; “Baseball Fans to Get Best of Both Worlds: Seats in the Stadium and Up Close Camera Shots”; New York; Mar. 22, 2002; pp. 2.
Canadian Office Action for Application No. 2,598,644; dated Sep. 24, 2014, (4 pages).
Related Publications (1)
Number Date Country
20140218536 A1 Aug 2014 US
Provisional Applications (1)
Number Date Country
60123341 Mar 1999 US
Continuations (4)
Number Date Country
Parent 13554347 Jul 2012 US
Child 14245058 US
Parent 11932449 Oct 2007 US
Child 13554347 US
Parent 10453385 Jun 2003 US
Child 11932449 US
Parent 09322411 May 1999 US
Child 10453385 US