Field of the Invention
Certain embodiments of the present invention generally relate to video and audio signal processing techniques and, in particular, to a system and method for receiving video and audio signals from a plurality of sources and for providing a user with multiple combinations of these signals to select from. Certain embodiments of the present invention generally relate to an apparatus for processing video and/or audio signals and for displaying images and producing sounds based on the processed video and/or audio signals. Certain embodiments of the present invention generally relate to video and audio device programming, charging, and vending and, in particular, to a system and method for programming and charging one or more personal audio/video devices.
Related Art
Audio and video signals are generated from a plurality of sources during many events. For example, at an auto race, television crews usually position cameras at various locations within view of a racetrack. These cameras generate video signals defining views of the racetrack from various perspectives. In addition, microphones positioned at various locations generate audio signals defining different sounds at the auto race. For example, microphones may be located close to the race track to receive sounds produced by the vehicles participating in the race, and microphones may be located close to television commentators to receive the comments of the commentators as they observe and comment on the race. As another example, at a football game or other type of sporting event, television crews usually position cameras and microphones at various locations in the stadium.
One of the video signals and one or more of the audio signals are usually selected and combined together at a television station to form a combined video/audio signal. This signal is then modulated and transmitted so that users having a television can receive the combined signal via the television. The television demodulates the combined signal and displays an image defined by the video signal on a display screen and reproduces the sounds defined by the audio signals via speakers. Therefore, the sights and sounds of the race can be viewed and heard via the television.
In addition, one or more of the audio signals, such as audio signals defining the comments of radio commentators, are usually selected and modulated at a radio station to form a radio signal. This radio signal is then transmitted as a wireless signal so that users having radios can receive the signal via a radio. The radio demodulates the signal and reproduces the sounds defined by the radio signal via a speaker.
However, users viewing and/or hearing the sights and sounds of the race or game via televisions and/or radios are not usually given the opportunity to select which video and/or audio signals are modulated and transmitted to the television and/or radio. Therefore, the user is only able to receive the signals modulated and transmitted to the television and/or radio, even though the user may prefer to receive the other audio and/or video signals that are generated at the auto race or game.
Spectators who actually attend the sporting event are usually given more options to view and/or hear the sights and/or sounds of the sporting event from different perspectives. In this regard, a plurality of monitors are usually located at particular locations in the stadium. As used herein, “stadium” shall be defined to mean any non-movable structure having a large number (i.e., thousands) of seats, wherein an event occurs at (i.e., within a close proximity of) the seats such that spectators sitting in the seats can view the event. An “event” is any occurrence viewed by a spectator.
Each monitor within the stadium receives one of the aforementioned video signals and displays an image defined by the received video signal to many of the spectators. However, the monitor does not always display a desirable perspective with respect to each spectator in the stadium, and the monitor is often located in an inconvenient location for many of the spectators. In this regard, many of the spectators often must leave their seats (or other locations) in the stadium and go to a location where the spectators, along with other spectators, can view the monitor displaying the desired perspective. The spectators viewing the monitor often do not have control over which image is displayed by the monitor.
Thus, a heretofore-unaddressed need exists in the industry for providing a system and method that enables a spectator to conveniently view an event from different perspectives.
A way to address this need is with personal audio/video devices for use by spectators at an event or for use in association with an event. However, stadiums have varying audio and video frequencies available for use in connection with various events. Different stadiums in different geographical locations will also have different audio and video frequencies available for transmission in connection with the events.
Accordingly, there also exists a need to alter or program the audio and video frequencies used by audio/video devices to ensure that one or more audio/video devices are able to receive the proper audio and video frequencies at each stadium and event. Similarly, after each use, an audio/video device may need to be charged before its next use. Accordingly, a need exists for providing a system and method for charging one or more audio/video devices between uses.
The present invention overcomes the inadequacies and deficiencies of the prior art as discussed hereinbefore. In accordance with certain embodiments, a system and method are provided for providing a user with a plurality of audio and video signals defining different sounds and views associated with an event. The system includes a handheld device having a video receiver, a virtual image display device, and one or more speakers. The virtual image display device produces virtual visual images based on received video signals, and the speakers produce sounds based on the received audio signals. As a result, the user may hear the sounds produced by the speakers and may see the video images produced by the display device by holding the handheld device to the user's face, or the user may watch the event live by removing the handheld device from the user's face.
In accordance with another embodiment, the handheld device incorporates an integrated light shield/shroud to block ambient light that can interfere with the user's ability to view the virtual image. Unlike individual eye shrouds characteristic of a pair of optical binoculars, the present shroud shields both eyes at the same time. Among other advantages, the present shroud enables the user to operate the device while wearing eyeglasses or sunglasses.
In accordance with other embodiments, a system and method are provided for providing a user with a plurality of audio and video signals defining different views and sounds associated with an event while reducing the amount of external noise heard by the user. The system includes a display device, a head mount, noise reduction devices, and speakers. The display device is coupled to the head mount and produces visual images based on received video signals. The head mount is mounted on the user's head and is coupled to the noise reduction devices, which cover the user's ears such that external noise is reduced. The noise reduction devices are coupled together via a strap that fits around the user's head. The noise reduction devices are coupled to and house speakers that produce sound signals based on received audio signals. As a result, the user may see the video images produced by the display device and hear the sounds produced by the speakers, and the external noise heard by the user is reduced.
In accordance with another feature, the head mount has ridges formed thereon, and the noise reduction devices include notches. Once the noise reduction devices are properly positioned, the ridges are received by the notches, and the noise reduction device is, therefore, less likely to move with respect to the head mount.
In accordance with other embodiments, a system and method are provided for programming and/or charging one or more audio/video devices such that the audio/video device or devices will be properly programmed and charged to receive transmitted audio and video signals associated with an event, allowing a user to use the audio/video device to observe the sights and sounds of the event.
The system includes a cart with a securing mechanism for each of a plurality of personal audio/video devices, a charger configured to charge the power source of each personal audio/video display device, and programming logic configured to program each of the personal audio/video devices.
In accordance with another feature, the cart includes a control panel to allow the appropriate audio and video frequencies to be selected for programming the personal audio/video devices.
Other features and advantages of the present invention will become apparent to one skilled in the art upon examination of the following detailed description, when read in conjunction with the accompanying drawings. It is intended that all such features and advantages be included herein within the scope of the present invention and protected by the claims.
The invention can be better understood with reference to the following drawings. The elements of the drawings are not necessarily to scale relative to each other, emphasis instead being placed upon clearly illustrating the principles of the invention. Furthermore, like reference numerals designate corresponding parts throughout the several views.
The preferred embodiment of the present invention will be described hereafter in the context of auto racing applications. However, the scope of the present invention should not be so limited, and it should be apparent to one skilled in the art that the principles of the present invention may be employed in the context of other applications, particularly in the context of other sporting events (e.g., football games, basketball games, baseball games, hockey matches, etc.) and at various stadiums housing the sporting events.
Furthermore, each of the audio signals 25 defines different sounds associated with the race. For example, at least one of the audio signals 25 may be generated from a microphone located close to the track or in one of the vehicles such that the audio signal 25 defines noise from the vehicles participating in the race. Alternatively, at least one of the audio signals 25 may define the comments of television commentators, and at least one of the audio signals 25 may define the comments of radio commentators. Furthermore, at least one of the audio signals 25 may define the comments between one of the drivers participating in the race and the driver's pit crew.
Some of the video and audio signals 22 and 25 can be unmodulated when transmitted to the interface device 28 and, therefore, do not need to be demodulated by the system 20. However, some of the video and audio signals 22 and 25 may need to be demodulated by the system 20. For example, at least one of the audio signals 25 defining the comments of the radio commentators may be modulated as a radio signal for transmission to radios located at or away from the stadium, and at least one of the video signals 22 may be modulated as a television signal for transmission to televisions located at or away from the stadium. In addition, the comments between a driver and the driver's pit crew are usually transmitted via ultra high frequency (UHF) radio waves, which are known to be modulated signals. Therefore, as shown by
It is possible for some of the video and audio signals 22 and 25 to be received from a combined signal 35, which is comprised of at least one video signal 22 combined with at least one audio signal 25. For example, the combined signal 35 may be a television signal modulated for transmission to televisions located at or away from the track stadium. To facilitate the combination of different audio signals 25 with the video signal(s) 22 defined by the combined signal 35, a separator 37 preferably separates the combined signal 35 into its respective video signal 22 and audio signal 25, as shown by
Various configurations of the separator 37 may exist without departing from the principles of the present invention.
As an example, one of the combined signals 55 may include an audio signal 25 defining comments between a driver and the driver's pit crew and also an audio signal 25 defining sounds (i.e., vehicular noises) received by a microphone located in the driver's vehicle. Another of the combined signals 55 may include the aforementioned audio signals 25 as well as an audio signal 25 defining a radio commentator's comments. Another combined signal 55 may only include an audio signal 25 defining a television commentator's comments. Accordingly, the combined signals 55 preferably define different combinations of sounds. It should be noted that combinations of audio signals 25 other than those described hereinabove are possible.
As shown by
Various techniques exist for transmitting combined signal 71 to receivers 75. For example, a coaxial cable may be used to transmit the combined signal 71 to each of the receivers 75. In another example, the system 20 may include a wireless transmitter (not shown) that transmits the combined signal 71 to the receivers 75. Any technique for transmitting the combined signal 71 to the receivers 75 should be suitable for implementing the present invention.
A more detailed view of receiver 75 is shown by
Signals 84 are transmitted from demodulator 82 to a multiplexer 88, which also receives control signals 92 from a user interface 94. The user interface 94 preferably includes buttons or other types of switches that enable a spectator to select one of the signals 84 via control signals 92. In this regard, the multiplexer 88, through techniques well known in the art, selects one of the signals 84 based on control signals 92 and outputs the selected signal 84 as output signal 97, as shown by
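The selection performed by the multiplexer 88 can be sketched as a simple mapping from the user's control input to one of the available signals. The following is an illustrative sketch only; the names `select_signal`, `signals`, and `control_index` are hypothetical and do not appear in the specification.

```python
# Hypothetical sketch of the multiplexer 88: the user interface 94 supplies
# a control value (modeled here as an index), and the multiplexer outputs
# the corresponding signal 84 as the selected output signal 97.
def select_signal(signals, control_index):
    """Return the signal chosen via the user's control input.

    signals: list of demodulated combined signals (the signals 84)
    control_index: selection made through the user interface 94
    """
    if not 0 <= control_index < len(signals):
        raise ValueError("control input does not match an available signal")
    return signals[control_index]

# Example: three combined signals; the spectator selects the second one.
signals = ["in-car view + pit audio",
           "track view + radio commentary",
           "grandstand view + TV commentary"]
print(select_signal(signals, 1))  # -> track view + radio commentary
```

In practice the selection would operate on electrical or digital signal streams rather than strings; the sketch only illustrates that one of several inputs is routed to a single output under user control.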
The receiver 75 includes an audio signal filter 41 configured to filter the audio signal(s) 25 out of signal 97. Therefore, only the video signal(s) 22 within signal 97 are transmitted to a display device 101, which is configured to render the received video signal(s) 22 (i.e., display an image defined by the received video signal(s) 22) to the spectator.
The receiver 75 also includes a video signal filter 43 configured to filter the video signal(s) 22 out of signal 97. Therefore, only the audio signal(s) 25 within signal 97 are transmitted to a speaker 103, which is configured to produce sounds defined by the received audio signal(s) 25, through techniques well known in the art.
In an embodiment, the display device 101 and speaker 103 may be included within a head mounted display (HMD), which is discussed in further detail hereinbelow. By utilizing head mounted displays, the spectator's experience may be enhanced. For example, when a head mounted display is used to show an in-car view from a camera located in a driver's car during an auto race, the spectator sees a similar view as the driver of the car. Because the head mounted display limits the spectator's peripheral view of the environment around him, the user naturally focuses on the view provided by the head mounted display. Therefore, the user may feel almost as if he were riding in the car along with the driver, thereby enhancing the spectator's experience. The head mounted display may similarly enhance a spectator's experience at other events, such as other sporting events, for example.
Furthermore, when the combined signal 71 is transmitted via a coaxial cable, the receiver 75 may be located at a spectator's stadium seat or other convenient location. When the combined signal 71 is transmitted via a wireless transmitter, the receiver 75 is portable, and a spectator may carry the receiver 75 with him and choose where he would like to view the images and hear the sounds produced by the receiver 75.
Accordingly, the spectator may remain in his seat (or other convenient location) and control, by manipulating buttons or other types of switches in the user interface 94, which combination of video and audio signals 22 and 25 are respectively transmitted to display device 101 and speaker 103. Therefore, the system 20 gives the spectator more flexibility in how the spectator views the race and, as a result, makes the race a more enjoyable experience.
It should be noted that video signals 22 and audio signals 25 may be separately transmitted to receiver 75. For example, video signals 22 may be processed and transmitted to receiver 75 via interface device 28 or other type of device, and audio signals 25 may be transmitted to receiver 75 via another device. Through conventional techniques, the receiver 75 may then be configured to select the audio and video signals 25 and 22 to be transmitted to display device 101 and speaker 103.
Head Mounted Displays
Many different types of head mounted displays may be employed to implement the present invention. Examples of head mounted displays that may be used to implement the present invention are fully described in U.S. Pat. No. 5,844,656, entitled “Head Mounted Display with Adjustment Components” and filed on Nov. 7, 1996, by Ronzani et al., and U.S. Pat. No. 5,903,395, entitled “Personal Visual Display System,” and filed on Aug. 31, 1994, by Rallison et al., which are both incorporated herein by reference.
As depicted in
As can be seen in
A connection is provided for establishing communication or data transfer to the HMD 151, which, in the depicted embodiment, involves a cable 171 mounted along the underside of the left temple piece 154a. As an example, the demodulator 82 (
As can be seen by
In the depicted embodiment, a rocker switch 179 can be used to provide control of a parameter which varies through a range, such as the volume of the sound produced by the speakers 103a and 103b. Other items that could be controlled in this fashion include, but are not limited to, tint, hue or contrast of the video, selection of a video and/or audio source such as channel selection, image brightness, audio tone (i.e., treble/bass control) and the like. A slider switch 181 can be used, e.g., to select among discrete choices. For example, the slider switch 181 may be used to select left, right or no relative frame phasing, to select between stereo and non-stereoscopic views, etc. Other controls and/or indicators can also be used and can be mounted on various surfaces of the head-mounted apparatus of
Left speaker 103a is movably attached to the end of the temple piece 154a, e.g., by pivotal arm 185a which can be laterally adjusted to a mounting slot 188a in temple piece 154a. The speaker 103a can be held in position by friction or a detent tightener 189 can be used to secure the speaker 103a in the desired position. Right speaker 103b is similarly secured to temple piece 154b. Cables 191a and 191b are respectively used in the HMD 151 of
At many sporting events (e.g., auto races, in particular), relatively loud noises are produced. Therefore, it would be difficult for a user to hear the selected audio signals via many conventional head mounted displays, such as the one depicted by
Similar to HMD 151 of
Similar to U.S. Pat. No. 5,018,599, entitled “Headphone Device,” and filed on Sep. 18, 1989, by Masahiro et al., which is incorporated herein by reference, each noise reduction device 252a and 252b is respectively coupled to and houses speakers 103a and 103b. The speakers 103a and 103b are respectively coupled to cables 191a and 191b, and produce sound corresponding to the audio signals transmitted via cables 191a and 191b. Consequently, in use, external noises are attenuated, yet the spectator can clearly hear the selected audio signals produced by the speakers 103a and 103b.
Device 252a will be described in more detail hereafter. However, it should be apparent to one skilled in the art that device 252b includes the same features as device 252a except that device 252b is coupled to temple piece 259b (instead of piece 259a) and is designed to cover the spectator's opposite ear.
Referring to
As shown by
Therefore, the user can slide the device 252a in the x-direction along the length of the temple piece 259a causing the ridge 301 to be received by different notches 303 until the device 252a is properly positioned relative to the spectator's head (i.e., until the spectator's ear is comfortably positioned within the recess 267 of the device 252a). Once the spectator stops sliding the device 252a and the ridge 301 is received by one of the notches 303, the position of the device 252a relative to the temple piece 259a and, therefore, the spectator's head should remain constant until a force sufficient for deflecting the flexible portion is exerted on the HMD 250.
As shown by
In this regard, tightening the strap 156 reduces the circumference of the HMD 250 thereby pressing each device 252a and 252b and the forehead brace 161 further against the spectator's head. To a certain degree, as the strap 156 is tightened, external noise is better attenuated, and it is less likely that the HMD 250 will move with respect to the spectator's head. Accordingly, the spectator can tighten or loosen the strap 156 as desired until the desired fit and desired noise reduction is achieved.
It should be noted that it is possible to swap the position of ridge 301 with notches 303. In other words, it is possible to form ridge 301 on a flexible portion of device 252a and to form the notches 303 in the temple piece 259a without materially affecting the performance or operation of the HMD 250.
In particular, at least one of the audio and one of the video signals may be transmitted as a single combined signal from an audio/video system described in U.S. patent application Ser. No. 09/322,411 entitled “Video/Audio System and Method Enabling a User to Select Different Views and Sounds Associated with an Event.” Additionally, one or more of the video and/or audio signals may be wireless, in which case, the interface 318 may comprise an antenna for receiving the wireless signals. However, various other types of signal interfaces 318 are possible. For example, the signal interface 318 may be a cable or other type of signal transmission apparatus. Any type of wireless and/or non-wireless technique may be used to transmit signals to the video and audio receiver 316 via the signal interface 318.
Some of the audio and video signals 315 and 314 can be unmodulated when transmitted to the receiver 316 through the signal interface 318 and, therefore, do not need to be demodulated by the system 313. However, some of the audio signals 315 and/or video signals 314 may be modulated when received by the receiver 316 and, therefore, may need to be demodulated by the system 313. For example, at least one of the audio signals 315 defining the comments of the radio commentators may be modulated as a radio signal for transmission to radios located at or away from the stadium, and at least one of the video signals 314 may be modulated as a television signal for transmission to televisions located at or away from the stadium. Therefore, as shown by
Once demodulated, if necessary, the audio and video signals 315 and 314 are processed by signal processing logic 322, which selects and conditions the signals 315 and 314. More specifically, the signal processing logic 322 selects, based on inputs from the user, one of the audio signals 315 and one of the video signals 314. Note that the logic 322 may be implemented via hardware, software, or a combination thereof. Further, the logic 322 may include one or more filters for filtering out the unselected signals 315 and 314. After selecting one of the audio and video signals 315 and 314, the logic 322 conditions the selected video signals 314 so that they are compatible with the virtual image display system 330, and the logic 322 conditions the selected audio signals 315 so that they are compatible with the speakers 334. The logic 322 then transmits the conditioned audio signals 315 to the speakers 334, which convert the conditioned audio signals 315 into sound. The logic 322 also transmits the conditioned video signals 314 to the virtual image display system 330, which displays the image defined by the conditioned video signals 314 according to techniques known in the art. Note that the processing performed by the signal processing logic 322 may be similar to or identical to the processing performed by the system in U.S. patent application Ser. No. 09/322,411 entitled “Video/Audio System and Method Enabling a User to Select Different Views and Sounds Associated with an Event.”
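The select-then-condition flow of the signal processing logic 322 can be summarized in a brief sketch. The function name and the tagging used to model "conditioning" are illustrative assumptions, not part of the specification.

```python
# Hypothetical sketch of signal processing logic 322: select one audio and
# one video signal based on user input, then condition each for its output
# device (the speakers 334 and the virtual image display system 330).
def process_signals(audio_signals, video_signals, audio_choice, video_choice):
    # Selection; the unselected signals are effectively filtered out.
    selected_audio = audio_signals[audio_choice]
    selected_video = video_signals[video_choice]

    # "Conditioning" stands in for whatever format conversion each output
    # device requires; here it is modeled as simple tagging.
    conditioned_audio = ("for-speakers", selected_audio)
    conditioned_video = ("for-display", selected_video)
    return conditioned_audio, conditioned_video

audio_out, video_out = process_signals(
    ["pit crew dialogue", "radio commentary"],
    ["in-car view", "track view"],
    audio_choice=0, video_choice=1)
print(audio_out)  # -> ('for-speakers', 'pit crew dialogue')
print(video_out)  # -> ('for-display', 'track view')
```

As the specification notes, this logic may equally be realized in hardware or a hardware/software combination; the sketch only conveys the ordering of the selection and conditioning steps.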
An input device 324, which may comprise one or more buttons, knobs, dials, or other types of switches, may be used to provide the inputs for the processing performed by the processing logic 322. By controlling the components of the input device 324, the user may control various aspects of the processing performed by the logic 322, including which video signals 314 are selected for viewing, as well as which audio signals 315 are heard and the volume of the audio signals 315.
In the preferred embodiment, the receiver 316, signal processing logic 322, virtual image display system 330, and speakers 334 are all embodied within a handheld device 350, which is discussed in further detail herein below. Note that the handheld device 350 may be comprised of a housing unit or a casing coupled to each of the components shown in
Because the handheld device 350 limits the user's peripheral view of the environment around him, the user 344 naturally focuses on the view provided by the handheld device 350. When the user 344 desires to view the game directly, the user may quickly lower the device 350 so that the user's view of the game is not obstructed by the device 350. The handheld device 350 may similarly enhance a user's experience at other events, such as other sporting events, for example.
Furthermore, since the device 350 is handheld, the device 350 is easily portable, and the user 344 may carry the handheld device 350 with him and choose where he would like to view the images produced by the handheld device 350. Indeed, the user 344 may roam the stadium with the device 350 in hand while intermittently viewing the images and hearing the sounds produced by the system 313. Furthermore, by manipulating buttons or other types of switches 356 in the user input device 324, the user 344 may control which video signals 314 are displayed and which audio signals 315 are produced by the system 313. Accordingly, the handheld device 350 gives the user 344 more flexibility in how the user 344 observes and listens to the sporting event and, as a result, makes the event a more enjoyable experience.
Many different types of casings for the handheld device 350 may be employed to implement the present invention.
As depicted in
As depicted in
It should be noted that it is not necessary for the user 344 to keep the handheld device 350 within the stadium. In this regard, the audio and video signals 315 and 314 may be transmitted via satellites and/or communication networks to various locations around the world, and the user 344 may select the view he prefers the most from just about any location capable of receiving a video signal 314 and/or audio signal 315.
It should also be noted that the handheld device 350 may be retrieved from the user 344 after the user 344 is finished viewing the event so that the handheld device 350 can be provided to another spectator for another event at the stadium. Each user 344 may be charged a usage fee for the user's use of the handheld device 350. In some embodiments, payment of the fee may be required before the user 344 is provided with the device 350. In other embodiments, the device 350 may receive information, via signals 314 and/or 315 or otherwise, indicating whether the device 350 is authorized to produce sounds and images defined by the signals 314 and 315. In this embodiment, the device 350 is configured to produce such images and sounds only when authorized to do so, and such authorization should only be transmitted to the device 350 once the user 344 of the device 350 has provided payment.
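The authorization scheme described above can be sketched as a simple gating check: the device produces output only after an authorization, which is transmitted once payment has been provided. The class and method names below are illustrative assumptions, not part of the specification.

```python
# Hypothetical sketch of the authorization check in the handheld device 350:
# images and sounds defined by signals 314 and 315 are produced only when
# the device has been authorized to do so.
class HandheldDevice:
    def __init__(self):
        self.authorized = False

    def receive_authorization(self):
        # Per the description, authorization should be transmitted to the
        # device only once the user has provided payment.
        self.authorized = True

    def render(self, video_signal, audio_signal):
        """Return the (image, sound) output, or None while unauthorized."""
        if not self.authorized:
            return None  # device stays blank and silent until authorized
        return (video_signal, audio_signal)

device = HandheldDevice()
print(device.render("in-car view", "pit audio"))  # before payment
device.receive_authorization()
print(device.render("in-car view", "pit audio"))  # after authorization
```

The actual check could be carried out by the signal processing logic 322 or elsewhere in the device; the sketch only shows that rendering is conditioned on an authorization flag.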
In other embodiments, the storage bins 428 may be other shapes, including but not limited to circular or triangular. In the preferred embodiment, the vertical walls 430 and horizontal walls 432 defining the openings do not run the entire length of the front panel 422 and back panel 420, with separate sets of storage bins 428 on the left side 424 of the base 418 and the right side 426 of the base 418, rather than continuous storage bins 428 running the entire length of the front panel 422 and back panel 420.
At the top of the base 418 in
The left top panel 412 and right top panel 414 are hingedly connected to each other, and to the front storage wall 436 and rear storage wall 438, such that the left top panel 412 in the down position (
A lock 413 is provided for securing and/or locking the left top panel 412 to the base 418 when the left top panel 412 is in the down position, ensuring that the left top panel 412 does not open, for safety and security purposes. Similarly, a lock is provided for securing and/or locking the right top panel 414 to the base 418 when the right top panel 414 is in the down position.
The cart 410 depicted in
The cart 410 depicted in
As depicted in
In the preferred embodiment, the receiver pocket 454 includes a charge/program connector 458. The charge/program connector 458 receives power from the power source of the cart 410. The charge/program connector 458 is configured to engage the audio/video device 460 when the audio/video device 460 is seated in the receiver pocket 454, such that electrical current and/or information or digital data may be transmitted between the receiver pocket 454 and the audio/video device 460.
In the preferred embodiment, the base portion 456 also includes a charge indicator light 462. The charge indicator light 462 is configured to illuminate in a first color when the audio/video device 460 is connected to the charge/program connector 458, indicating that a proper connection has been made. The charge indicator light 462 is further configured to illuminate, in a second color when a proper connection has been made, and after the power source of the audio/video device 460 is fully charged.
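The two-state behavior of the charge indicator light 462 can be sketched as follows. The specification identifies only "a first color" and "a second color"; the function name and the state strings are illustrative assumptions.

```python
# Hypothetical sketch of the charge indicator light 462 logic: a first color
# indicates a proper connection to the charge/program connector 458, and a
# second color indicates that the device's power source is fully charged.
def indicator_state(connected, fully_charged):
    if not connected:
        return "off"
    return "second color" if fully_charged else "first color"

print(indicator_state(connected=True, fully_charged=False))  # -> first color
print(indicator_state(connected=True, fully_charged=True))   # -> second color
print(indicator_state(connected=False, fully_charged=False)) # -> off
```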
There are also a variety of ways contemplated to select the information to be input into the audio/video device 460. For example, the cart 410, in
The control panel 448 also includes an add to memory activator 474 and an erase from memory activator 476. The add to memory activator 474 may be activated to add a frequency selected on the channel selector 470 to the memory of one or more audio/video devices 460 contained within the charging/programming area 450 of the cart 410. The selector wheels 472 of the channel selector 470 are manipulated by the operator to display a desired frequency. Once the desired frequency is selected on the channel selector 470, the add to memory activator 474 is activated by the operator. Upon activation of the add to memory activator 474, the frequency selected on the channel selector 470 is programmed into the memory of each audio/video device 460 that is fitted into a docking port 452 when the add to memory activator 474 is activated. In different implementations, the memory of the audio/video devices 460 may include software, hardware, and/or firmware, and the programming of the memory may take place in a variety of manners that would be known to one of skill in the art.
Similarly, the erase from memory activator 476 may be activated to erase the frequency selected on the channel selector 470 from the memory of one or more audio/video devices 460 contained within the charging/programming area 450 of the cart 410. In the preferred embodiment, the selector wheels 472 of the channel selector 470 are manipulated by the operator to display a desired frequency. Once the desired frequency is selected on the channel selector 470, the erase from memory activator 476 is activated by the operator. Upon activation of the erase from memory activator 476, the frequency selected on the channel selector 470 is erased from the memory of each audio/video device 460 that is fitted into a docking port 452 when the erase from memory activator 476 is activated. In different implementations, the memory of the audio/video devices 460 may include software, hardware, and/or firmware, and the erasing of the memory may take place in a variety of manners that would be known to one of skill in the art.
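The add and erase operations described above share a common pattern: the frequency shown on the channel selector 470 is applied to every audio/video device 460 seated in a docking port 452 at the moment an activator is pressed. The sketch below illustrates this pattern; the class and function names are hypothetical and do not appear in the specification.

```python
# Hypothetical sketch of the cart's frequency-programming flow. Each docked
# audio/video device 460 holds a set of programmed frequencies in memory.
class DeviceMemory:
    def __init__(self):
        self.frequencies = set()

def program_docked_devices(docked_devices, selected_frequency, action):
    """Apply the channel selector 470 frequency to every docked device.

    action: "add" (add to memory activator 474) or
            "erase" (erase from memory activator 476)
    """
    for device in docked_devices:
        if action == "add":
            device.frequencies.add(selected_frequency)
        elif action == "erase":
            device.frequencies.discard(selected_frequency)

docked = [DeviceMemory(), DeviceMemory()]
program_docked_devices(docked, 451.25, "add")
print(sorted(docked[0].frequencies))  # -> [451.25]
program_docked_devices(docked, 451.25, "erase")
print(docked[1].frequencies)          # -> set()
```

Only devices docked when the activator is pressed are affected, matching the specification's description; devices added to the cart afterward would require a separate activation.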
In different embodiments, the activators may be buttons, switches or other activation devices. Similarly, in other embodiments, the channel selector 470 may be a digital pad with a display, allowing manual entry of frequencies and other information by an operator through the digital pad. In yet other embodiments, the control panel 448 could include a receiving mechanism (not shown) allowing information to be transmitted to the cart 410 from a remote device, including an infra-red or other wireless device, rather than manual entry of the information on the control panel 448 itself by the operator.
Operation
An exemplary use and operation of the video/audio system and associated methodology are described hereafter.
Assume for illustrative purposes that a spectator would like to attend an auto race and would like to have access to an in-car view from a camera within his favorite driver's car. In addition, the spectator would also like to continuously hear the dialogue between the aforementioned driver and the driver's pit crew, as well as the comments provided by his favorite radio commentator. It should be apparent that other views and/or sounds may be desirable in other examples.
In the past, the spectator would attend the race and acquire (as well as tune) a radio to receive the commentator's comments and a radio to receive the radio signals transmitted between the driver and the driver's pit crew. Then, the spectator would locate a monitor at the stadium displaying the in-car view that he desires to see, assuming that such a monitor is provided. The spectator would then remain within sight of the monitor and listen to the two radios. If the monitor is not located in a desirable location for viewing the race, the spectator would have to choose between viewing the monitor and viewing the race at a desirable location. Furthermore, the handling of multiple radios is generally cumbersome and distracting.
When the user attends the race, the user is provided a receiver 75 for his individual use. In the preferred embodiment, the receiver 75 is located at the spectator's seat within the stadium. However, the receiver 75 may be located at other convenient locations, and when the combined signal 71 is transmitted via a wireless transmitter, the spectator may carry the receiver 75 with him to any desirable location in or around the stadium.
The receiver preferably includes the HMD 250 depicted by
In this regard, the interface device 28 preferably receives at least a video signal 22 defining the in-car view of his favorite driver and a plurality of audio signals 25 defining the dialogue between his favorite driver and the driver's pit crew, as well as the comments from his favorite radio commentator. At least one of the audio combiners 52 combines these audio signals 25 into a combined signal 55. One of the signal modulators 61 receives this combined signal 55 and the video signal 22 defining the desired in-car view. This video signal 22 is modulated and combined with the foregoing combined signal 55 by one of the signal modulators 61 to create a modulated signal 64. This modulated signal 64 is combined with other modulated signals 64 and transmitted to the spectator's receiver 75 via combiner 67.
The demodulator 82 in the spectator's receiver 75 demodulates and separates the received signal 71 into separate signals 84. Based on the control signals 92 received from user interface 94, the multiplexer 88 allows only the signal 84 defined by the aforementioned video and audio signals 22 and 25 to pass. Therefore, these video and audio signals 22 and 25 are respectively transmitted to the display device 101 and speakers 103a and 103b and the spectator may enjoy the view and sounds that he selected.
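The receive-side selection just described amounts to a simple lookup: the demodulator separates the received signal into per-channel signals, and the multiplexer passes only the one named by the control signals from the user interface. The sketch below illustrates this; the channel identifiers and signal placeholders are assumptions, not part of the specification.

```python
def select_signal(separated_signals, control_signal):
    """Pass only the signal selected via the user interface (control signals 92).

    separated_signals: mapping from a channel id to its (video, audio) pair,
    standing in for the separate signals 84 produced by demodulator 82.
    control_signal: the channel id chosen by the spectator.
    """
    return separated_signals[control_signal]


# Illustrative channels: an in-car view paired with driver/pit-crew audio.
signals = {
    "in_car_view": ("video signal 22", "combined audio 55"),
    "pit_lane": ("pit-lane video", "pit-lane audio"),
}
video, audio = select_signal(signals, "in_car_view")
# `video` would drive the display device 101; `audio` the speakers 103a and 103b.
```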
It should be noted that it is not necessary for the spectator to keep the receiver 75 within a stadium. In this regard, the signal 71 may be transmitted via satellites and/or communication networks to various locations around the world, and the spectator may select the view and sounds he prefers the most from just about any location capable of receiving signal 71.
It should also be noted that the receiver 75 may be retrieved from the spectator after the spectator is finished viewing the event so that the receiver can be provided to another spectator for another event at the stadium. Each spectator is preferably charged a usage fee for the spectator's use of the receiver 75. It should be noted that a portion of the receiver 75 may be installed at the spectator's seat such that the user only needs to retrieve the HMD 151 and/or other components of the receiver 75 during the event and return the retrieved components after the event. Furthermore, the entire receiver 75 may be installed at the spectator's seat such that the spectator only needs to pay for the use of the receiver.
In addition, it may be desirable for one of the audio signals 25 to have a higher amplitude than the other audio signals 25. For example, a spectator may desire to hear comments from a radio commentator unless a communication between his favorite driver and the driver's pit crew occurs. When a communication between the driver and the driver's crew occurs, the spectator would rather listen to this communication instead of the radio commentator's comments.
Accordingly, the audio combiner 52 that combines a first audio signal 25 defining the radio commentator's comments with a second audio signal 25 defining the communications between the driver and the driver's pit crew preferably increases the amplitude of the second audio signal 25 relative to the first audio signal 25. This may be accomplished by increasing the amplitude of the second audio signal 25 with an amplifier or by attenuating the amplitude of the first audio signal 25 with an attenuator. Therefore, when the combined signal 55 produced by the aforementioned audio combiner 52 is ultimately received by the spectator's receiver 75, which produces sound based on this combined signal 55, the user hears the radio commentator's comments when there is no communication between the driver and the driver's crew. However, when there is a communication between the driver and the driver's crew, this communication is louder than the radio commentator's comments. Accordingly, the spectator can clearly hear the communications between the driver and the driver's crew even though the spectator's ability to clearly hear the radio commentator's comments is impaired. It should be noted that the foregoing techniques for increasing the amplitude of one audio signal 25 relative to others may be employed for different combinations of audio signals 25 and are not limited to the exemplary combination described above.
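The amplitude-prioritization technique above can be sketched as a weighted sum of two sample streams. The sample values and the priority gain below are assumptions chosen only for illustration.

```python
def combine_with_priority(commentary, pit_crew, priority_gain=4.0):
    """Sum two equal-length sample streams, boosting the pit-crew signal.

    Boosting one input (or, equivalently, attenuating the other) makes the
    priority signal dominate the combined signal 55 whenever it is active.
    """
    return [c + priority_gain * p for c, p in zip(commentary, pit_crew)]


commentary = [0.2, 0.2, 0.2, 0.2]  # steady commentator audio samples
pit_crew = [0.0, 0.0, 0.5, 0.5]    # pit-crew radio keys up mid-stream
mixed = combine_with_priority(commentary, pit_crew)
# While the pit crew is silent, the mix is just the commentary; once the
# crew transmits, its boosted amplitude dominates the combined output.
```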
Furthermore, it should also be noted that the system has been described herein in the context of auto racing. However, the system 20 may be useful in other applications as well. The system 20 would be useful in any application where it is desirable for the user to control the types of views and sounds of an event that are presented to the user. For example, the present invention could be particularly useful in any type of sporting event or other type of event attended by a large number of people.
The system is also capable of storing, vending, programming and/or charging audio/video devices 460. In an embodiment, the system programs and/or charges personal audio/video devices 460 for use in association with events at a stadium as previously disclosed in copending non-provisional U.S. patent application Ser. Nos. 09/322,411, 09/386,613, and 09/837,128, which have been incorporated herein by reference. The personal audio/video devices 460 may be stored in the charging/programming area 450 of the cart 410 when the personal audio/video devices 460 are not being used, with the left top panel 412 (
The steerable caster wheels 444 and tow bar 446 allow the cart 410 or a plurality of carts 410 to be easily transported to different stadiums, including stadiums in different geographic locations, or to different events at the same stadium. The hingedly connected top panels 412 and 414 allow the personal audio/video devices to be displayed and/or vended to potential users at a stadium or event, if desired, by unlocking and placing the left top panel 412 and/or right top panel 414 into the open position. Similarly, the personal audio/video devices 460 may be collected from users at the conclusion of an event and stored within the cart 410 until the personal audio/video devices 460 are vended or provided to users at the next stadium or event.
In the preferred embodiment, the cart 410 further allows programming the memory of and/or charging of the power source of one or more of a plurality of personal audio/video devices 460 when each personal audio/video device 460 is placed in a receiver pocket 454 in the charging/programming area 450 of the cart 410. As depicted in
When placed in the receiver pocket 454, the personal audio/video device 460 engages the charge/program connector 458 contained within the preferred docking port 452, establishing a connection. The charge/program connector 458 allows electric current to flow between the cart 410 and the personal audio/video device 460, charging the power source of the personal audio/video device 460. Additionally, the charge/program connector 458 in the preferred embodiment is controlled by logic allowing communication of information and/or data between the cart 410 and the personal audio/video devices 460. In the preferred embodiment, the logic is contained on a charge/program printed circuit board (“PCB”) 464.
When one or more personal audio/video devices 460 are placed in the receiver pockets 454 and engage the charge/program connectors 458, the power source of each such personal audio/video device 460 may be charged through the charge/program connector 458 while the personal audio/video devices 460 are being stored in the charging/programming area 450 of the cart 410. Each docking port 452 contains a charge indicator light 462 to indicate when the personal audio/video device 460 is properly seated in the docking port 452, such that the personal audio/video device 460 is engaged with the charge/program connector 458.
The charge indicator light 462 in the preferred embodiment is an LED light which illuminates a first color when the personal audio/video device 460 is properly seated in the docking port 452, and the power source of the personal audio/video device 460 is being charged through the charge/program connector 458. In the preferred embodiment, the charge/program PCB 464 contains logic to detect when the power source of a personal audio/video device 460 is fully charged. When the power source of a personal audio/video device 460 is fully charged, the charge indicator light 462 for the docking port 452 containing the fully charged personal audio/video device 460 illuminates a second color indicating a proper connection, and that the personal audio/video device 460 is fully charged.
Further, the charge/program PCB 464 contains logic to enable only some of a plurality of personal audio/video devices 460 to be charged if desired. In this embodiment, less than all of the plurality of personal audio/video devices 460 contained in the charging/programming area 450 of the cart 410 may be charged, and any combination of the plurality of personal audio/video devices 460 may be selected for charging if desired, with the remaining unselected personal audio/video devices 460 not being charged through the charge/program connector 458.
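The charge-status and selective-charging logic described above can be sketched as follows. The color names and the simple state model are assumptions; the specification states only that a first color indicates charging and a second color indicates a full charge.

```python
def charge_indicator_color(seated, fully_charged):
    """Return the state of a docking port's charge indicator light 462."""
    if not seated:
        return "off"  # device not engaged with the charge/program connector 458
    return "green" if fully_charged else "red"


def devices_to_charge(docked_ids, selected_ids):
    """Charge only the operator-selected subset of the docked devices."""
    return [device for device in docked_ids if device in selected_ids]


# A seated device that is still charging shows the first color; a fully
# charged device shows the second color.
charging_color = charge_indicator_color(seated=True, fully_charged=False)
full_color = charge_indicator_color(seated=True, fully_charged=True)
selected = devices_to_charge(["unit-1", "unit-2", "unit-3"], {"unit-1", "unit-3"})
```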
Additionally, the logic contained in the charge/program PCB 464 allows the memory of the personal audio/video devices 460 to be “programmed” with various desired information.
In one embodiment, the information “programmed” into the memory of the personal audio/video devices 460 will include assigning specific audio frequencies and video frequencies for each selectable channel of the personal audio/video devices 460, such as that discussed above in relation to the add to memory activator 474 and erase from memory activator 476. In this embodiment, the specific audio frequencies and video frequencies assigned to the selectable channels of the personal audio/video devices 460 will correspond to the audio frequencies and video frequencies available for use at the next stadium or event at which the personal audio/video devices 460 will be used. In this embodiment, the “programming” could further include erasing or deleting from the memory of the personal audio/video devices 460 the audio frequencies and video frequencies used by the personal audio/video devices 460 at the previous stadium or event.
In other embodiments, the “programming” may include upgrades, updates, alterations, or modifications to the software or firmware contained in one or more of the personal audio/video devices 460 and/or in the memory of one or more of the personal audio/video devices 460 placed in the charging/programming area 450 of the cart 410. As an example, and in no way intended to limit the present invention, the personal audio/video devices 460 may include instructions contained in software, firmware, and/or hardware of the audio/video devices 460 to enable the personal audio/video devices 460 to operate. These operating instructions may include software code stored in the memory of the audio/video devices 460. The “programming” in this embodiment will include transferring new software code and/or new portions of software code into the memory of the audio/video devices 460 to upgrade the software code in the memory of the audio/video devices 460, enhancing performance. This upgrading may be performed in a variety of manners that would be known to one of ordinary skill in the art.
As with the charging, specific personal audio/video devices 460 placed or stored in the charging/programming area 450 of the cart 410 may be selected to receive “programming” information or data, while other personal audio/video devices 460 are not “programmed.” Similarly, one or any number of personal audio/video devices 460 may be selected to receive a first set of “programming” information or data, while a second number of personal audio/video devices 460 may be selected to receive a second and different set of “programming” information or data.
By way of example, and in no way intended to limit the present invention, a first desired number of the personal audio/video devices 460 contained in the charging/programming area 450 of a cart 410 may be selected to receive a first set of audio frequencies and video frequencies, while a second desired number of the personal audio/video devices 460 contained in the charging/programming area 450 of the same cart 410 may be selected to receive a second and different set of audio frequencies and video frequencies, and a third desired number of the personal audio/video devices 460 contained in the charging/programming area 450 of the same cart 410 may have all audio frequencies and video frequencies erased from memory.
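The grouped programming in the example above can be sketched as mapping each device to the frequency set chosen for its group, with an empty set modeling erasure. All identifiers and frequency values below are illustrative assumptions.

```python
def program_groups(group_assignments, frequency_sets):
    """Return each device's new channel memory based on its group.

    group_assignments: {device_id: group_name}
    frequency_sets: {group_name: frequencies}; an empty set models erasure.
    """
    return {device: set(frequency_sets[group])
            for device, group in group_assignments.items()}


assignments = {"unit-1": "set_a", "unit-2": "set_b", "unit-3": "erase"}
frequency_sets = {"set_a": {482.0, 488.0}, "set_b": {494.0}, "erase": set()}
memories = program_groups(assignments, frequency_sets)
# unit-1 receives the first frequency set, unit-2 the second, and unit-3
# has all frequencies erased from memory.
```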
The information or data to be “programmed” into one or more of the personal audio/video devices 460 may be communicated to the cart 410 by the operator in a variety of ways, including manually on a control panel 448 located on the cart 410 (
In a preferred embodiment depicted in
In this embodiment, the control panel 448 further includes an add to memory activator 474 and an erase from memory activator 476, which may be buttons, switches or other activators. By selecting a value on the channel selector 470 and activating one of the activators 474 and 476, the value on the channel selector 470 may be “programmed” into, or erased from, the memory of one or more of the personal audio/video devices 460. Additionally, in other embodiments, the control panel 448 may include a channel selector 470, which includes a keypad with a display (not shown).
In other embodiments, the control panel may include a port, connector, or wireless receiver allowing an operator to use a remote device to communicate to the cart 410 the desired information or data to be “programmed” into one or more of the personal audio/video devices 460. Similarly, in some embodiments, the cart 410 may not have a control panel 448 at all, but instead just a port, connector, or wireless receiver allowing a remote device to communicate to the cart 410 the desired information or data to be “programmed” into one or more of the personal audio/video devices 460.
It should also be noted that the present invention has been described herein in the context of auto racing. However, the system may be useful in other applications as well. The cart 410 would be useful in any application where it is desirable for the user to control the types of views and sounds of an event that are presented to the user via personal audio/video devices 460. For example, the system could be particularly useful in any type of sporting event or other type of event attended by a large number of people.
It should be emphasized that the above-described embodiments of the present invention, particularly, any “preferred” embodiments, are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiment(s) of the invention without departing substantially from the spirit and principles of the invention. All such modifications and variations are intended to be included herein within the scope of the present invention and protected by the claims.
The present application is a continuation application of application Ser. No. 13/591,691, filed Aug. 22, 2012, which is a continuation application of application Ser. No. 12/969,139, filed Dec. 15, 2010, now U.S. Pat. No. 8,253,865, which is a continuation of application Ser. No. 11/702,716, filed Feb. 5, 2007, now U.S. Pat. No. 7,859,597, which is a divisional application of application Ser. No. 10/159,666, filed May 30, 2002, now U.S. Pat. No. 7,210,160, which was a continuation-in-part of and claims priority to non-provisional U.S. patent application entitled “Electronic Handheld Audio/Video Receiver And Listening/Viewing Device,” assigned Ser. No. 09/837,128, filed Apr. 18, 2001, now abandoned, the complete and full subject matter of which are all expressly incorporated herein by reference in their entireties. The present application is also a continuation application of application Ser. No. 13/478,756, filed May 23, 2012, which is a continuation of U.S. patent application Ser. No. 10/630,069, filed Jul. 30, 2003, now abandoned, which is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 09/837,128, filed Apr. 18, 2001, now abandoned, for “Electronic Handheld Audio/Video Receiver and Listening/Viewing Device,” which is hereby incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4472830 | Nagai | Sep 1984 | A |
4479150 | Ilmer et al. | Oct 1984 | A |
4486897 | Nagai | Dec 1984 | A |
4504861 | Dougherty | Mar 1985 | A |
4572323 | Randall | Feb 1986 | A |
4580174 | Tokunaka | Apr 1986 | A |
4605950 | Goldberg et al. | Aug 1986 | A |
4615050 | Lonnstedt | Oct 1986 | A |
4620068 | Wieder | Oct 1986 | A |
4665438 | Miron et al. | May 1987 | A |
4727585 | Flygstad | Feb 1988 | A |
4764817 | Blazek et al. | Aug 1988 | A |
4802243 | Griffiths | Feb 1989 | A |
4809079 | Blazek et al. | Feb 1989 | A |
4855827 | Best | Aug 1989 | A |
4856118 | Sapiejewski | Aug 1989 | A |
4864425 | Blazek et al. | Sep 1989 | A |
4866515 | Tagawa et al. | Sep 1989 | A |
4887152 | Matsuzaki et al. | Dec 1989 | A |
4965825 | Harvey et al. | Oct 1990 | A |
4982278 | Dahl et al. | Jan 1991 | A |
5023707 | Briggs | Jun 1991 | A |
5023955 | Murphy, II et al. | Jun 1991 | A |
5109414 | Harvey et al. | Apr 1992 | A |
5119442 | Brown | Jun 1992 | A |
5128765 | Dingwall et al. | Jul 1992 | A |
5133081 | Mayo | Jul 1992 | A |
5138440 | Radice | Aug 1992 | A |
5138722 | Urella et al. | Aug 1992 | A |
5173721 | Green | Dec 1992 | A |
5179736 | Scanlon | Jan 1993 | A |
5189630 | Barstow et al. | Feb 1993 | A |
5237648 | Mills et al. | Aug 1993 | A |
5243415 | Vance | Sep 1993 | A |
5252069 | Lamb et al. | Oct 1993 | A |
5289272 | Rabowsky et al. | Feb 1994 | A |
5289288 | Silverman et al. | Feb 1994 | A |
5297037 | Ifuku | Mar 1994 | A |
5321416 | Bassett et al. | Jun 1994 | A |
5359463 | Shirochi et al. | Oct 1994 | A |
5392158 | Tosaki | Feb 1995 | A |
5408686 | Mankovitz | Apr 1995 | A |
5414544 | Aoyagi et al. | May 1995 | A |
5440197 | Gleckman | Aug 1995 | A |
5448291 | Wickline | Sep 1995 | A |
5463428 | Lipton et al. | Oct 1995 | A |
5481478 | Palmieri et al. | Jan 1996 | A |
5485504 | Ohnsorge | Jan 1996 | A |
5506705 | Yamamoto et al. | Apr 1996 | A |
5510828 | Lutterbach | Apr 1996 | A |
5513384 | Brennan et al. | Apr 1996 | A |
5524195 | Clanton et al. | Jun 1996 | A |
5546099 | Quint et al. | Aug 1996 | A |
5585850 | Schwaller | Dec 1996 | A |
5585858 | Harper et al. | Dec 1996 | A |
5594551 | Monta | Jan 1997 | A |
5598208 | McClintock | Jan 1997 | A |
5600365 | Kondo et al. | Feb 1997 | A |
5600368 | Matthews, III | Feb 1997 | A |
5613191 | Hylton et al. | Mar 1997 | A |
5617331 | Wakai | Apr 1997 | A |
5627915 | Rosser et al. | May 1997 | A |
5631693 | Wunderlich et al. | May 1997 | A |
5642221 | Fischer et al. | Jun 1997 | A |
5663717 | DeLuca | Sep 1997 | A |
5668339 | Shin | Sep 1997 | A |
5671320 | Cookson et al. | Sep 1997 | A |
5682172 | Travers et al. | Oct 1997 | A |
5696521 | Robinson et al. | Dec 1997 | A |
5708961 | Hylton et al. | Jan 1998 | A |
5712950 | Cookson et al. | Jan 1998 | A |
5719588 | Johnson | Feb 1998 | A |
5729471 | Jain et al. | Mar 1998 | A |
5729549 | Kostreski et al. | Mar 1998 | A |
5742263 | Wang et al. | Apr 1998 | A |
5742521 | Ellenby | Apr 1998 | A |
5754254 | Kobayashi et al. | May 1998 | A |
5760819 | Sklar et al. | Jun 1998 | A |
5760824 | Hicks, III | Jun 1998 | A |
5767820 | Bassett et al. | Jun 1998 | A |
5793416 | Rostoker et al. | Aug 1998 | A |
5806005 | Hull et al. | Sep 1998 | A |
5808695 | Rosser et al. | Sep 1998 | A |
5812224 | Maeda et al. | Sep 1998 | A |
5815126 | Fan et al. | Sep 1998 | A |
5841122 | Kirchhoff | Nov 1998 | A |
5844656 | Ronzani et al. | Dec 1998 | A |
5847612 | Birleson | Dec 1998 | A |
5847762 | Canfield et al. | Dec 1998 | A |
5867223 | Schindler et al. | Feb 1999 | A |
5867579 | Saito | Feb 1999 | A |
5878324 | Borth et al. | Mar 1999 | A |
5880773 | Suzuki | Mar 1999 | A |
5892554 | DiCicco et al. | Apr 1999 | A |
5894320 | Vancelette | Apr 1999 | A |
5900849 | Gallery | May 1999 | A |
5903395 | Rallison et al. | May 1999 | A |
5920827 | Baer et al. | Jul 1999 | A |
5946635 | Dominguez et al. | Aug 1999 | A |
D413881 | Ida et al. | Sep 1999 | S |
5953076 | Astle et al. | Sep 1999 | A |
5982445 | Eyer et al. | Nov 1999 | A |
5986803 | Kelly | Nov 1999 | A |
5990958 | Bheda et al. | Nov 1999 | A |
5999808 | LaDue | Dec 1999 | A |
6002720 | Yurt et al. | Dec 1999 | A |
6002995 | Suzuki et al. | Dec 1999 | A |
6009336 | Harris et al. | Dec 1999 | A |
6016348 | Blatter et al. | Jan 2000 | A |
6020851 | Busack | Feb 2000 | A |
6034716 | Whiting et al. | Mar 2000 | A |
6035349 | Ha et al. | Mar 2000 | A |
6043837 | Driscoll, Jr. et al. | Mar 2000 | A |
6052239 | Matsui et al. | Apr 2000 | A |
6060995 | Wicks et al. | May 2000 | A |
6064860 | Ogden | May 2000 | A |
6069668 | Woodham, Jr. et al. | May 2000 | A |
D426527 | Sakaguchi | Jun 2000 | S |
6078954 | Lakey et al. | Jun 2000 | A |
6080063 | Khosla | Jun 2000 | A |
6084584 | Nahi et al. | Jul 2000 | A |
6088045 | Lumelsky et al. | Jul 2000 | A |
6095423 | Houdeau et al. | Aug 2000 | A |
6097441 | Allport | Aug 2000 | A |
6097967 | Hubbe et al. | Aug 2000 | A |
6100925 | Rosser et al. | Aug 2000 | A |
6104414 | Odryna et al. | Aug 2000 | A |
6112074 | Pinder | Aug 2000 | A |
6121966 | Teodosio et al. | Sep 2000 | A |
6124862 | Boyken et al. | Sep 2000 | A |
6125259 | Perlman | Sep 2000 | A |
6128143 | Nalwa | Oct 2000 | A |
6131025 | Riley et al. | Oct 2000 | A |
6133946 | Cavallaro et al. | Oct 2000 | A |
6137525 | Lee et al. | Oct 2000 | A |
6144375 | Jain et al. | Nov 2000 | A |
6166734 | Nahi et al. | Dec 2000 | A |
6192257 | Ray | Feb 2001 | B1 |
6195090 | Riggins et al. | Feb 2001 | B1 |
6209028 | Walker et al. | Mar 2001 | B1 |
6215475 | Meyerson et al. | Apr 2001 | B1 |
6327570 | Stevens | Dec 2001 | B1 |
6330021 | Devaux | Dec 2001 | B1 |
6347301 | Bearden, III et al. | Feb 2002 | B1 |
6351252 | Atsumi et al. | Feb 2002 | B1 |
6356905 | Gershman et al. | Mar 2002 | B1 |
6380978 | Adams et al. | Apr 2002 | B1 |
6401085 | Gershman et al. | Jun 2002 | B1 |
6417853 | Squires et al. | Jul 2002 | B1 |
6421031 | Ronzani et al. | Jul 2002 | B1 |
6424369 | Adair et al. | Jul 2002 | B1 |
6434403 | Ausems et al. | Aug 2002 | B1 |
6434530 | Sloane et al. | Aug 2002 | B1 |
6463299 | Macor | Oct 2002 | B1 |
6466202 | Suso et al. | Oct 2002 | B1 |
6505055 | Kahn et al. | Jan 2003 | B1 |
6522352 | Strandwitz et al. | Feb 2003 | B1 |
6525762 | Mileski et al. | Feb 2003 | B1 |
6526580 | Shimomura | Feb 2003 | B2 |
6532152 | White et al. | Mar 2003 | B1 |
6535254 | Olsson et al. | Mar 2003 | B1 |
6535493 | Lee et al. | Mar 2003 | B1 |
6549229 | Kirby et al. | Apr 2003 | B1 |
6564070 | Nagamine et al. | May 2003 | B1 |
6567079 | Smailagic et al. | May 2003 | B1 |
6570889 | Stirling-Gallacher et al. | May 2003 | B1 |
6574672 | Mitchell et al. | Jun 2003 | B1 |
6578203 | Anderson, Jr. | Jun 2003 | B1 |
6597346 | Havey et al. | Jul 2003 | B1 |
6624846 | Lassiter | Sep 2003 | B1 |
6669346 | Metcalf | Dec 2003 | B2 |
6681398 | Verna | Jan 2004 | B1 |
6745048 | Vargas et al. | Jun 2004 | B2 |
6781635 | Takeda | Aug 2004 | B1 |
6782238 | Burg et al. | Aug 2004 | B2 |
6785814 | Usami et al. | Aug 2004 | B1 |
6850777 | Estes et al. | Feb 2005 | B1 |
6912517 | Agnihotri et al. | Jun 2005 | B2 |
6931290 | Forest | Aug 2005 | B2 |
6934510 | Katayama | Aug 2005 | B2 |
6952558 | Hardacker | Oct 2005 | B2 |
6961430 | Gaske et al. | Nov 2005 | B1 |
7006164 | Morris | Feb 2006 | B1 |
7124425 | Anderson, Jr. | Oct 2006 | B1 |
7149549 | Ortiz | Dec 2006 | B1 |
7210160 | Anderson, Jr. | Apr 2007 | B2 |
7227952 | Qawami et al. | Jun 2007 | B2 |
7268810 | Yoshida | Sep 2007 | B2 |
7448063 | Freeman | Nov 2008 | B2 |
7859597 | Anderson et al. | Dec 2010 | B2 |
8732781 | Anderson et al. | May 2014 | B2 |
20010016486 | Ko | Aug 2001 | A1 |
20010030612 | Kerber et al. | Oct 2001 | A1 |
20010034734 | Whitley | Oct 2001 | A1 |
20010039180 | Sibley et al. | Nov 2001 | A1 |
20010039663 | Sibley | Nov 2001 | A1 |
20010042105 | Koehler et al. | Nov 2001 | A1 |
20010047516 | Swain et al. | Nov 2001 | A1 |
20020007490 | Jeffery | Jan 2002 | A1 |
20020014275 | Blatt et al. | Feb 2002 | A1 |
20020046405 | Lahr | Apr 2002 | A1 |
20020052965 | Dowling | May 2002 | A1 |
20020057365 | Brown | May 2002 | A1 |
20020063799 | Ortiz | May 2002 | A1 |
20020069416 | Stiles | Jun 2002 | A1 |
20020069419 | Raverdy et al. | Jun 2002 | A1 |
20020090217 | Limor et al. | Jul 2002 | A1 |
20020091723 | Traner et al. | Jul 2002 | A1 |
20020095682 | Ledbetter | Jul 2002 | A1 |
20020104092 | Arai et al. | Aug 2002 | A1 |
20020108125 | Joao | Aug 2002 | A1 |
20020115454 | Hardacker | Aug 2002 | A1 |
20020130967 | Sweetser | Sep 2002 | A1 |
20020138582 | Chandra et al. | Sep 2002 | A1 |
20020138587 | Koehler | Sep 2002 | A1 |
20020152476 | Anderson et al. | Oct 2002 | A1 |
20030004793 | Feuer et al. | Jan 2003 | A1 |
20030005052 | Feuer et al. | Jan 2003 | A1 |
20030005437 | Feuer et al. | Jan 2003 | A1 |
20030005457 | Faibish | Jan 2003 | A1 |
20030014275 | Bearden, III et al. | Jan 2003 | A1 |
20030023974 | Dagtas et al. | Jan 2003 | A1 |
20030204630 | Ng | Oct 2003 | A1 |
20040034617 | Kaku | Feb 2004 | A1 |
20040073437 | Halgas et al. | Apr 2004 | A1 |
20040073915 | Dureau | Apr 2004 | A1 |
20040203630 | Wang | Oct 2004 | A1 |
20040207719 | Tervo et al. | Oct 2004 | A1 |
20040243922 | Sirota et al. | Dec 2004 | A1 |
20050076387 | Feldmeier | Apr 2005 | A1 |
20060174297 | Anderson et al. | Aug 2006 | A1 |
20070060200 | Boris et al. | Mar 2007 | A1 |
20070107028 | Monroe | May 2007 | A1 |
20070207798 | Talozi et al. | Sep 2007 | A1 |
20070256107 | Anderson et al. | Nov 2007 | A1 |
Number | Date | Country |
---|---|---|
1241860 | Apr 1999 | EP |
2372892 | Sep 2002 | GB |
10136277 | May 1998 | JP |
20010275101 | Oct 2001 | JP |
9411855 | May 1994 | WO |
WO 9966670 | Dec 1999 | WO |
WO 0054554 | Sep 2000 | WO |
03001772 | Jan 2003 | WO |
WO-2004002130 | Dec 2003 | WO |
2004034617 | Apr 2004 | WO |
Entry |
---|
Canadian Office Action for Application No. 2,598,644; dated Sep. 24, 2014. (4 pages). |
Ron Glover; “Armchair Baseball From the Web—Or Your Stadium Seat”; copyright 1998; The McGraw-Hill Companies, Inc.; 2 pgs. |
Choiceseat™ Fact Sheet; Jun. 13, 2007; 4 pgs. |
ChoiceSeat—Events Operations Manual for Madison Square Garden; Dec. 15, 1999; Intel Corporation; 91 pgs. |
ChoiceSeat™; www.choiceseat.net; 1999 Williams Communications; 71 pgs. |
ChoiceSeat—System Administrator's Binder for Madison Square Garden; Dec. 17, 1999; 80 pgs. |
ChoiceSeat—In Your Face Interactive Experience—1998 SuperBowl; Broncos v. Packers; 15 pgs. |
In-Seat Interactive Advertising Device Debuts; Nov. 19, 1999; Williams; 2 pgs. |
Reality Check Studios Goes Broadband with Production for Choiceseat at Madison Square Garden; Dec. 1, 1999; 3 pgs. |
Press Release: Vela Research LP to Supply Encoding for ChoiceSeat at SuperBowl XXXII; Jan. 13, 1998; 2 pgs. |
Ruel's Report: ChoiceSeat; ChoiceSeat makes Worldwide Debut at the 1998 Super Bowl in San Diego California; Sep. 1, 1997; 9 pgs. |
San Diego Metropolitan; Jan. 1998; 29 pgs. |
Stadium fans touch the future—Internet Explorer and touch screens add interactivity to Super Bowl XXXII; Jan. 26, 1998; 2 pgs. |
Telephony online Intelligence for the Broadband Economy; Fans take to ChoiceSeats: Interactive technology, e-commerce expand to sporting events; Jan. 10, 2000; 2 pgs. |
Williams ChoiceSeat interactive network launches inaugural season with Tampa Bay Devil Rays; expands features for second season; Mar. 30, 1998; 2 pgs. |
Williams Communications; ChoiceSeat™ demonstrates the interactive evolution of sports at Super Bowl™ XXXIII; Jan. 20, 1999; 2 pgs. |
ChoiceSeat the Premiere Provider of Interactive Event Entertainment; 18 pgs. |
Choice Seat Specification; Version 2.2; Ethernet Model; Williams Communications Group; Oct. 10, 1997; 13 pgs. |
ChoiceSeat Intellectual Property List; 3 pgs. |
CSI Incorporated Draft; Schedule A-IP; Schedule of Patents; 2 pgs. |
HK-388P/PW Color Camera Operation Manual; vol. 2; Ikegami; 280 pgs. |
Eric Breier; Computer age comes to ballpark; Qualcomm is test site for ChoiceSeat's sports television network; Aug. 1997; 2 pgs. |
Robert Carter; Web Technology: It's in the Game; SiteBuilder network; Dec. 15, 1997; 1 pg. |
ChoiceSeat™ Fact Sheet; Project: Super Bowl XXXII; Qualcomm Stadium, San Diego, Calif., USA; Jan. 25, 1998; 1 pg. |
Screen Shot Super Bowl XXXII; Jan. 25, 1998; 1 pg. |
Vyvx® ChoiceSeat Cover; 1 pg. |
Welcome to the Interactive Evolution of Sports. ChoiceSeat™; Jan. 1998; 1 pg. |
The Ultimate Super Bowl Experience! Williams ChoiceSeat™ Jan. 1998; 1 pg. |
Bradley J. Fikes; Super Bowl XXXII; It's just business; for lucky 600 fans, there'll be TV sets at the seats; San Diego North County Times; Jan. 1998; 1 pg. |
D.R. Stewart; Williams Interactive Video Gives Football Fans Choice; Tulsa World; Jan. 1998; tulsaworld.com; 2 pgs. |
ChoiceSeat Handout; Welcome to the Interactive Evolution of Sports. www.choiceseat.net; 1 pg. |
Cyberscope; Just Call It Wired Bowl; Jan. 28, 1998; 1 pg. |
Ruel.Net Set-Top Page Interactive TV Top.Box.News; Ruel's Report: ChoiceSeat; Fall 1998; 7 pgs. |
Williams ChoiceSeat interactive network launches inaugural season with Tampa Bay Devil Rays; expands features for second season with San Diego Padres; www.williams.com/newsroom/news_releases; Mar. 30, 1998; 2 pgs. |
The Herald: Super Bowl Turns Techno Bowl; Jan. 24, 1999; 1 pg. |
Williams Communications' ChoiceSeat™ demonstrates the interactive evolution of sports at Super Bowl™ XXXIII; http://www.williams.com/newsroom/news_releases; Jan. 20, 1999; 3 pgs. |
NTN Interactive games available on ChoiceSeat™ during Super Bowl XXXIII; Jan. 1999; 1 pg. |
Williams Fact Sheet; Super Bowl™ XXXIII; Pro Player Stadium, Miami, Florida, USA; Jan. 31, 1999; 1 pg. |
Super Bowl XXXIII Game Recap; http://www.nfl.com/superbowl/history/recap/sbxxxiii; 8 pgs. |
ChoiceSeat™ User Guide; New York Knicks; The Garden Fanlink; 8 pgs. |
ChoiceSeat™ User Guide; New York Rangers; The Garden Fanlink; 8 pgs. |
ChoiceSeat™ Flow Chart; New York Knicks; The Garden Fanlink; 1 pg. |
ChoiceSeat™ Presentation Document; The “Be There” Experience; 15 pgs. |
In-Seat Interactive Advertising Device Debuts; http://www.williams.com/newsroom/news_releases; Nov. 19, 1999; 2 pgs. |
Intel and ChoiceSeat™ collaborate to advance interactive sports technology; http://www.williams.com/newsroom/news_releases; Nov. 29, 1999; 3 pgs. |
Media Coverage; ChoiceSeat the Interactive Evolution of Sports; Good News Travels Fast.; 1 pg. |
Screen Shot: ChoiceSeat the Interactive Evolution of Sports; 1 pg. |
Digital Video; ChoiceSeat Coverage; www.dv.com; Apr. 2000; 11 pgs. |
Wall Street Journal; With Wired Seats, Fans Get Replays, Rules, Snacks; May 21, 2000; 1 pg. |
Wireless History; www.jhsph.edu/wireless/story; 5 pgs. |
Wikipedia; Wireless LAN; 4 pgs. |
Proposed ChoiceSeat Client Specification Summary; Initial Draft Aug. 29, 1997; Updated Sep. 30, 1997; 2 pgs. |
Proposed ChoiceSeat Network Specification Summary; Initial Draft Aug. 25, 1997; 2 pgs. |
Proposed ChoiceSeat Network Specification Summary; Updated Draft Sep. 30, 1997; 4 pgs. |
Qualcomm Stadium ChoiceSeat Network Diagram; May 11, 1998; 5 pgs. |
Schedule of Personal Property; Patents; Software and Trademarks etc Draft; 3 pgs. |
PCT International Search Report dated Feb. 5, 2004; In re International Application No. PCT/US03/31696. |
Written Opinion cited in International Application No. PCT/US03/31696.
Office Action dated Aug. 10, 2007; U.S. Appl. No. 10/630,069, filed Jul. 30, 2003; Applicant: Tazwell L. Anderson, Jr.; 11 pages. |
Office Action dated Aug. 23, 2007; U.S. Appl. No. 09/837,128, filed Apr. 18, 2001; Applicant: Tazwell L. Anderson, Jr.; 13 pages. |
Office Action dated Sep. 7, 2007; U.S. Appl. No. 10/453,385, filed Jul. 30, 2003; Applicant: Tazwell L. Anderson, Jr.; 14 pages. |
Wu, Dapeng; et al.; “On End-to-End Architecture for Transporting MPEG-4 Video Over the Internet”; IEEE Transactions on Circuits and Systems for Video Technology; vol. 10, No. 6; Sep. 2000; 19 pgs. |
Capin, Tolga K.; Petajan, Eric; and Ostermann, Joern; “Efficient Modeling of Virtual Humans in MPEG-4”; IEEE 2000; 4 pgs. |
Battista, Stefano; Casalino, Franco; and Lande, Claudio; “MPEG-4: A Multimedia Standard for the Third Millennium, Part 1”; IEEE 1999; 10 pgs. |
Wireless Dimensions Corporation Adds to Mobile-Venue Suite™; Press Release, Wireless Dimensions; Allen, Texas; Jul. 26, 2000; www.wirelessdimensions.net/news.html; 6 pgs. |
Seeing is Believing: Motorola and PacketVideo Demonstrate MPEG-4 Video Over GPRS; Business Wire; Wednesday, May 10, 2000; www.allbusiness.com; 4 pgs. |
Adamson, W.A.; Antonelli, C.J.; Coffman, K.W.; McDaniel, P.; Rees, J.; Secure Distributed Virtual Conferencing Multicast or Bust; CITI Technical Report 99-1; Jan. 25, 1999; 8 pgs. |
SGI and the Pepsi Center; 2 pgs. |
Office Action dated Sep. 10, 2007; U.S. Appl. No. 10/680,612, filed Oct. 7, 2003; Applicant: Tazwell L. Anderson, Jr.; 19 pages. |
Spanberg, Erik; “Techies Hit the Fast Track”; The Business Journal; Charlotte; Jul. 30, 1999; vol. 14, Iss. 17; pp. 3. |
Hiestand, Michael; Up Next: Rent Wireless Video Devices at games: [Final Edition]; USA Today; McLean, VA: Jan. 29, 2002; pp. 2. |
PR Newswire; Baseball Fans to Get Best of Both Worlds: Seats in the Stadium and Up Close Camera Shots; New York; Mar. 22, 2002; 2 pgs. |
Sony GV S50 Video Walkman Operating Instructions; 1992; 3 pgs. |
Number | Date | Country
---|---|---
20150042813 A1 | Feb 2015 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 10159666 | May 2002 | US
Child | 11702716 | | US
Relation | Number | Date | Country
---|---|---|---
Parent | 13591691 | Aug 2012 | US
Child | 14522412 | | US
Parent | 12969139 | Dec 2010 | US
Child | 13591691 | | US
Parent | 11702716 | Feb 2007 | US
Child | 12969139 | | US
Parent | 13478756 | May 2012 | US
Child | 14522412 | | US
Parent | 10630069 | Jul 2003 | US
Child | 13478756 | | US
Relation | Number | Date | Country
---|---|---|---
Parent | 09322411 | May 1999 | US
Child | 10159666 | | US
Parent | 09386613 | Aug 1999 | US
Child | 09322411 | | US
Parent | 09837128 | Apr 2001 | US
Child | 09386613 | | US
Parent | 14522412 | | US
Child | 09386613 | | US
Parent | 09837128 | Apr 2001 | US
Child | 10630069 | | US