A detailed description of examples of implementation of the present invention is provided hereinbelow with reference to the following drawings, in which:
In the drawings, embodiments of the invention are illustrated by way of example. It is to be expressly understood that the description and drawings are only for purposes of illustration and as an aid to understanding, and are not intended to be a definition of the limits of the invention.
The system 10 delivers video, audio and data content to spectators attending a live sporting event, such as a football game. To be clear, the invention can be used in connection with a wide variety of live sporting events without departing from the spirit of the invention. Accordingly, while the examples of implementation provided in this specification are made in connection with a football game, this should not be considered a limiting feature.
As shown in
The production studio 12 and sites A, B and C are all linked via a data connection shown as a network 14. The network 14 allows data to be sent from any one of the sites A, B or C to the production studio 12 and also allows data to be sent from the production studio 12 to any one of the sites A, B or C. The type of network 14 used to transport data between the sites A, B and C and the production studio 12 is not critical as long as it meets the performance requirements. Networks based on optical fiber technology, which provide high bandwidth, low latency and high speed data transmission, have been found satisfactory. Note that the network does not need to be strictly landline based but may include wireless segments.
The transmitter 18 communicates with the individual handheld electronic devices 16 in a wireless manner. In the example that is being shown in the drawings, the communication is a Radio Frequency (RF) communication. This RF transmission is unidirectional. In other words, the information stream is from the transmitter 18 to each electronic device 16. This is accomplished in the broadcast mode wherein each electronic device 16 receives the same information from the transmitter 18. In the unidirectional RF transmission, the handheld electronic devices 16 are unable to transmit information back to the transmitter 18 over the wireless RF communication link.
In a non-limiting example of implementation, the wireless RF transmission is performed locally of the venue. “Locally of the venue” means that the wireless RF transmission originates from an antenna located either at the venue or outside the venue but generally close to it. The signal power level is also controlled such that handheld electronic devices 16 can adequately receive the wireless RF transmission at the venue, but at significant distances from the venue the signal weakens and may no longer permit quality reception. By “significant” distance is meant a distance in the kilometer range.
It should be understood that the handheld electronic devices 16 can be capable of unidirectional wireless communication, as described above, or alternatively, they can be capable of bi-directional wireless communication. In the case of unidirectional wireless communication, the handheld electronic devices 16 are only able to receive wireless information. In other words, they are not able to transmit information back to the transmitter 18, or to another receiver/transmitter, over a wireless communication link. It should be appreciated that although the handheld electronic devices 16 may only be capable of unidirectional wireless communication, they may be operative to transmit and receive information over a wireline link, such as via a USB connection port, for example.
In the case of bi-directional wireless communication, each handheld electronic device 16 is able to receive information over a wireless communication link, and is also able to transmit information over a wireless communication link. In this case the electronic device 16 is provided with an RF transceiver (not shown in the drawings) that can handle the receive and transmit functions. The transmitted information may be sent to an entity of the system 10 (not shown), or to an entity of an external network that is independent of the system 10. The handheld electronic devices 16 may be operable to transmit information over a wireless RF communication link, such as over a cellular link. In the case of a cellular link, the handheld electronic devices 16 would dial a phone number and then transmit information over the cellular phone link.
The bi-directional communication feature may be implemented to provide identical or similar bandwidths over the receive and transmit links. However, in most cases, this is not necessary since the amount of information that needs to be sent from the handheld electronic device 16 is generally different from the amount of information that it needs to receive. Typically, the handheld electronic device 16 needs to send far less information than it receives. The implementation using the cellular network is an example that would provide sufficient bandwidth over the transmit link. By “cellular” network is meant a network that uses a series of cells having a limited geographical extent within which communication services are available. In one possible form of implementation, such cells can be arranged to provide a hand-off to moving handheld electronic devices 16, such that as a handheld electronic device 16 moves out of one cell and enters a new cell, the communication services are seamlessly transferred from one cell infrastructure to another. The “cellular” network terminology encompasses both communication infrastructures using licensed bandwidth, such as typical cellular telephones based on Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), the Global System for Mobile Communications (GSM) or other technologies, and communication infrastructures using unlicensed bandwidth, such as Wireless Fidelity (WiFi), which is commonly used to provide wireless access to computer networks. Another possible example of a “cellular” technology using unlicensed bandwidth is the so-called “Bluetooth” protocol, which provides very short range wireless communication capabilities.
The cellular network allows the handheld electronic device 16 to transmit information over a relatively limited bandwidth; however, in most cases the amount of information that needs to be sent is low, such that the available bandwidth should suffice. On the other hand, the receive link has a higher bandwidth in order to accommodate the multiple video streams and other data that are to be sent to the handheld electronic device 16. The cellular link also allows the handheld electronic devices 16 to transmit information independently from one another.
The input 11 receives signals that convey video/audio/data content originating from various sources. In the example shown in
Multiple audio feeds 32 are also provided, where each audio feed 32 is associated with a video feed 31. An audio feed 32 conveys audio information such as the noise picked up by a microphone at a location at which the associated camera is placed, or an audio commentary. Such an audio commentary can be the speech picked up by a microphone from a commentator or any individual that appears in one or more of the video feeds 31. Note that the audio feeds 32 are shown separate from the video feeds 31 for clarity only. In many practical applications the video feed 31 and the associated audio feed 32 will be carried over a common physical conductor.
Independent audio feeds 35 are also provided that convey independent audio content which is not associated with any particular video feed 31. For instance those independent audio feeds 35 may be radio conversations between members of a football team or a radio commentary by a reporter over a radio channel. Such audio conversations can be picked up by one or more radio receivers (not shown) each tuned to a particular frequency.
The audio and video content is typically supplied by the authority managing the live sporting event. For example, in the case of a football game, the video and audio data might be supplied by the National Football League (NFL). In a further non-limiting example, the independent audio feeds that contain audio commentary may be supplied by the commentator's affiliated television network, such as TSN, for example.
The input 11 also receives a real time data content 37. The real time data content 37 conveys information relating to the action in the field. For example, the real time data content in the context of a football game can be:
The real time data content 37 is typically also supplied by the authority managing the live sporting event.
The video content, the audio content and the data content are physically input into a patch panel 50 that is the entry point in the network 14. The network 14 transports this video/audio/data content to the remote production studio 12 where it will be edited.
The infrastructure of the system 10 for sites B and C functions in the same way as described above. Specifically, each of the sites B and C produces audio/video/data content that is transported to the production studio 12 for editing. In a specific example of implementation, each venue is hosting a football game between two teams and the games are concurrent at least in part. In the context of two sites, say sites A and B, “games concurrent at least in part” means that each venue is hosting a football game and the two games overlap time-wise. In other words, when one of the games begins, the other game starts concurrently or has already started. With games concurrent at least in part, game action occurs simultaneously at different sites. In a specific and non-limiting example of implementation, the games at the venues serviced by the system 10 (sites A, B and C) start simultaneously. The games are unlikely to end at the same time since the duration of an individual game can vary, but for most of the duration of the games, three different game actions occur simultaneously at sites remote from one another. In this example the game that is held at each venue is the same type of game, namely a football game. The invention can also be used in applications where different types of games occur at the sites A, B and C and those games are concurrent at least in part. For example, site A may be hosting a football game, while sites B and C are hosting baseball games. The game at site A starts at 7:00 PM while the games at sites B and C start at 7:30 PM. Thus, from 7:30 PM three different game actions are occurring, there being one football game and two baseball games.
The content production console units 56, 58 and 60 can also mix the content. The mixing function is accomplished by linking the content production console units 56, 58 and 60 to one another via data interconnects 62, 64 and 66. The data interconnects 62, 64 and 66 allow content that originates from one venue A, B or C to be delivered to the content production console unit 56, 58, 60 associated with another site. The way in which the content mixing operation will be performed is under the direct control of the operator of the content production station 54.
Each content production console unit 56, 58, 60 has an output 68, 70 and 72 that releases an edited and mixed audio/video/data content. Examples of mixing operations include:
1. Venue A, Venue B and Venue C Host Football Type Games that are Concurrent at Least in Part.
2. Venue A and Venue B Host Football Type Games that are Concurrent at Least in Part and Venue C Hosts a Motor Sports Event.
3. International Competitions such as the Olympic Games or the World Soccer Cup.
Referring back to
The head end station unit 82 receives seven different inputs. Those inputs are broadly described below:
The head end station unit 82 organizes the data from the various inputs into a structured information stream for broadcasting to the individual handheld electronic devices 16. The head end station unit 82 has a video processor 102, an audio processor 104, a control entity 106 and a multiplexer 108. The control entity 106 includes a computing platform running a program to carry out various tasks. While not shown in the drawings, the computing platform includes a processor and memory to hold the program code and the data being processed by the processor. In addition, the computing platform has a Graphical User Interface (GUI) 110 that provides a technician with the ability to send commands to the control entity 106 or to receive information therefrom. The GUI 110 can take various forms without departing from the spirit of the invention. For instance, the GUI 110 can include a display on which information is shown to the technician and a keyboard and mouse combination for data and command entry.
The control entity 106 receives the various forms of information and directs them to the appropriate encoders for processing. Specifically, all the video feeds that are received at the head end station unit 82 are handled by the video processor 102, which converts the SDI format into the Moving Picture Experts Group (MPEG)-4 format. Each video stream is compressed to provide at the handheld electronic device 16 a moving image at 30 frames per second (fps), with 16-bit color at a 320×240 pixel resolution. The resulting bit rate is 384 Kbits/sec. Since the video processor 102 needs to handle multiple video feeds simultaneously, it is designed to process those feeds in parallel. The preferred form of implementation uses a plurality of encoder stations, each being assigned a video feed. The encoder stations can be based on dedicated video processing chips, purely on software, or a combination of both. Alternatively, the video processor 102 can use a single processing module with buffering capabilities to sequentially handle blocks of data from different video feeds. With an adequately sized buffer and a processing module that is fast enough, all the video feeds can be encoded without loss of data.
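By way of a non-limiting illustration only, the following Python sketch shows one possible arrangement of the plurality of encoder stations described above, in which each incoming video feed is assigned a dedicated encoder worker. The names (encode_block, encoder_station) and the use of simple queues are assumptions introduced solely for the example; a real implementation would call a hardware or software MPEG-4 codec.

# Non-limiting sketch: one encoder worker per video feed, mirroring the
# "plurality of encoder stations" arrangement. Names are illustrative only.
import queue
import threading

NUM_FEEDS = 8  # assumed number of video feeds 31 received at the head end station unit 82

def encode_block(raw_block):
    # Placeholder for the SDI to MPEG-4 conversion targeting 30 fps,
    # 320x240 resolution, 16-bit colour and roughly 384 Kbits/sec.
    return b"mpeg4:" + raw_block

def encoder_station(feed_id, in_q, out_q):
    # Each station is dedicated to a single feed and runs independently,
    # so all the feeds are processed in parallel without loss of data.
    while True:
        raw = in_q.get()
        if raw is None:          # sentinel: the feed has ended
            break
        out_q.put((feed_id, encode_block(raw)))

input_queues = [queue.Queue() for _ in range(NUM_FEEDS)]
output_queue = queue.Queue()
for feed_id, in_q in enumerate(input_queues):
    threading.Thread(target=encoder_station,
                     args=(feed_id, in_q, output_queue),
                     daemon=True).start()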
Note that since MPEG-4 encoding also handles audio, the audio feeds that are associated with the respective video feeds are also directed to the video processor 102. The output of the video processor 102 is thus MPEG-4 encoded video channels where each channel has a video stream portion and an audio stream portion.
The independent audio feeds 35 that constitute the third input 300 are directed to an audio processor 104 that encodes them into the Moving Picture Experts Group Audio Layer 3 (MP3) format. Since the MP3 encoded audio streams convey voice information, they can be compressed to an 8 Kbits/sec data rate while maintaining adequate quality. As in the case of the video processor 102, the audio processor 104 uses a series of audio encoding stations, each dedicated to a given audio feed. Alternatively, the audio processor 104 can use a single sufficiently fast encoding module having buffering capabilities to sequentially handle data blocks from all the audio feeds.
The control entity 106 handles the processing of the fourth, fifth, sixth and seventh inputs, namely the real time data, the authentication data, the ancillary content and the service data. The purpose of the processing is to packetize the data such that it can be transmitted to the individual handheld electronic devices 16.
The outputs of the control entity 106 and of the video and audio processors 102 and 104 are passed to a multiplexer 108 that combines the data into one common data flow. The data flow is then directed to an output 112. The data flow at the output 112 is organized in the form of packets. In a specific and non-limiting example of implementation, three types of packets are sent. The first type includes the video information. In essence, the MPEG-4 information is packetized and transmitted. The video information packet includes a header that contains the relevant data allowing a handheld electronic device 16 to appropriately decode and process it. Advantageously, error detection and correction data is also included in the header for a more reliable transmission. The second type of packet includes the independent audio information. The third type of packet includes the remainder of the payload, such as the ancillary information and the real time and service type data. As in the case of the first type of packet, the second and third types of packets include identification data in the header to inform the handheld electronic device 16 what type of content the packet holds, such that the content can be adequately processed.
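A minimal sketch of how the three packet types released at the output 112 could be framed is provided below, purely for illustration. The field widths, the numeric type codes and the use of a CRC-32 checksum as the error detection data are assumptions that are not mandated by the system described above.

# Non-limiting sketch of the packetization ahead of the output 112.
# Field sizes, type codes and the CRC-32 choice are illustrative assumptions.
import struct
import zlib

PACKET_VIDEO = 1   # MPEG-4 video channel (with its associated audio)
PACKET_AUDIO = 2   # independent MP3 audio stream
PACKET_DATA = 3    # real time data, ancillary content and service data

def make_packet(packet_type, channel_id, payload):
    # Header: type, channel identifier, payload length and a CRC-32 of the
    # payload standing in for the error detection and correction data.
    header = struct.pack(">BBII", packet_type, channel_id,
                         len(payload), zlib.crc32(payload))
    return header + payload

def parse_packet(packet):
    packet_type, channel_id, length, crc = struct.unpack(">BBII", packet[:10])
    payload = packet[10:10 + length]
    if zlib.crc32(payload) != crc:
        raise ValueError("corrupted packet")
    return packet_type, channel_id, payload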
The table below provides an example of data at the output 112 and the respective bit rate.
As mentioned previously, the head end station 80 includes a number of head end station units 82 identical to the number of sites that are being serviced by the system 10. In the present case, there are three head end station units 82, associated with the sites A, B and C, respectively. Each head end station unit 82 issues a data flow at its output 112 that is directed to the respective site.
The data streams 112A, 112B and 112C may be identical but for most applications they will carry different content. The content may differ in terms of video streams, associated audio streams and independent audio streams, which is determined largely by the mixing operation performed at the content production station 54. If every video, associated audio and independent audio stream from a venue is distributed to every other site, ultimately the video, associated audio and independent audio streams in the data streams 112A, 112B and 112C will be the same. When a more limited mixing is performed then the data streams 112A, 112B and 112C will be different.
The most likely difference, however, between the data streams 112A, 112B and 112C is at the level of the ancillary content. Since in most applications the ancillary content is likely to be venue specific, this distinction will be reflected in the data streams 112A, 112B and 112C. More specifically:
Another likely difference between the data streams 112A, 112B and 112C is at the level of the service data. Since the service data is likely to be at least to some extent venue specific, it will be different from one data flow 112A, 112B and 112C to another. Differences could be at the following levels:
Yet another possible difference between the data flows 112A, 112B and 112C is the authentication data. Depending on the specific authentication scheme used, the authentication data in each data flow 112A, 112B and 112C could be different and specific to the population of handheld electronic devices 16 at the venue A, B or C associated with that data flow 112A, 112B and 112C. Alternatively, the authentication data can be the same in each data flow 112A, 112B and 112C.
The databases 502, 602 and 701 are designed to provide the relevant authentication data, ancillary data and service data to each head end station unit 82. For instance, there may be databases 502, 602 and 701 that are associated with a specific head end station unit 82, when the data they provide is venue specific. Although the drawings show an architecture where the databases 502, 602 and 701 are shared among the head end station units 82, this is only for the purpose of simplified illustration. The present invention encompasses both options, namely a shared set of databases 502, 602 and 701 and multiple database sets 502, 602 and 701 that are venue specific.
Referring back to
As seen in
The Configuration Layer
The GUI Layer
The Baseline Code
Basic Firmware
The software is stored in a general-purpose memory 702. Typically, the memory 702 would include a Read Only Memory (ROM) portion that contains data intended to be permanently retained such as the program code that the processor 700 executes. In addition, the memory 702 also includes a Random Access Memory (RAM) portion that temporarily holds data to be processed. The memory 702 can be implemented as a single unit, for instance as a semiconductor-based module or may include a combination of a semiconductor-based module and a mass-storage device, such as a hard-drive.
A Universal Serial Bus (USB) port 704 is provided to allow the handheld electronic device 16 to connect to external devices. Specifically, the USB port 704 allows linking the handheld electronic device 16 to a computer that can either download information from the handheld electronic device 16 or upload data to it. For instance, the download process may be used when it is desired to transfer data stored in the memory 702 to the external computer. Similarly, an upload process is used to perform the reverse operation. This is useful when it is desired, for example, to change the program running the handheld electronic device 16 by installing one or more updates. The USB port 704 requires a suitable driver that is loaded and executed by the processor 700 when the handheld electronic device 16 is powered up.
A removable storage media reader/writer 786 is provided to allow the handheld electronic device 16 to read data or write data on a removable storage media such as a memory card. This feature can be used to permanently record event-related content that is sent to the handheld electronic device 16. This functionality will be discussed later in greater detail.
As indicated earlier, the keypad 800 allows the spectator to control the operation of the handheld electronic device 16. The number and type of keys forming the keypad 800 is a matter of choice depending upon the specific application. As a possible variant, a touch sensitive screen or a voice recognition capability can be used to replace the keypad 800 or in combination with the keypad 800 as a means for command and data entry by the spectator.
The handheld electronic device 16 has an RF receiver and demodulator 710 that senses the wireless RF broadcast transmission, demodulates it and delivers it as properly organized and formatted data blocks to a data bus 712. The data thus sent over the data bus 712 is made available to the memory 702, the processor 700, the USB port 704 and the removable storage media reader/writer 706. In a specific example of implementation, the RF receiver and demodulator 710 operates in the 2.5 GHz range. Alternatively, the transmission may also be made in the Ultra High Frequency (UHF) range, specifically in the sub range of 470 MHz to 806 MHz. A 6 MHz contiguous bandwidth (equivalent to one regular TV channel) is sufficient to transmit the exemplary payload indicated earlier.
A video decoder 714 is provided to perform the decoding of the video channels received from the RF receiver and demodulator 710. For clarity it should be mentioned that while the specification refers to the decoder 714 as a “video” decoder, it also performs audio decoding on the audio information associated with the video channels. The video decoder 714 has a memory 727 in the form of a buffer that holds undecoded video/audio information representing a certain duration of video channel play. For instance, the size of the buffer may be selected such that it holds 5 minutes of video channel play for each channel. In use, the not-yet-decoded video/audio information received from the RF receiver and demodulator 710 is sent over the data bus 712 to two locations: (1) the video decoder 714 and (2) the memory buffer 727. The video decoder 714 decodes the video/audio information and then directs it to the display screen 802 to be viewed by the spectator. At the same time, the undecoded video/audio information directed to the memory buffer 727 starts to fill the memory buffer 727. When the memory buffer 727 is completely filled, it starts overflowing such that only the last 5 minutes of video channel play are retained. The same operation is performed on every video channel, with the exception that only the video channel the spectator wants to watch is decoded and directed to the display screen 802. Accordingly, the memory buffer 727 is segmented, in the functional sense, into areas, where each area is associated with a video channel.
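The following sketch illustrates, in a non-limiting fashion, the buffering behaviour of the memory buffer 727 described above: undecoded blocks are appended as they arrive and the oldest blocks are discarded once the buffer is full, so that only the most recent five minutes of each video channel are retained. The block duration and channel count are illustrative assumptions.

# Non-limiting sketch of the per-channel buffering of the memory buffer 727.
from collections import deque

BLOCK_SECONDS = 1            # assumed duration of one undecoded block
RETENTION_SECONDS = 5 * 60   # five minutes of video channel play
NUM_CHANNELS = 8             # assumed number of video channels

class ChannelBuffer:
    def __init__(self):
        # A bounded deque discards the oldest block once it is full, so the
        # buffer "overflows" exactly as described above.
        self.blocks = deque(maxlen=RETENTION_SECONDS // BLOCK_SECONDS)

    def push(self, undecoded_block):
        self.blocks.append(undecoded_block)

    def rewind(self, seconds):
        # Returns the blocks covering the last `seconds` of play; the video
        # decoder 714 can then decode them for playback or fast forward.
        n = min(len(self.blocks), seconds // BLOCK_SECONDS)
        return list(self.blocks)[len(self.blocks) - n:]

# The buffer is segmented in the functional sense: one area per video channel.
buffers = {channel: ChannelBuffer() for channel in range(NUM_CHANNELS)}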
The audio stream that is associated with the video stream being watched is decoded, converted into an analog format, amplified and directed to speaker/headphones 724 such that the spectator can watch the video stream on the display screen 802 and hear the associated audio simultaneously.
The ability to retain the last five minutes of video channel play provides the spectator with interesting possibilities. For instance, the spectator can manipulate the data in the memory buffer 727 so as to “play back” a certain video channel's content, create fast forward motion or “rewind” motion, and record the video/audio information in the memory buffer 727, either in part or in its entirety, by copying it onto a storage medium in the removable storage media reader/writer 786. In this fashion, the video/audio information of interest to the spectator can be permanently retained. Moreover, the spectator can see any action that may have been missed by switching channels and then “rewinding” the content of the memory buffer 727 associated with the newly selected channel.
It is generally found suitable to use a memory buffer 727 in the form of a semiconductor based unit. In applications where large memory capacity is required in order to store a large video content, a storage device such as a hard drive can be used.
The display screen 802 can be of any suitable type. One possibility is to use a 3.5 in diagonal transflective Thin Film Transistor (TFT) screen capable of rendering 320×240 pixel resolution images with 16-bit color depth. Evidently, other display types can be used without departing from the spirit of the invention. Optionally, the handheld electronic device 16 can be provided with a lighting system (not shown in the drawings) using Light Emitting Diodes (LEDs) or any other suitable illumination technology to facilitate viewing under low light level conditions.
The audio decoder 720 functions in a somewhat similar manner to the video decoder 714. Specifically, the audio decoder 720 is associated with an audio memory buffer 729 and it handles the independent audio streams conveying the audio information from the independent audio feeds 35. The independent audio streams are stored in a compressed format in the audio memory buffer 729 so as to record a predetermined period of the audio content that is received.
By storing the audio content received by the handheld electronic device 16 over a time period determined by the capacity of the audio memory buffer 729, the spectator is provided with the ability to “play back” the audio content, perform “fast-forward” and “rewind” operations and create bookmarks. In addition, the audio information in the audio memory buffer 729 can be recorded either in part or in its entirety by copying the content onto a storage medium in the removable storage media reader/writer 786.
The functionality of the handheld electronic device 16 will now be discussed in detail.
The flowchart in
At the next step, once the identifier has been recorded, the vendor will typically create a user account in a database. The user account will allow the spectator to purchase the delivery of content to the handheld electronic device 16. In the example described in
Continuing with the above example, assume that the spectator now wishes to have access to content on the handheld electronic device 16 for a certain live sporting event that the spectator plans to attend. The spectator then makes a payment to his account. The payment can be made in person, at a kiosk or at any other location authorized to receive payments. Advantageously, electronic payment methods, such as over the Internet, can be used. With such a method the spectator logs on to an Internet site of the service provider and makes the payment via credit card or otherwise. The payment process will typically include selecting the event or group of events for which access to content is desired and the level of service, if applicable, and then making the payment. When the payment is made and validated, an entry is automatically made in the user account indicating that access to content (in full or in part) for the handheld electronic device 16 specified in the account is enabled.
At the event itself, before the content starts being broadcast to the individual handheld electronic devices 16, the database 502 connects to the network of the service provider over the Internet such that the database 502 can be populated with the identifiers of all the handheld electronic devices 16 for which payment for content delivery for the event has been made. Once this step is completed, all the handheld electronic device 16 identifiers in the database 502 are transmitted to the head end station 80 such that they are all included in the broadcast that is made by the transmitter 18. Specifically, the block of identifiers is broadcast periodically, say every minute, so as to allow the individual handheld electronic devices 16 to perform the authentication process at any time.
Since the operation of the system involves several sites, the authentication process creates a site-specific group of identifiers to be broadcast for each venue A, B and C. For instance, the identifiers of the handheld electronic devices 16 that have purchased access to the service in relation to the football game played at venue A are all placed in a group associated with that site. The same operation is performed for all the other sites, namely sites B and C. Each site-specific group of identifiers is then placed in the respective data flow 112A, 112B and 112C. As indicated previously, another option is to create a common group of authentication numbers that encompasses all the handheld electronic devices 16 that have purchased service for the events at any one of the sites A, B and C. That common group is then placed in each data flow 112A, 112B and 112C.
Each handheld electronic device 16 is designed such that it cannot operate unless it has been electronically unlocked. When the handheld electronic device 16 is powered up, it automatically enters the locked mode. During the locked mode the handheld electronic device 16 acquires the wireless RF transmission and decodes the information so as to extract the block of identifiers being sent. Once the block of identifiers is extracted from the transmission, the handheld electronic device 16 compares each number from the block to the identifier of the handheld electronic device 16. If a match is found, the handheld electronic device 16 enters the unlocked mode and the content that is being broadcast can be adequately received. However, if no match is found after a certain period, say 2 minutes, the handheld electronic device 16 shuts down automatically.
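A non-limiting sketch of this locked-mode behaviour is given below. The receive_identifier_block and shutdown routines are placeholders standing in for the acquisition of the wireless RF transmission and the automatic shut-down, respectively.

# Non-limiting sketch of the locked-mode logic of the handheld electronic device 16.
import time

UNLOCK_TIMEOUT_SECONDS = 2 * 60   # shut down after two minutes without a match

def try_to_unlock(device_identifier, receive_identifier_block, shutdown):
    # receive_identifier_block() stands in for acquiring the wireless RF
    # transmission and extracting the periodically broadcast block of identifiers.
    deadline = time.monotonic() + UNLOCK_TIMEOUT_SECONDS
    while time.monotonic() < deadline:
        identifiers = receive_identifier_block()
        if device_identifier in identifiers:
            return True           # match found: enter the unlocked mode
        time.sleep(1)             # wait for the next broadcast of the block
    shutdown()                    # no match within the period: automatic shut-down
    return False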
The approach described earlier is a simple way to ensure that content is delivered only to handheld electronic devices 16 that are authorized to receive the service, in particular those belonging to or being used by spectators who have made payment, since no encryption of the video/audio content is required. In addition, delivering the authentication information, such as the block of identifiers, to the individual handheld electronic devices 16 in a wireless manner is simple from a logistics standpoint.
For enhanced security, the block of identifiers being transmitted can be encrypted using any suitable encryption technique. The handheld electronic device 16 should, therefore, be provided with the capability to decrypt the block of identifiers by using a suitable key.
Another option is to encrypt the entire transmission and require the handheld electronic device 16 to decrypt it. In this form of implementation, the encryption constitutes the authentication data carried by the wireless RF transmission that is processed by the individual handheld electronic devices 16. A decryption key or password may need to be input by the spectator. In such case, a decryption key may be provided to the spectator following the payment for the service. When the spectator powers up the handheld electronic device 16, the spectator enters the key and that key is used to perform the decryption.
If encryption or decryption is required, the function can be implemented at the handheld electronic device 16 by suitable software or hardware, both of which are known in the art.
The authentication described earlier can be modified so as to provide service level access control. As will be discussed later, the handheld electronic device 16 can be designed in such a way as to deliver to the spectator service available in different levels or categories. The levels can be distinguished from each other on the basis of content, for example. The basic level of service may include basic content, such as, for example, a limited number of video channels. A higher level of service may include a larger number of video channels and contextual information or other content. The reader will appreciate that the distinguishing characteristics of the different service levels will vary in accordance with the intended application. Generally, the higher the service level, the richer the content it provides to the spectator.
The service levels are likely to be available at different costs to the spectator. More specifically, the basic level of service is likely to be the least expensive and, as content options are added to upgrade to a higher level of service, the cost to the spectator will increase.
It is desirable to provide the handheld electronic device 16 with an authentication feature that will allow the handheld electronic device 16 to provide to the spectator access to the level of service the spectator has paid for and thus protect the wireless RF transmission from unauthorized access to content or service levels that have not been purchased.
One possible option is to create, when the spectator purchases the service, distinct lists of identifiers for each service level that is available. Assume that three service levels are available, namely service level A, service level B and service level C. Service level A is the basic and the least expensive. Service level B is the intermediate level and includes features not available under service level A, for example more video channels and a limited amount of contextual information. Service level C is the highest and it provides the richest content, namely the largest number of channels and the most contextual information. As the service is being purchased by spectators, three different lists of electronic identifiers are created, one for those that have purchased service level A, one for those that have purchased service level B and one for those that have purchased the service level C.
Under this example, the wireless RF transmission is structured in a way to maintain a distinction between the different levels of service. For example, a core block of frames carries the content for the service level A, which is the basic level. A first additional block of frames carries the additional content that is added to the service level A to upgrade to service level B. Finally there is a second additional block of frames that carries the additional content added to service level B to upgrade to service level C. In such case, the service level C encompasses the content of service levels B and A, while the service level B encompasses the content under service level A.
The authentication information sent to the handheld electronic devices 16 is organized into groups as well. There is a first group that contains the list of the identifiers of the handheld electronic devices 16 for which service at level A has been purchased, a group with a list of the identifiers of the handheld electronic device 16 for which service at level B has been purchased and a group with the list of the identifiers of the handheld electronic devices 16 for which service at level C has been purchased.
As a handheld electronic device 16 picks up the wireless RF transmission, it will, as discussed earlier, try to find its own electronic identifier in any one of the lists. If the identifier is not found in any of the lists, then the handheld electronic device 16 will not unlock itself and the spectator will not be able to access the content. However, the handheld electronic device 16 will unlock itself if its identifier is found in any one of the lists. If the identifier is found in the list for service level A, then the spectator will be able to view only the content carried in the core block of frames, the one that is associated with the service level A. Access to frames associated with any other service level will not be allowed. The control is implemented by the handheld electronic device 16, which determines which part of the wireless transmission it can make available to the spectator. Since the different blocks of frames are clearly distinguished from one another and associated with the respective groups of identifiers, determining the group in which the identifier of the handheld electronic device 16 resides allows controlling the access to the relevant blocks of frames that hold the content. If the identifier is in the group associated with the core block of frames, only those frames will be processed and, in effect, the spectator will only have access to the service at level A. If the identifier of the handheld electronic device 16 is located in the group associated with the first additional block of frames, then only the core block and the first additional block will be processed, in effect limiting access to the content at level B. Finally, if the identifier of the handheld electronic device 16 resides in the group associated with the second additional block of frames, then full access to the entire content is granted.
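By way of illustration only, the sketch below shows how the handheld electronic device 16 might map the group in which its identifier is found to the blocks of frames it is permitted to process; the group and block names are assumptions made for the example.

# Non-limiting sketch of service level access control.
SERVICE_BLOCKS = {
    "A": ["core"],
    "B": ["core", "additional_1"],
    "C": ["core", "additional_1", "additional_2"],
}

def allowed_frame_blocks(device_identifier, identifier_groups):
    # identifier_groups maps a service level ("A", "B" or "C") to the list of
    # identifiers broadcast for that level.
    for level, identifiers in identifier_groups.items():
        if device_identifier in identifiers:
            return SERVICE_BLOCKS[level]   # only these blocks of frames are processed
    return []                              # not found in any group: device stays locked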
The examples of the authentication feature described above are relatively simple to implement. However, there is a need to carry in the wireless RF transmission the entire list of the electronic identifiers of the handheld electronic devices 16 that are allowed to receive content. If a large number of handheld electronic devices are being serviced by the wireless RF transmission, the number of electronic identifiers that need to be transmitted may grow too large to be practical.
The handheld electronic device 16 is also provided with a bar code 2000 on its casing that is machine readable, such as by using a bar code reader (not shown). The bar code is a representation of the electronic identifier 2002. Note that the label holding the bar code may also contain another form of representation of the electronic identifier 2002, such as for example, by using alphanumeric characters suitable to be read by a human.
It is also possible to apply on the casing of the handheld electronic device 16 a bar code 2000 that is not identical to the electronic identifier 2002. In other words, the electronic identifier 2002 and the bar code 2000 are different codes. Some embodiments of the authentication process described later require access to the electronic identifier 2002 via the bar code 2000. In the embodiment where the electronic identifier 2002 and the bar code 2000 are the same codes then a reading of the bar code 2000 will yield the electronic identifier. However, when they are different codes, a mapping mechanism can be used to relate one to the other. The mapping mechanism can be a database storing all the population of electronic identifiers 2002 and the respective bar codes 2000. When it is necessary to obtain an electronic identifier 2002 of a certain handheld electronic device 16, the bar code 2000 is read, the database searched and the corresponding electronic identifier 2002 retrieved.
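A minimal sketch of such a mapping mechanism is shown below, assuming the database is simply a lookup table keyed by the bar code 2000; the sample entries are fictitious.

# Non-limiting sketch of the mapping mechanism relating bar codes 2000 to
# electronic identifiers 2002 when the two are different codes.
BARCODE_TO_IDENTIFIER = {
    "0123456789012": "ID-000-111-222",   # fictitious entries for illustration
    "0123456789029": "ID-000-111-223",
}

def identifier_from_barcode(barcode):
    # Reading the bar code 2000 and searching the database yields the
    # corresponding electronic identifier 2002.
    return BARCODE_TO_IDENTIFIER[barcode]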
The handheld electronic device 16 also includes an authentication processor 2006. The authentication processor 2006 is designed to handle authentication related tasks, such as for example output the electronic identifier 2002 to an external device (as it will be described later), process a user code entered by the spectator and the authentication information contained in the wireless RF transmission to electronically unlock the handheld electronic device 16 to allow the spectator to gain access to the content in the wireless RF transmission. The authentication processor 2006 is likely implemented in software but it can also be implemented in hardware by a specialized circuit. A combination of software and hardware is another option.
When a spectator desires to purchase the delivery of service to the handheld electronic device 16, the spectator performs the transaction by interacting with an external entity which generates a user code. At the live event, the spectator enters via the user interface the user code provided earlier. The authentication processor 2006 performs a validation of the user code information provided by the spectator and issues an authentication decision. The authentication decision is conveyed by any suitable internal signal, which has the effect of allowing the spectator to gain access to the content in the wireless RF signal if the user code is correct, or of denying this access when the user code is wrong. For instance, the signal that conveys the authentication decision can be designed to enable the processing of the content in the wireless RF transmission such that it can be viewed and/or heard by the spectator, when the authentication decision validates the user code. On the other hand, when the authentication decision does not validate the user code, the internal signal is designed to prevent content from being made available to the spectator. The authentication decision issued by the authentication processor 2006 can also be designed to handle levels of service. In such a case, the authentication decision indicates which level of service the handheld electronic device 16 is entitled to receive, if any.
A block diagram of the external entity is shown in
The user code generator 2008 can, for example, be implemented at a booth at the live sporting event the spectator plans to attend. The attendant at the booth receives payment from the spectator, the amount of which may be dependent on the level of service desired. The attendant then places adjacent to the handheld electronic device 16 a reader, such as an infrared reader, to interact with an infrared port (not shown in
The electronic identifier is supplied to the user code generator 2008 in addition to the event code which is available to the user code generator 2008. Normally, the same event code is used for every handheld electronic device 16 for which service is being purchased. The event code is a code that designates the event for which service is being purchased, while the electronic identifier is a code that distinguishes one handheld electronic device 16 from another. In a specific example of implementation the event code will typically be different from one event to another. For instance, in the case of football games played at different sites, each football game will be associated with a different event code.
The user code generator 2008 processes the two entries according to the desired non-reversible mathematical function and outputs the user code. In this particular case, the mathematical processing is a succession of mathematical operations on the two entries that produces a user code that is smaller (has fewer digits) than both the event code and the electronic identifier 2002. The user code is given to the spectator in any convenient way. It may be printed, for instance on a ticket, and remitted to the spectator. Normally, this code will be unique to each handheld electronic device 16.
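Since the specification does not prescribe a particular non-reversible function, the sketch below uses a truncated SHA-256 digest rendered as a six-digit code purely as an illustrative assumption; any other non-reversible function yielding a code shorter than both inputs could be substituted.

# Non-limiting sketch of a user code generator such as the generator 2008.
# A truncated SHA-256 digest is an illustrative assumption only; the actual
# non-reversible function is a design choice.
import hashlib

def generate_user_code(event_code, electronic_identifier):
    # Produces a six-digit code, assumed shorter than both inputs.
    digest = hashlib.sha256((event_code + electronic_identifier).encode()).hexdigest()
    return str(int(digest, 16) % 1_000_000).zfill(6)

# Example: the user code printed on the spectator's ticket (fictitious inputs).
ticket_code = generate_user_code("EVENT-A-2006", "ID-000-111-222")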
Note that it is also possible to implement the user code generator 2008 to produce user codes for different handheld electronic devices 16 without establishing an electronic communication with the handheld electronic devices 16. This can be done by using a bar code reader for reading the bar code 2000 on the casing of each handheld electronic device 16. If the bar code 2000 is the same as the electronic identifier 2002 then the processing by the user code generator 2008 can be effected as described earlier. Otherwise, if the bar code 2000 is different from the electronic identifier 2002, a database (not shown) mapping the bar codes 2000 to the electronic identifiers 2002 of the population of the handheld electronic devices 16 is searched to extract the electronic identifier 2002 corresponding to the bar code 2000 that was read.
As the spectator enters the stadium, the spectator turns the handheld electronic device 16 on and he is requested by the authentication processor 2006 to supply a user code. The request may be, for example, a prompt appearing on the display 802 of the handheld electronic device 16 to enter a user code (assuming that the system requires manual input of the user code). The spectator enters the user code printed on the ticket via the user interface of the handheld electronic device 16. As shown in
In the context of a multi-site arrangement, the authentication data that is conveyed in the data flows 112A, 112B and 112C is different from one another, since each data flow carries a different event code.
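Assuming the same illustrative derivation as in the previous sketch, the validation performed by the authentication processor 2006 can be pictured as follows: the expected user code is recomputed from the event code extracted from the wireless RF transmission and the electronic identifier 2002, and compared with the code entered by the spectator.

# Non-limiting sketch of the validation by the authentication processor 2006,
# reusing the illustrative derivation of the previous sketch.
import hashlib

def derive_user_code(event_code, electronic_identifier):
    digest = hashlib.sha256((event_code + electronic_identifier).encode()).hexdigest()
    return str(int(digest, 16) % 1_000_000).zfill(6)

def authentication_decision(entered_code, broadcast_event_code, electronic_identifier):
    # True unlocks the handheld electronic device 16; False keeps it locked.
    return entered_code == derive_user_code(broadcast_event_code, electronic_identifier)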
A possible option is to communicate the user code to the handheld electronic device 16 electronically, immediately after the electronic identifier 2002 is communicated to the user code generator 2008. As soon as the user code generator 2008 computes a user code, that code is conveyed via the communication link 2007 to the authentication processor 2006. This option obviates the need for the spectator to manually input the user code for validation purposes. The electronic transaction automatically unlocks the handheld electronic device for use at the live sporting event, without the necessity for the spectator to input any user code.
In a possible variant, the user code is provided to the spectator via an online purchase set-up that can be made any time before the live event begins. Briefly, the spectator accesses the Internet via a personal computer or any other communication device and connects with a web site where an on-line purchase of delivery of service can be made. The server hosting the web site implements the user code generator and computes a user code. The user code that is produced is communicated to the user, such as by displaying it on the screen of the personal computer, sent to the user by e-mail to a specified e-mail address or via any other suitable fashion. The user will retain the user code and enter it in the handheld electronic device 16 during the live event.
Another possible option that can be considered is to convey in the wireless RF transmission the event code (as in the previous embodiment) and also all the user codes for the handheld electronic devices 16 for which service has been purchased. This option would require computing a user code for every handheld electronic device 16 for which service is purchased (for example at the point of purchase of the service) and storing all the user codes so computed in a database. Note that this operation can be implemented on a site by site basis, such that the RF transmission at a given site only conveys the event code and the user codes relevant to the population of handheld electronic devices 16 at that site. During the live sporting event, the content of the database is periodically broadcast along with the event code. Each handheld electronic device 16 that is at the live sporting event receives the wireless RF transmission and extracts the event code. The event code is then used by the authentication processor 2006 to compute a user code. That user code is then checked against the set of user codes contained in the wireless RF transmission. If a match is found, the authentication processor 2006 issues an authentication decision allowing the handheld electronic device 16 to access the video/audio content in the wireless RF transmission. If no match is found, then the handheld electronic device 16 remains locked.
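A brief sketch of this alternative, under the same illustrative assumptions as above, is given below; the derivation function is passed in to emphasize that its exact form is a design choice.

# Non-limiting sketch of the alternative option: the device derives its own
# user code and checks it against the set of user codes carried in the
# wireless RF transmission for the site.
def unlock_with_broadcast_codes(derive_user_code, event_code,
                                electronic_identifier, broadcast_user_codes):
    own_code = derive_user_code(event_code, electronic_identifier)
    return own_code in set(broadcast_user_codes)   # match found: unlock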
The various embodiments described above that employ a user code for authentication purposes can also be adapted to a multi-service-level arrangement. In the case of a multi-service-level system, the spectator will be provided with a different user code depending on the particular service level that was purchased. The wireless RF transmission has content that is structured to distinguish one service level from another, and each service level is associated with different authentication information. The authentication information is a compound event code including a plurality of service level codes that are different from one service level to another. Accordingly, in this example, the authentication information will contain as many service level codes as there are different service levels. In use, the authentication processor 2006 will try to match the user code supplied by the spectator to the compound event code. Specifically, the authentication processor 2006 will issue an authentication decision to unlock the handheld electronic device 16 when a match is established between the user code and any one of the service level codes, but the authentication decision will control the access to the content, as discussed earlier, such that the spectator will only be able to gain access to the service level that was purchased.
Note that the event codes (either a unique code or a compound code in the case of a multi-level approach) are generated by the authority or organization controlling the delivery of service to the spectators during the live event. Those codes can be randomly generated for every new event.
Assuming that the authentication process described earlier has been successfully passed, the graphical and navigational layer is loaded and the user interface that allows the spectator to access the various functions is presented on the screen. Typically, the user interface presents a menu that shows a list of choices. The spectator navigates the menu by operating keys on the keypad 800. Those keys may be arrow keys or any other suitable keys. When a selection has been made, the choice or option is activated by pressing any suitable key such as an “enter” key.
The menu options available to the spectator can vary significantly according to the intended application. The description provided below illustrates a few possible examples.
The following examples focus on the delivery of the independent audio streams since the handling of the audio streams associated with the respective video streams was described in the earlier section.
As indicated earlier, the independent audio streams convey radio conversations associated with the football game, audio commentaries about the football game or advertisement information, among others. At the handheld electronic device 16 the spectator can manually select any one of the audio streams and direct it to the output 724, which drives a sound reproducing device such as a loudspeaker or headphones.
In addition to conveying principal video channel content to the spectator, the handheld electronic device 16 is also designed to convey ancillary content. Examples of ancillary content include advertisement content, venue or event related contextual content, on-line shopping options and news, among many others. They can be in the form of video content, audio content or a combination of video and audio content.
In a non-limiting embodiment, the handheld electronic device 16 can have GPS receiving capabilities. In such an embodiment, the handheld electronic device 16 is equipped with a GPS device, such that the handheld electronic device 16 can obtain GPS coordinates associated with its location. This assumes the GPS device has an unobstructed view of the sky to pick up satellite signals. More specifically, these GPS coordinates can be displayed to a spectator on the display 802 of the handheld electronic device 16 in relation to a map of the venue, showing the spectator his/her location on the map. As such, the spectator will know where he/she is in relation to the layout of the venue.
These GPS coordinates can enable the spectator to locate him/herself in relation to specific facilities at the live sporting event. For example, the transmitter 18 can transmit cartographic data to the handheld electronic devices 16 in the wireless RF broadcast. For example, the cartographic data provides a map of the venue and shows the location of some key facilities such as washrooms, food vendors, medical/emergency facilities, exits, and so on. The handheld electronic device 16 then stores this cartographic data in its memory 702, such that it can be easily accessed by the processor 700. As such, when GPS coordinates are produced, a portion of the map or the map in its entirety is shown on the display 802, depending on the zoom level, identifying the location of the spectator. The locations of these facilities can then also be displayed on the map of the venue along with the GPS coordinates of the spectator. In this manner, the spectator would be able to locate him/herself in relation to these facilities.
The facilities can be displayed on the map of the venue in the form of symbols or text. Preferably, the symbols or text would be indicative of the service/facility that is located at that area on the map. For example, the medical/emergency facilities may be depicted on the map by a red cross, the washroom facilities may be depicted by a W/C sign or the traditional man and woman signs, the food facilities may be depicted by a knife and fork symbol, and so on. In addition, the location of the handheld electronic device 16 can also be depicted on the map via an icon, such as a star, for example, such that the spectator knows where he/she is in relation to the other facilities depicted on the map. In an alternative embodiment, the position of the handheld electronic device 16 may simply be depicted via a flashing dot.
In order to avoid the map being overcrowded with symbols for each of the different facilities available, the spectator could select which facilities to display on the map by choosing a specific type of facility from a menu. For example, if a spectator needs to find the washrooms, he/she may access the map of the venue and have the icons associated with the washrooms appear on the map, as well as an icon associated with the position of the spectator. In that manner, the spectator will have a clear indication as to where the closest washroom is located.
In yet another possibility, the handheld electronic device 16 may be equipped with software that enables the handheld electronic device 16 to provide the spectator with directions as to how to get to a certain location. For example, based on the GPS coordinates of the handheld electronic device 16 and the GPS coordinates of a selected location stored in the GPS coordinates database, the processor 700 can use the directions software to determine the best route from where the spectator currently is to the desired location. These directions can then be displayed to the spectator on the screen 802 of the handheld electronic device 16. The spectator can request directions in a variety of ways without departing from the spirit of the invention. In one example, the spectator may simply access a directions menu and select from a list of options such as “directions to the washrooms”, “directions to the nearest exit”, “directions to the hot dog stand”, etc. Alternatively, the spectator could highlight a specific facility icon depicted on the screen via up/down buttons on the keypad 800 and then hit an “enter” button in order to select that icon. The directions software would then provide directions to the facility associated with the selected icon. The directions provided to the user can be in the form of text listing the route to follow or in the form of arrows showing a path to follow on the map of the venue.
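As a non-limiting illustration of the directions feature, the sketch below selects the nearest facility of a chosen type from the cartographic data stored in the memory 702, using a great-circle (haversine) distance; the facility table and coordinates are fictitious and introduced only for the example.

# Non-limiting sketch: locating the nearest facility of a selected type from
# the cartographic data stored in the memory 702. Coordinates are fictitious.
from math import radians, sin, cos, asin, sqrt

FACILITIES = [
    ("washroom", 45.5590, -73.5520),
    ("washroom", 45.5601, -73.5508),
    ("food", 45.5595, -73.5515),
]

def distance_m(lat1, lon1, lat2, lon2):
    # Great-circle (haversine) distance in metres between two GPS coordinates.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def nearest_facility(kind, spectator_lat, spectator_lon):
    candidates = [f for f in FACILITIES if f[0] == kind]
    return min(candidates,
               key=lambda f: distance_m(spectator_lat, spectator_lon, f[1], f[2]),
               default=None)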
The handheld electronic device 16 may also enable the spectator to store user-defined GPS coordinates into its memory 702. This may be desirable in the case where the spectator wants to remember specific locations at the venue. For example, in the case where a spectator parks his/her car in the stadium's parking lot, upon exiting the car, the spectator may choose to store the GPS coordinates associated with the location of the car in the memory 702 of the handheld electronic device 16. This could be done by invoking the GPS feature on the user interface, and then selecting a “store coordinates” option from a menu item with the appropriate selection keys. The coordinates could then be confirmed and stored by pressing an “enter” key. Those coordinates can then be associated with any suitable icon displayed on the map, thus allowing the spectator to quickly and conveniently find the location of the car. An advantage of this feature could be that at the end of the live sporting event, when the spectator wants to find his/her car, they would then be able to use the directions feature, as described above, to get directions from their current location, back to the GPS coordinates associated with their car.
Event related contextual information is information relating to the event held at the venue. In the example of a football game event, the following is considered to be event related contextual information:
The venue or event related contextual information could be delivered to the spectator over a dedicated channel that the spectator can select for viewing at his/her leisure. The channel selection is effected as described earlier. Alternatively, the venue or event related contextual information could be embedded in the video content of a principal video channel.
The ancillary content provided to the spectator over the wireless RF transmission can also include:
News—Relates to different types of news service, such as “breaking news”, weather information and economic information, among others. The news information can be delivered to the spectator in the same fashion as in the case of the venue or event related contextual information.
Note that some of the boxes 900 are identified with the “video” label which shows that an active video channel is associated with that box 900. This means that the spectator can see the live action for that particular game by selecting this channel. Some of the boxes 900 are blanked and do not show “video”. Those boxes 900 are associated with games that are now over and there is no available live video feed. Nevertheless, the box 900 shows the final score for that game.
Although various embodiments have been illustrated, this was for the purpose of describing, but not limiting, the invention. Various modifications will become apparent to those skilled in the art and are within the scope of this invention, which is defined more particularly by the attached claims.
This application claims the benefit under 35 USC 119(e) of U.S. Provisional Patent Application No. 60/789,911 filed on Apr. 7, 2006 and hereby incorporated by reference herein.