1. Field of Invention
This invention relates to communications systems and methods. Specifically, the present invention relates to systems and methods for broadcasting content via wireless networks.
2. Description of the Related Art
Numerous over-the-air (wireless) audio broadcast services are available to the consumer, including conventional AM and FM radio and, more recently, satellite (e.g., XM) radio. Additional service offerings are in development, including High Definition Radio and Digital AM radio.
Currently, these offerings are audio only. That is, conventional wireless broadcast technologies provide only an audio signal for the consumer.
However, these audio only services would be enhanced by the transmission and display of visual information, including images, synchronized with and relevant to the associated audio services.
Unfortunately, current and planned audio only wireless broadcast systems have no means for providing such visual imagery. Accordingly, a need exists in the art for a system or method for providing images synchronized with and relevant to associated audio program content in a wireless audio broadcast network.
The need in the art is addressed by the system and method of the present invention. The invention is adapted for use in a system for wirelessly transmitting and receiving an audio data stream and includes an arrangement for providing an image and a mechanism for inserting the image into the data stream prior to transmission thereof.
In the illustrative embodiment, a third arrangement is provided for receiving and decoding the data stream to extract the image, and a fourth arrangement is included for displaying the image while the audio signal is output. The fourth arrangement may be a digital radio, i.e., a radio adapted to process digital signals, such as a satellite radio, high definition radio, digital AM or other suitable primarily audio wireless communication system.
In a specific embodiment, the invention includes an arrangement for automatically inserting a selected image in the stream. In the illustrative embodiment, this arrangement includes a source selector, an image editor coupled to the source selector, an image resizer, an image compressor, and an arrangement for allowing a user to add text, color, style and/or other information to an image output by the compressor. The invention further includes an arrangement for adding images from an archive to the stream and a graphical picture show composer for providing an image queue.
An image server is included for feeding the image queue to the output data stream provided by a system server such as an XM or Sirius satellite radio server.
a is a block diagram of an illustrative embodiment of the system for sending images via wireless audio channels in accordance with the present teachings.
b is a block diagram which shows an arrangement for multiplexing the outputs of several Picture Show Studios in accordance with an illustrative embodiment of the present teachings.
a is a diagram showing a typical message packet output by the multiplexer in accordance with an illustrative embodiment of the present invention.
b shows a message sync packet in accordance with an illustrative embodiment of the present invention.
c shows a composite audio data service component packet adapted for wireless transmission with image data in accordance with the present teachings.
a) is a diagram showing an over the air payload channel adapted for use in connection with satellite digital audio radio service transmission of Rolling Images in accordance with the present teachings.
b) is a diagram showing a typical audio payload channel as conventionally disposed within an over the air payload channel adapted for the transmission of audio data in accordance with conventional teachings.
c) is a diagram showing an expansion of the Payload Channel that contains a Service Component adapted for the transmission of image data in accordance with an illustrative embodiment of the present teachings.
d) is a diagram showing an expansion of the Service Component Control Field (SCCF) of the Payload Channel depicted in
e is a diagram showing the TSCC (Time Slot Control Channel) in accordance with an illustrative embodiment of the present teachings.
f is a diagram showing an association of picture shows to audio services and insertion into a transport layer in accordance with an illustrative embodiment of the present teachings.
g is a diagram showing an association of picture shows to audio services by labels and insertion into a transport layer in accordance with an alternative embodiment of the present teachings.
a is a flow diagram of an illustrative implementation of software stored on physical media (not shown) and executed by the controller of
b is a flow diagram of an illustrative implementation of software stored on physical media (not shown) and executed by the controller of
While the present invention is described herein with reference to illustrative embodiments for particular applications, it should be understood that the invention is not limited thereto. Those having ordinary skill in the art and access to the teachings provided herein will recognize additional modifications, applications, and embodiments within the scope thereof and additional fields in which the present invention would be of significant utility.
a is a block diagram of an illustrative embodiment of the system for sending images via wireless audio channels in accordance with the present teachings. As shown in
Selected images from the cameras may also be sent to a video recorder for storage. Images from a digital camera 36 and stock images may be stored offline (via storage elements 34 and 40). The browser 20 allows a user to select individual images to be streamed in accordance with the present teachings. The storage elements may also supply video from a camcorder 42 or a source of stock video 44. The stored video may be edited via a player/editor 46 and individual frames therefrom may be captured manually via a manual frame capture system 22.
Images from the selected source are output by the source selector 12 to an image editor 48. The image is then sized for a desired format by an image resizer 50. Current typical image sizes range from 130 by 130 pixels to 176 by 220 pixels. In the illustrative embodiment, the images are formatted in accordance with a compressed JPEG format at 176 by 220 pixels.
In practice, the editor 48 and resizer 50 are implemented in software running on a microprocessor in response to inputs from a user via the interface 14. The image is then compressed by a conventional image compressor 52. As is known in the art, the image compressor 52 may be implemented in software or in hardware. The output of the compressor is input to a routine or element 54 adapted to effect picture message composition with additional input from a user with respect to text, color, time duration, etc. The composite image is fed to a real-time graphical picture show composer 60. The picture show composer allows a user to arrange the output order of the images in the queue. The images are output via a picture show server 62. These images may be stored in an archive 64 and selected for output via a scheduler 66 and the multiplexer 58. The multiplexer 58 allows a user to select, via the user interface 14, between a currently composed show output by the Real-time Graphical Picture Show Composer 60 and a program stored in the archive 64.
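By way of illustration only, the following sketch shows one way the image-preparation path described above (editor, resizer, compressor, and picture message composition) might be realized in software; the PictureMessage container, its field names, and the use of the Pillow library are assumptions and are not prescribed by the present teachings.

```python
from dataclasses import dataclass
from io import BytesIO

from PIL import Image  # Pillow, used here only to illustrate resizing and JPEG compression


@dataclass
class PictureMessage:
    jpeg_bytes: bytes   # compressed 176 x 220 image (illustrative size taken from the text above)
    caption: str        # user-supplied text
    style: str          # user-supplied color/style hint
    duration_s: float   # how long the image should remain on screen


def compose_picture_message(path: str, caption: str, style: str, duration_s: float) -> PictureMessage:
    img = Image.open(path).convert("RGB")        # image editor (48): load and normalize
    img = img.resize((176, 220))                 # image resizer (50)
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=75)     # image compressor (52)
    return PictureMessage(buf.getvalue(), caption, style, duration_s)  # composition (54)
```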
The system 10 is adapted for use with a baseline wireless audio system 70. In the illustrative embodiment, the baseline audio system 70 is a satellite digital audio radio service (SDARS) system without imaging support. However, those of ordinary skill in the art will appreciate that the present teachings are not limited to the baseline system shown; the invention may be used with other baseline systems without departing from the scope of the present teachings.
In the illustrative embodiment, the baseline system 70 includes numerous audio only channels 72 and a first audio channel 71 to which transmitted image (visual) data (from 69) is synchronized and associated.
Each audio channel includes a Source 73, a Song/Program Scheduler 79, and an Audio Encoder 85. The Source 73 can be a stored database of audio songs/programs or can be an audio stream provided by some external audio source. Along with each song/program in the source is PAD data (Program Associated Data). The common PAD info provided by the Baseline (Audio Only) system is Artist Text Labels and Song Text Labels (low bandwidth data). Specific Songs/Programs from the Source are scheduled and requested for transmission by the Song/Program Scheduler 79. Upon a specific Song/Program being requested by the Song/Program Scheduler for transmission, the Source 73 begins output of the audio data along with the PAD data for this Song/Program. The audio data is sent to the Audio Encoder 85 which compresses the audio data to minimize over-the-air bandwidth usage.
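For illustration, a minimal sketch of the per-channel data flow just described (compressed audio accompanied by its Program Associated Data) is shown below; the class and field names are assumptions rather than structures defined by the baseline system.

```python
from dataclasses import dataclass


@dataclass
class PadData:
    artist_label: str    # Artist Text Label (low-bandwidth PAD)
    song_label: str      # Song Text Label (low-bandwidth PAD)


@dataclass
class ScheduledProgram:
    audio_frames: bytes  # compressed output of the Audio Encoder (85)
    pad: PadData         # carried alongside the audio to the Service Layer
```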
The Service Layer 91 receives the output of each Audio Encoder (along with PAD data) and also receives the output of other non-audio Service Types, such as the Image Data from the Data Server 69. These inputs are the Service Components. The Service Layer generates the Payload Channels that carry the Service Components. The Payload Channel is shown in
The Transport Layer 93 receives the Payload Channels from the Service Layer 91. The Transport Layer applies forward error correction, data interleaving and multiplexing of the Payload Channels into the 432 msec Frame shown in
The Physical Layer 95 receives input from the Transport Layer 93. The Physical Layer 95 defines the physical transport signal including modulation. The User Interface 14 programs the Song/Program Scheduler 79 (for each Audio Channel) so that the desired sequence of Songs/Programs is played at the desired times. The same User Interface 14 also programs the Image Show Scheduler 66 so that desired Picture/Image Shows are scheduled to begin at the desired times, and are thus synchronized with the associated audio channel Song/Program. This method of audio-to-image synchronization is depicted in
Another method of audio-to-image synchronization is as follows: the User Interface 14 programs the Image Show Scheduler 66 to start playing specific Picture/Image Shows whenever the audio source outputs a specific PAD data pattern, i.e., specific Artist and/or Song Text Labels. The Image Show Scheduler 66 monitors the PAD output of the specified audio source for the specified Artist and/or Song Labels. When matching Labels are detected, the Image Show Scheduler 66 starts playing the specified Picture/Image Show. This method of audio-to-image synchronization is depicted in
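The following sketch illustrates this second, PAD-driven method of synchronization: a scheduler watches the PAD labels of an audio source and starts the associated picture show when a programmed Artist/Song pattern appears. The trigger table and the start_show callback are hypothetical names introduced only for this example.

```python
from typing import Callable, Dict, Tuple

Trigger = Tuple[str, str]  # (Artist label, Song label) pattern to match in the PAD stream


class ImageShowScheduler:
    """Simplified stand-in for the Image Show Scheduler (66) in PAD-triggered mode."""

    def __init__(self, start_show: Callable[[str], None]):
        self.start_show = start_show              # hypothetical hook into the picture show server
        self.triggers: Dict[Trigger, str] = {}    # PAD pattern -> Picture Show ID

    def program_trigger(self, artist: str, song: str, show_id: str) -> None:
        """Programmed via the User Interface (14)."""
        self.triggers[(artist, song)] = show_id

    def on_pad_update(self, artist: str, song: str) -> None:
        """Called whenever the monitored audio source emits new PAD labels."""
        show_id = self.triggers.get((artist, song))
        if show_id is not None:
            self.start_show(show_id)              # start the associated Picture/Image Show
```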
In
b is a block diagram that shows an arrangement for multiplexing the outputs of several Picture Show Studios in accordance with an illustrative embodiment of the present teachings. In
The images output by the picture show server are inserted into the digital data stream as illustrated in
a is a diagram showing a typical rolling image message packet 80 output by the multiplexer 58 in accordance with an illustrative embodiment of the present invention. The packet 80 includes a message identifier (ID) 82 and a message payload 84. As discussed more fully below, the message packets 80 include both Rolling Image content-type messages (messages containing images) and Rolling Image control-type messages (messages that associate and synchronize a series of Rolling Image content-type Messages to existing Audio Services).
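As a minimal illustration of the message packet 80 (a message identifier followed by a message payload), the sketch below packs and unpacks such a packet; the 16-bit identifier width and the identifier values used to distinguish content-type from control-type messages are assumptions, not values defined by the present teachings.

```python
import struct

MSG_ID_CONTENT = 0x0001   # hypothetical: Rolling Image content-type message (carries image data)
MSG_ID_CONTROL = 0x0002   # hypothetical: Rolling Image control-type message (association/synchronization)


def pack_message(message_id: int, payload: bytes) -> bytes:
    """Prefix the payload (84) with a 16-bit big-endian message ID (82)."""
    return struct.pack(">H", message_id) + payload


def unpack_message(packet: bytes) -> tuple:
    """Return (message_id, payload) from a received message packet."""
    (message_id,) = struct.unpack(">H", packet[:2])
    return message_id, packet[2:]
```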
b shows a message sync packet 85 in accordance with an illustrative embodiment of the present invention. As illustrated in
c shows a stream of Image Message Sync Packets that are segmented into Data Service Component Packets (DSCP) in accordance with an illustrative embodiment of the invention. The DSCP consists of a DSCP Packet Header, a Payload (containing a complete or partial segment of an Image Message Sync Packet), and a DSCP CRC word for error checking. The DSCP Packet Header contains a DSCP Sync Word and a Packet Length field. The data service component packet stream 100 is output by the wireless radio server 69. In the best mode, the wireless radio server 69 is an XM satellite radio server. However, the invention is not limited thereto. The server may be adapted for digital AM, High Definition Radio, AM, FM, Satellite Radio (e.g., Sirius) or other wireless communication technology without departing from the scope of the present teachings.
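The following sketch illustrates segmenting an Image Message Sync Packet into Data Service Component Packets, each with a header (sync word and packet length), a payload segment, and a CRC word; the sync word value, field widths, and the CRC-32 used here are assumptions for illustration only.

```python
import struct
import zlib

DSCP_SYNC_WORD = 0xA55A    # hypothetical 16-bit sync word
DSCP_MAX_PAYLOAD = 256     # hypothetical maximum payload segment size in bytes


def segment_into_dscp(image_message_sync_packet: bytes) -> list:
    """Split one Image Message Sync Packet into a list of DSCPs (header + payload + CRC)."""
    dscps = []
    for i in range(0, len(image_message_sync_packet), DSCP_MAX_PAYLOAD):
        payload = image_message_sync_packet[i:i + DSCP_MAX_PAYLOAD]
        header = struct.pack(">HH", DSCP_SYNC_WORD, len(payload))       # sync word + packet length
        crc = struct.pack(">I", zlib.crc32(header + payload))           # DSCP CRC word
        dscps.append(header + payload + crc)
    return dscps
```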
a) is a diagram showing an over the air payload channel 200 adapted for use in connection with satellite digital audio radio service transmission of Rolling Images in accordance with the present teachings. Sync pulses 202 and 204 allow a receiver to synchronize with the bit stream. In a conventional Satellite Digital Audio Radio Service (SDARS) system, each Service Component, identified by a Service ID, normally carries the audio data stream only for one User Channel/Station as depicted in
b) is a diagram showing a typical audio payload channel as conventionally disposed within an over the air payload channel adapted for the transmission of audio data in accordance with conventional teachings. In this case, the Service Component is of Type=‘Audio’. However, a Service Component Type can also be defined as ‘Transparent Data’. In that case, general applications (non-audio, typically data oriented) can utilize the Transparent Data Service Component for the communication of their application-specific data. These applications apply some form of message-formatting layer on top of the Transparent Data to coordinate and synchronize the transfer of application information from an information source to the radio receivers. In a conventional SDARS system, these general applications, carried over Service Components of type Transparent Data, are unrelated and asynchronous to the Audio Services (Audio Channels carried over Service Components of Type Audio). In contrast, the present invention implements a visual application that utilizes the Transparent Data Service Component for communication and that is associated with and synchronized to the existing Audio Services by the methods described herein.
c) is a diagram showing a Payload Channel 210 that contains a Service Component adapted for the transmission of image data in accordance with an illustrative embodiment of the present teachings. As depicted in
d) is a diagram showing an expansion of the Service Component Control Field (SCCF) of the Payload Channel depicted in
e) is a diagram showing the TSCC (Time Slot Control Channel) in accordance with an illustrative embodiment of the present teachings. The TSCC 211 is juxtaposed between sync pulses 202 and 204. The TSCC 211 provides information to a receiver as to how the slots are allocated in the channel 200. The remaining slots are currently allocated for audio and control. The TSCC 211 includes all information necessary to de-multiplex the bit stream into Payload Channels 220 and 210 (
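By way of illustration, the sketch below shows how a receiver might use TSCC-style slot assignments to de-multiplex a frame's time slots into per-Payload-Channel byte streams; the slot-map representation is an assumption, as the actual TSCC encoding is defined by the SDARS air interface.

```python
from collections import defaultdict
from typing import Dict, List, Sequence


def demux_frame(slots: Sequence[bytes], slot_to_channel: Dict[int, int]) -> Dict[int, bytes]:
    """slot_to_channel maps a time-slot index to a Payload Channel ID (e.g. 210 or 220)."""
    channels: Dict[int, List[bytes]] = defaultdict(list)
    for index, slot_data in enumerate(slots):
        channel_id = slot_to_channel.get(index)
        if channel_id is not None:               # slots not described by the map are skipped here
            channels[channel_id].append(slot_data)
    return {cid: b"".join(parts) for cid, parts in channels.items()}
```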
As shown in
f is a diagram showing an association of picture shows to audio services and insertion into a transport layer in accordance with an illustrative embodiment of the present teachings. As shown in
As illustrated in
As discussed more fully below, in an alternative embodiment, the invention associates images to Artists/Song Labels by means of the Picture Show Label Reference Message 102 as shown in
g is a diagram showing an association of picture shows to audio services by labels and insertion into a transport layer in accordance with an alternative embodiment of the present teachings. In the baseline (audio-only) system, Artist and Song Labels are communicated and associated with the currently playing audio streams by means of the Artist/Song Label Messages. The Artist/Song Label Messages are transmitted in the Broadcast Information Channel (BIC) 101 that is shown in
f and 3g also show the Image Data Messages, the Picture Show Label Reference Message 102, and the Picture Show Service Reference Message 98 all being time multiplexed into the same Transparent Data Service Component. In this manner, all the image data associated with multiple Audio Service Components may be time multiplexed within this single Data Service Component in order to statistically multiplex the instantaneous image data throughput demands of each Picture Show. This enables optimal image frame rates for a fixed allocation of bandwidth.
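The following sketch illustrates the idea of time multiplexing the image messages of several picture shows into a single Transparent Data Service Component so that they statistically share one fixed bandwidth allocation; the simple round-robin policy shown is illustrative only.

```python
from collections import deque
from typing import Deque, Dict, Iterator


def multiplex_shows(show_queues: Dict[str, Deque[bytes]]) -> Iterator[bytes]:
    """Yield queued messages from each picture show in turn until every queue is empty."""
    while any(show_queues.values()):
        for queue in show_queues.values():
            if queue:
                yield queue.popleft()   # next Image Data or reference message for this show


# Example: two picture shows sharing one Transparent Data Service Component.
stream = list(multiplex_shows({"show_a": deque([b"a1", b"a2"]), "show_b": deque([b"b1"])}))
# stream == [b"a1", b"b1", b"a2"]
```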
In the illustrative implementation, the payload channel is transmitted over the air via a satellite network 300 such as that depicted in
The image decoder 426 converts the data into a format suitable for display and outputs the signal to a display 436. The decoder also interfaces with a memory 428. The memory 428 is provided to allow the user to store audio and image data in response to input from a user via an interface 430. The decoder 426 stores and retrieves images from memory as appropriate based on the signals decoded from the channel. The memory also provides a means of storing images that are part of a Picture Show and that are transmitted at a slower rate than the rate intended for playback. The receiver 400 caches all these images to the memory 428 over a relatively long period of time. After all of the images of a Picture Show are acquired, the Picture Show may then be displayed on user request or based on signals decoded from the channel. When the image update period approaches 66 milliseconds (15 frames per second) or less, the Picture Show may be categorized as a ‘video’ application. As such, those skilled in the art will appreciate that the inventive system is enabled to advantageously transmit and display video information over conventionally audio only channels as well as a slide show per se.
As shown in
Next, at steps 516 and 518, the software 500 performs a CRC check. If the packets pass the CRC check (i.e., no bit errors), then at step 520, the state is checked. If the system is still in the ‘sync search’ state, then at step 522 the system searches the application packet for a sync word. If the sync word is not found (step 524) or if the CRC check fails at step 518, then the system returns to step 510 and continues with the message buffer reset. If the sync word is found at step 526, then the state is set to ‘FoundSync’ and, at step 528, the message length field following the sync word is read. At step 530, the application packet payload is appended to the message buffer. Next, at step 532, the accumulated message length is updated. These steps serve to accumulate application packets to generate a complete message. At step 534, the system checks to determine if the accumulated message length is equal to the total message length. If not, then it retrieves the next packet at step 514. If so, the message accumulation process is complete (step 536) and a CRC check is performed on the message (step 538).
At step 540, if the CRC check on the message fails, the message buffer is reset and the code returns to step 510 to retrieve the next data application packet. If, however, at step 540 the CRC check on the message passes, then at step 542, the Rolling Image message is parsed.
Next, at step 544, the system checks to determine if the Rolling Image message is a content type message. If so, then at step 546, the message is image data and is stored in a Content Buffer. Then, at step 548, the Picture Show Player is informed of receipt of a new Content message.
If at step 544 the system determines that the message is a control type message, then at step 550 the message is stored in a Control Information table. At step 552, the Picture Show Player is informed of receipt of new control information.
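A condensed sketch of the receive-side flow of steps 510 through 552 is given below: application packet payloads are accumulated into a complete Rolling Image message, the message CRC is verified, and the message is routed to either a content buffer or a control-information table. The per-packet CRC check of steps 516 and 518 is omitted for brevity, and the sync word value, field widths, and message-type identifier are assumptions for illustration.

```python
import zlib

SYNC_WORD = b"\xA5\x5A"   # hypothetical message sync word


def accumulate_and_dispatch(packets, content_buffer, control_table):
    state, message, total_len = "SyncSearch", bytearray(), 0
    for payload in packets:                      # each item: one application-packet payload
        if state == "SyncSearch":
            pos = payload.find(SYNC_WORD)
            if pos < 0:                          # no sync word: reset and keep searching (step 510)
                message.clear()
                total_len = 0
                continue
            state = "FoundSync"                  # step 526
            total_len = int.from_bytes(payload[pos + 2:pos + 4], "big")  # length field (step 528)
            payload = payload[pos + 4:]
        message += payload                       # append payload to message buffer (step 530)
        if len(message) < total_len:
            continue                             # message incomplete: fetch next packet (step 514)
        body = bytes(message[:total_len])        # accumulation complete (step 536)
        message.clear()
        state = "SyncSearch"
        if zlib.crc32(body[:-4]) != int.from_bytes(body[-4:], "big"):
            continue                             # message CRC failed (step 540): start over
        msg_id, msg_payload = body[0], body[1:-4]
        if msg_id == 0x01:                       # hypothetical content-type message ID
            content_buffer.append(msg_payload)   # step 546: store image data
        else:
            control_table[msg_id] = msg_payload  # step 550: store control information
```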
a is a flow diagram of an illustrative implementation of software stored on physical media (not shown) and executed by the controller 422 of
Here, at step 610, the current picture show control information (i.e., from the reference message 98) is procured. Next, at step 612, the current service ID for the current user selected audio channel is obtained. Then, at step 614, the system checks to determine if Picture Show mode is enabled on the current service ID. If so, at step 616, the system 422 acquires a picture show ID from the reference message 98. If not, then the system returns to a wait state before step 610.
Next, at step 618, the system checks to determine whether a new Picture Show Image message is received with the Picture Show ID acquired in step 616. Here, the system is checking the image message buffer for an image with the Picture Show ID identified in step 616. If so, then the image is acquired in step 620, decoded in step 622 and displayed in step 624. If not, then the system returns to a wait state before step 610.
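For illustration, the following sketch mirrors the player loop of steps 610 through 624: the control information for the currently selected audio service is consulted, and when a picture show is enabled for that Service ID, newly received images carrying the matching Picture Show ID are fetched, decoded, and displayed. The radio, image_buffer, and display interfaces are hypothetical.

```python
import time
from io import BytesIO

from PIL import Image  # stands in for the image decoder 426 in this sketch


def picture_show_player(radio, image_buffer, display):
    """Loop corresponding to steps 610-624; radio, image_buffer and display are assumed interfaces."""
    while True:
        control = radio.current_picture_show_control()   # step 610: contents of reference message 98
        service_id = radio.current_audio_service_id()    # step 612: currently selected audio channel
        show_id = control.get(service_id)                 # steps 614/616: None if show mode not enabled
        if show_id is not None:
            image = image_buffer.pop_new_image(show_id)   # steps 618/620: new image for this show?
            if image is not None:
                display.show(Image.open(BytesIO(image)))  # steps 622/624: decode and display
                continue
        time.sleep(0.1)                                   # wait state before returning to step 610
```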
b is a flow diagram of an illustrative implementation of software stored on physical media (not shown) and executed by the controller of
Details of an Illustrative Rolling Images Messaging Layer
In the illustrative embodiment the system utilizes the following messages and message structure to implement Rolling Image Services.
Messages for Long Duration Rolling Images Implementation
Picture Show Image Message
Picture Show Instance Info Message
Picture Show Static Info Message
Picture Show Service Reference Message
Picture Show Category Reference Message
Of note in the above Picture Show Image Message structure is the Image Transition Effect field. This field specifies to the receiver 400 the image transition effect to apply between the currently displayed Picture Show Image and the newly received Picture Show Image that will be displayed next.
The present teachings have disclosed the delivery of visual/image data and methods of associating and synchronizing this image data with Audio Channels in connection with an existing conventional radio system. In the primary use case, this visual/image data is updated relatively slowly to minimize bandwidth usage in audio-centric systems, e.g., a 5 second image update period for a Picture Show. However, as mentioned above, using the same methods described here and with additional bit rate allocated to the utilized Transparent Data Service Component, faster image updates may be implemented. When these image update periods approach 66 msec (15 frames/sec), a video application is achieved using the same methods described. That is, those skilled in the art will appreciate that the inventive system is enabled to advantageously transmit and display video information over conventionally audio only channels as well as a slide show per se.
Thus, the present invention has been described herein with reference to a particular embodiment for a particular application. Those having ordinary skill in the art and access to the present teachings will recognize additional modifications, applications and embodiments within the scope thereof.
It is therefore intended by the appended claims to cover any and all such applications, modifications and embodiments within the scope of the present invention.