Compressed image data may be transmitted in packets. During transmission, some packets may be lost, reducing the quality of the image.
In the example illustrated, link 20 is configured to transmit the streams of compressed image data in a lossy environment. A lossy environment is one, such as a wireless link or a non-quality-of-service (QoS) wired protocol, that may be prone to lost data. The lost data directly contributes to image quality degradation, which results, for example, in the displayed image flickering or including undesired video artifacts, rendering the video product unacceptable to viewers. In low-latency video applications, such degradation is exacerbated when the data is compressed into transmission packets to permit transmission in a real-time lossy-link environment having bandwidth constraints, because each packet containing compressed data is used for decoding a large amount of imagery. Depending on the particular compression technique that is used, significant image quality degradation may occur when even a single transmission packet is lost. As will be described hereafter, link 20 includes components, devices or one or more processing units that analyze the compressed data stream to determine logical boundaries of segments and selectively parse the data stream into packets in a manner so as to reduce the number of partial logical segments in individual packets. As a result, link 20 reduces the impact of a lost packet to enhance image quality in a lossy transmission environment.
As shown by
Transmitter module 30 is configured to transmit streams of image data to receiver module 32. In the example illustrated, transmitter module 30 and receiver module 32 form a wireless real-time high-resolution image link. In the example illustrated, transmitter module 30 and receiver module 32 provide a high-speed radio link with data compression and low end-to-end delay, using spatial compression methods and little or no data buffering.
Transmitter module 30 includes input interfaces or ports 42, 44, computer graphics decoder 46, video decoder 48, spatial compressor 50, packetizer 52 and transmitter 54. Input interface or port 42 connects graphics source 22 to graphics decoder 46 of module 30. In one embodiment, input port 42 may comprise a presently available wired connector, such as, but not limited to, a Video Electronics Standards Association (VESA) 15-pin D-sub, Digital Video Interface (DVI), or DisplayPort connector. In such an embodiment, incoming computer graphics data is first decoded into uncompressed digital computer graphics data by computer graphics decoder 46. Computer graphics decoder 46 may comprise a presently available hardware decoder, such as an AD9887A decoder device from Analog Devices of Norwood, Mass. In other embodiments, input port 42 and decoder 46 may comprise other presently available or future developed devices or may have other configurations.
Input port 44 connects video source 24 to video decoder 48 of module 30. In one embodiment, port 44 may comprise a presently available wired connector, such as, but not limited to, a composite video connector, component video connector, Super-Video (S-Video) connector, Digital Video Interface (DVI) connector, High-Definition Multimedia Interface (HDMI) connector or SCART connector. In such an embodiment, incoming video data is first decoded into uncompressed digital video data by video decoder 48. Video decoder 48 may comprise a presently available hardware decoder, such as an ADV7400A decoder device for an analog input from Analog Devices of Norwood, Mass. or a Sil9011 decoder device for DVI/HDMI inputs from Silicon Image of Sunnyvale, Calif. In other embodiments, input port 44 and decoder 48 may comprise other presently available or future developed devices or may have other configurations.
As indicated by broken lines, in other embodiments, transmitter module 30 may be embedded with one or both of computer graphics source 22 or video source 24. In those embodiments in which module 30 is embedded with computer graphics source 22, input port 42 may be replaced with a presently available digital interface 42′ such as a 24-bit or a 30-bit parallel data bus which provides uncompressed digital computer graphics data directly to spatial compressor 50. In such an embodiment, computer graphics decoder 46 may be omitted.
In those embodiments in which module 30 is embedded with video source 24, input port 44 may be replaced with an interface 44′ configured to transmit a presently available digital video format, such as an ITU-R BT.601 or ITU-R BT.656 format, which provides uncompressed digital video data directly to spatial compressor 50. Examples of other formats include, but are not limited to, 480i, 576i, 720p, 1080i and 1080p. In such an embodiment, video decoder 48 may be omitted. In other embodiments, interfaces 42′ and 44′ may comprise other presently available or future developed interfaces.
Spatial compressor 50 comprises a presently available or future developed device or component configured to compress the digital computer graphics data or the video data using a presently available or future developed spatial data compression algorithm. In one embodiment, spatial compressor 50 utilizes a JPEG 2000 wavelet compression algorithm as supplied by LuraTech, Inc. of San Jose, Calif. Spatial compressor 50 operates on a full frame of incoming data, one field at a time, to minimize delay to one field of video data or one frame of computer graphics data. As a result, the output of spatial compressor 50 is sequential frames of compressed computer graphics data or fields of compressed video data.
Packetizer 52 comprises one or more devices, electronic components or processing units configured to create smaller information units out of the compressed data. Such smaller units may comprise, for example, commands, data, status information and other information, from each frame of compressed data, which is of a larger size (for example, 10,000 bytes). As will be described in more detail hereafter, packetizer 52 analyzes the compressed data stream to determine logical boundaries of segments and selectively parses the data stream into packets in a manner so as to reduce the number of partial logical segments in individual packets. Such smaller information units are packets of data passed as synchronous transfers to transmitter 54.
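By way of illustration only, the boundary analysis performed by packetizer 52 can be sketched as follows. This is a hypothetical Python sketch, not the actual implementation; it assumes, purely for illustration, that each logical segment in the compressed stream is preceded by a 4-byte big-endian length field.

```python
import struct

def find_segments(stream: bytes):
    """Walk a compressed data stream and return its logical segments.
    Assumes (hypothetically) that each segment is preceded by a 4-byte
    big-endian length field marking its boundary."""
    segments, offset = [], 0
    while offset < len(stream):
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        segments.append(stream[offset:offset + length])
        offset += length
    return segments

# Build a toy stream of three segments and recover their boundaries.
parts = [b"segment-1", b"segment-2-data", b"segment-3"]
stream = b"".join(struct.pack(">I", len(p)) + p for p in parts)
assert find_segments(stream) == parts
```

In a real codestream the boundaries would instead come from the compression format's own markers, but the principle is the same: the packetizer must be able to locate where one logical segment ends and the next begins before deciding how to fill packets.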
Transmitter 54 is a component, device or one or more processing units configured to transmit compressed and packetized data from module 30 to module 32 in a lossy environment. According to the example embodiment illustrated, transmitter 54 is configured to transmit the compressed and packetized data wirelessly to module 32. In one embodiment, transmitter 54 is an ultra-wideband (UWB) radio transmitter. In such an embodiment, the UWB radio transmitter has a transmission range of up to, for example, but not limited to, 30 feet. The data rate of transmitter 54 may be in the range of, for example, but not limited to, 110 to 480 Mbps. In such an embodiment, transmitter 54 operates across a relatively large range of frequency bands (for example, 3.1 to 10.6 GHz) with negligible interference to existing systems using the same spectrum.
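As a rough worked example, combining the frame size noted above (on the order of 10,000 bytes) with the quoted 110 to 480 Mbps data rates gives a per-frame transmission time well under a millisecond:

```python
frame_bits = 10_000 * 8  # one compressed frame of roughly 10,000 bytes

for rate_mbps in (110, 480):
    seconds = frame_bits / (rate_mbps * 1_000_000)
    print(f"{rate_mbps} Mbps -> {seconds * 1000:.3f} ms per frame")
# 110 Mbps -> 0.727 ms per frame
# 480 Mbps -> 0.167 ms per frame
```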
Receiver module 32 receives the compressed and packetized stream of data from transmitter module 30 and manipulates or converts such data for use by either computer graphics display 26 or video display 28. Receiver module 32 includes receiver 60, depacketizer 62, spatial decompressor 64, computer graphics encoder 66, video encoder 68 and output interfaces or ports 70, 72. Receiver 60 comprises a component, device or other structure configured to receive the stream of compressed packetized data from module 30. In the particular example embodiment illustrated in which transmitter 54 is a wireless transmitter, receiver 60 is a wireless receiver. In the example embodiment illustrated, receiver 60 is an ultra wideband radio receiver configured to cooperate with transmitter 54 to receive the stream of data. In other embodiments, receiver 60 may have other configurations depending upon the configuration of transmitter 54. In still other embodiments, where data is transmitted from module 30 to receiver module 32 via electrical signals or optical signals through physical lines, transmitter 54 and receiver 60 may have other configurations or may be omitted.
Depacketizer 62 is a processing unit or a portion of a processing unit configured to receive the compressed and packetized data from receiver 60 and to reconstruct the compressed packetized data into compressed frames of computer graphics data or video data. During such reconstruction, depacketizer 62 detects and resolves any errors in the incoming packet data. For example, depacketizer 62 detects any packets that have been received twice and disposes of the redundant packets. In one embodiment, depacketizer 62 further detects any lost packets and replaces the lost data with, for example, zeroes or data from a previous frame. The compressed digital computer graphics data or the compressed digital video data is then fed to spatial decompressor 64.
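The duplicate and loss handling described above can be sketched as follows. This is a hypothetical illustration assuming each packet carries a sequence number, with lost packets replaced by zeroes (substituting data from a previous frame being the alternative noted above):

```python
def depacketize(packets, total_packets, packet_size):
    """packets: iterable of (sequence_number, payload) pairs, possibly
    containing duplicates and gaps. Redundant copies are discarded and
    missing packets are replaced with zeroes before reassembly."""
    seen = {}
    for seq, payload in packets:
        if seq in seen:
            continue  # packet received twice: dispose of the redundant copy
        seen[seq] = payload
    filler = bytes(packet_size)  # zero fill for a lost packet
    return b"".join(seen.get(i, filler) for i in range(total_packets))

# Packet 0 arrives twice, packet 1 is lost, packet 2 arrives once.
frame = depacketize([(0, b"ab"), (0, b"ab"), (2, b"ef")], 3, 2)
assert frame == b"ab\x00\x00ef"
```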
Spatial decompressor 64 comprises a presently available or future developed device, component or processing unit configured to decompress the digital computer graphics data or the video data using a presently available or future developed spatial data decompression algorithm. In one embodiment, spatial decompressor 64 utilizes a JPEG 2000 wavelet decompression algorithm as supplied by LuraTech, Inc. of San Jose, Calif. The stream of decompressed computer graphics data or video data is subsequently transmitted to computer graphics encoder 66 or video encoder 68, respectively, or directly to computer graphics display 26 or video display 28.
Computer graphics encoder 66 encodes the outgoing computer graphics data into a format suitable for transmission over output port 70. In one embodiment, encoder 66 is a presently available or future developed hardware encoder. Examples of a presently available computer graphics encoder include, but are not limited to, the Sil164 encoder device for a DVI output from Silicon Image of Sunnyvale, Calif. or the ADV7122 encoder device for analog output from Analog Devices of Norwood, Mass. In such an embodiment, output port 70 may comprise a presently available or future developed wired connector. Examples of such a presently available connector include, but are not limited to, a VESA 15-pin D-sub, DVI, or DisplayPort connector. In other embodiments, other encoders and connectors may be utilized.
Video encoder 68 encodes the outgoing video data into a format suitable for transmission over output port 72. In one embodiment, encoder 68 is a presently available or future developed hardware encoder. Examples of a presently available hardware encoder include, but are not limited to, the Sil9190 encoder device for DVI/HDMI output from Silicon Image of Sunnyvale, Calif. or the ADV7320 encoder device for an analog output from Analog Devices of Norwood, Mass. In such an embodiment, output port 72 is a presently available wired connector, such as, but not limited to, a composite video connector, a component video connector, an S-Video connector, DVI connector, HDMI connector or SCART connector. In yet other embodiments, other encoders and connectors may be utilized.
As indicated by broken line, in other embodiments, receiver module 32 may be incorporated as part of or embedded with one or both of computer graphics display 26 or video display 28. In such an embodiment, the decompressed image data may be transmitted directly from spatial decompressor 64 to one or both of display 26 or display 28, enabling one or both of encoder 66 or encoder 68 to be omitted. In those embodiments in which module 32 is embedded with display 26, port 70 may be replaced with port 70′ which may comprise a presently available 24-bit or 30-bit parallel data bus. In those embodiments in which module 32 is embedded with display 28, port 72 may be replaced with port 72′ which may comprise a presently available digital interface such as an ITU-R BT.601 or ITU-R BT.656 format. Examples of other formats include, but are not limited to, 480i, 576i, 480p, 720p, 1080i and 1080p. In other embodiments, ports 70′ and 72′ may have other configurations.
Although link 20 has been illustrated as having each of the aforementioned functional blocks as provided by one or more processing units and electronic componentry, in other embodiments, link 20 may be provided by other arrangements. Although link 20 has been described as having a single transmitter module 30 and a single receiver module 32, in other embodiments, link 20 may alternatively include a single transmitter module 30 and multiple receiver modules 32, multiple transmitter modules 30 and a single receiver module 32, or multiple transmitter modules 30 and multiple receiver modules 32.
According to one example embodiment, data stream 100 is compressed using a JPEG 2000 wavelet-based compression format. In such an embodiment, packetizer 52 may identify segment boundaries as those boundaries between information “layers”. Each information layer has sufficient data to form a complete image having a selected degree of quality or resolution. The quality or resolution of the display image will increase as more “layers” are transmitted and received. Partial “layers”, layers for which data was lost during transmission, may not be usable. In such an embodiment, packetizer 52 identifies the segment boundaries as those boundaries between such layers in the stream 100 of data being transmitted such that segments 102 each comprise one or more substantially complete layers of the compressed image. Although
As illustrated with packets 106F and 106G, in those circumstances where a segment 102 has a size that is greater than the size of a transmission packet 106, the particular segment is split across multiple transmission packets 106 with any remaining transmission packet capacity of the last packet 106 being unused. In the example illustrated, segment 6 is larger than the size of each of packets 106 such that segment 6 is split into segment portions 6a and 6b which are transmitted in packets 106F and 106G, respectively. That portion of packet 106G not taken up by segment portion 6b remains unused.
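The strict segregation behavior described above can be sketched as follows (a hypothetical illustration; the segment contents and packet size are arbitrary). Every segment starts a fresh packet, and only a segment larger than a packet is split, with the remainder of its last packet left unused:

```python
def strict_segregation(segments, packet_size):
    """Method 108 sketch: no packet ever carries parts of two different
    segments. An oversized segment is split across consecutive packets,
    and the remaining capacity of its last packet stays unused."""
    packets = []
    for seg in segments:
        # Each segment begins at a packet boundary.
        for i in range(0, len(seg), packet_size):
            packets.append(seg[i:i + packet_size])
    return packets

# Segment b"cccccc" exceeds the 4-byte packet size, so it is split into
# portions b"cccc" and b"cc"; the other segments each get their own packet.
assert strict_segregation([b"aa", b"cccccc", b"dd"], 4) == [
    b"aa", b"cccc", b"cc", b"dd"]
```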
The packetization method illustrated by
According to the method 208 shown in
As illustrated with packets 206C and 206D, in those circumstances where a segment 102 has a size that is greater than the size of a transmission packet 206, the particular segment is split across multiple transmission packets 206 with any remaining transmission packet capacity of the last packet 206 being unused. In the example illustrated, segment 6 is larger than the size of each of packets 206 such that segment 6 is split into segment portions 6a and 6b which are transmitted in packets 206C and 206D, respectively. The full capacity of packet 206C is utilized while that portion of packet 206D not taken up by segment portion 6b remains unused.
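A hypothetical sketch of this complete containment behavior: segments are appended to the open packet only when they fit entirely; a segment that would otherwise be split starts a new packet instead, and only a segment larger than a whole packet is actually split, with the remainder of its last packet left unused.

```python
def complete_containment(segments, packet_size):
    """Method 208 sketch: pack segments contiguously, but never split a
    segment that could fit whole in a single packet."""
    packets, current = [], b""
    for seg in segments:
        if len(seg) > packet_size:
            # Oversized segment: flush the open packet, then split the
            # segment; its last packet's remaining capacity stays unused.
            if current:
                packets.append(current)
                current = b""
            for i in range(0, len(seg), packet_size):
                packets.append(seg[i:i + packet_size])
        elif len(seg) <= packet_size - len(current):
            current += seg  # fits entirely alongside earlier segments
        else:
            packets.append(current)  # would be split, so start a new packet
            current = seg
    if current:
        packets.append(current)
    return packets

# b"ab" and b"c" share one packet; b"defgh" is oversized and split; b"ij"
# opens a fresh packet rather than sharing the split segment's tail.
assert complete_containment([b"ab", b"c", b"defgh", b"ij"], 4) == [
    b"abc", b"defg", b"h", b"ij"]
```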
As with the packetization method 108 shown in
As shown by
As shown by
In the example illustrated, QoS designated segments 1-3 are completely contained within packet 306A. QoS designated segment 4 is split into segment portion 4a which is contained within packet 306A and segment portion 4b which is contained within packet 306B along with QoS designated segment 5. Non-QoS designated segment 6, being larger than the size of packets 306, is split amongst packets 306B, 306C and 306D. Segment 6 is split into segment portion 6a which is appended to QoS designated segments 4 and 5 in packet 306B. Segment portion 6b fully utilizes the capacity of packet 306C. The remaining segment portion 6c is placed into packet 306D. In the example illustrated in
As with the packetization methods 108 and 208, packetization method 308 minimizes degradation of visual image quality resulting from the loss of one or more packets 306 in a lossy environment. In particular, the loss of any one packet 306 that has one or more logical segments 102 contained therein does not result in a loss of neighboring logical segments 102 that are contained in subsequent packets 306, since any segment or combination of segments 102 smaller than a transmission packet 306 is completely contained in a single packet 306 and is not allowed to cross boundaries of packets 306, unless appended to a QoS designated segment or where the particular segment 102 is larger than the packet size. Thus, the loss of a given packet 306 results in the loss of one or more complete segments 102 without affecting neighboring segments 102 of subsequent packets 306.
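One way to sketch this QoS-aware behavior (hypothetical; the packet size, the per-segment QoS flags, and the exact treatment of the tail of an oversized split are assumptions for illustration):

```python
def qos_packetize(segments, packet_size):
    """Method 308 sketch. segments: (data, is_qos) pairs. QoS designated
    segments pack contiguously and may be split across packet boundaries.
    A non-QoS segment that fits in one packet is never split; one larger
    than a packet is split, its first portion topping up the open packet
    and its last portion closing its own packet."""
    packets, current = [], b""
    for data, is_qos in segments:
        if is_qos:
            while data:  # may cross packet boundaries freely
                room = packet_size - len(current)
                current, data = current + data[:room], data[room:]
                if len(current) == packet_size:
                    packets.append(current)
                    current = b""
        elif len(data) > packet_size:
            room = packet_size - len(current)
            packets.append(current + data[:room])  # top up the open packet
            current, data = b"", data[room:]
            for i in range(0, len(data), packet_size):
                packets.append(data[i:i + packet_size])
        elif len(data) <= packet_size - len(current):
            current += data  # fits whole alongside earlier segments
        else:
            packets.append(current)  # never split a small non-QoS segment
            current = data
    if current:
        packets.append(current)
    return packets

# Two QoS segments split freely; the oversized non-QoS segment b"cccccc"
# tops up the open packet, then fills and closes packets of its own.
assert qos_packetize(
    [(b"aaa", True), (b"bbb", True), (b"cccccc", False), (b"d", False)],
    4) == [b"aaab", b"bbcc", b"cccc", b"d"]
```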
As shown by
In the example illustrated, low priority segments 1-3 are contiguously appended to one another and entirely contained within packet 406A. Because segment 4 is a low priority segment and because segment 4 cannot be “fit” within the remaining unused capacity of packet 406A, segment 4 is split into segment portions 4a and 4b. Segment portion 4a is contained within packet 406A while segment portion 4b is contained within the next successive transmission packet 406B. Since segment 5 is a high priority segment and since the strict segregation method 108 is being applied to such high priority segments, segment 5 is not appended to segment portion 4b in transmission packet 406B, but is placed in transmission packet 406C by itself. Alternatively, if the complete containment method 208 were applied to high priority segments, segment 5 would be contiguously appended to low priority segment portion 4b within transmission packet 406B, since segment 5 could be completely contained within packet 406B with segment portion 4b.
As illustrated with packets 406D and 406E, in those circumstances where a segment 102 has a size that is greater than the size of a transmission packet 406, the particular segment is split across multiple transmission packets 406. In the example illustrated, segment 6 is larger than the size of each of packets 406 such that segment 6 is split into segment portions 6a and 6b which are transmitted in packets 406D and 406E, respectively. Since the next successive segment, segment 7, is a low priority segment, segment 7 may be split. As a result, the remaining unused capacity of packet 406E is used to contain segment portion 7a of segment 7. The remainder of segment 7, segment portion 7b, is placed within packet 406F with low priority segment 8.
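The combined behavior of method 408 can be sketched as follows (hypothetical; this variant applies the strict segregation treatment to high priority segments, as in the example above, and lets low priority segments pack and split freely):

```python
def priority_packetize(segments, packet_size):
    """Method 408 sketch. segments: (data, high_priority) pairs. Low
    priority segments pack contiguously and may be split at packet
    boundaries; each high priority segment starts a fresh packet and
    shares it with no other segment (strict segregation)."""
    packets, current = [], b""
    for data, high in segments:
        if high:
            if current:
                packets.append(current)  # close any partially filled packet
                current = b""
            for i in range(0, len(data), packet_size):
                packets.append(data[i:i + packet_size])
        else:
            while data:  # low priority: fill and split freely
                room = packet_size - len(current)
                current, data = current + data[:room], data[room:]
                if len(current) == packet_size:
                    packets.append(current)
                    current = b""
    if current:
        packets.append(current)
    return packets

# Low priority segments share and split across packets; the high
# priority segment b"ddd" travels in a packet of its own.
assert priority_packetize(
    [(b"aa", False), (b"bbb", False), (b"ddd", True), (b"ee", False)],
    4) == [b"aabb", b"b", b"ddd", b"ee"]
```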
Overall, methods 108, 208, 308 and 408 shown and described with respect to
Although the present disclosure has been described with reference to example embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the claimed subject matter. For example, although different example embodiments may have been described as including one or more features providing one or more benefits, it is contemplated that the described features may be interchanged with one another or alternatively be combined with one another in the described example embodiments or in other alternative embodiments. Because the technology of the present disclosure is relatively complex, not all changes in the technology are foreseeable. The present disclosure described with reference to the example embodiments and set forth in the following claims is manifestly intended to be as broad as possible. For example, unless specifically otherwise noted, the claims reciting a single particular element also encompass a plurality of such particular elements.