A media player may receive a stream of multi-media information from a media server. For example, a content provider might deliver a stream that includes a high-definition audio/video program to a television, a set-top box, or a digital video recorder through a cable or satellite network in a multiplexed, multi-program stream optimized for the transport medium used for broadcasting. It may be convenient to navigate through this program, especially in a time-shifted viewing, recording, or playback mode. The navigation may be based on index data extracted from the audio/video program itself, from ancillary data streams, or from attributes associated with the transmitted data due to the nature of digital transmission. Moreover, it might be desirable to determine a location in the stream, or a time, associated with events in the program or in ancillary data streams that can be used later for navigation. For example, the location or time associated with an encryption key stored in the stream might be needed to facilitate a reverse playback of multi-media information to a viewer. As another example, picture and Group of Pictures (GOP) markers may be detected in the video stream to facilitate seek operations. Note that indexing information may be hidden deep within several layers of data encapsulation in a typical video system, and usually full demultiplexing and separation of the elementary data stream is required for indexing.
A media device may receive multi-media content, such as a television program, from a content provider. For example,
To efficiently deliver digital media content through the network 150, the media server 110 may encode information in accordance with an image processing process, such as a Moving Picture Experts Group (MPEG) process as defined by International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) document number 11172-1 entitled “Information Technology—Coding of Moving Pictures and Associated Audio for Digital Storage Media” (1993). Similarly, a High Definition Television (HDTV) stream might be encoded in accordance with the MPEG4 process as defined by ISO/IEC document number 14496-1 entitled “Information Technology—Coding of Audio-Visual Objects” (2001). As still another example, a stream might be encoded in accordance with the MPEG2 process as defined by ISO/IEC document number 13818-1 entitled “Information Technology—Generic Coding of Moving Pictures and Associated Audio Information” (2000).
The media server 110 may include a first encoder 114 that retrieves original image content from a first storage device 112 and generates a first Elementary Stream (ES1) of encoded multi-media image information. In some cases, multiple channels of media content are delivered concurrently in a time-multiplexed, packetized manner through the network 150. To accomplish that, the media server 110 may also include a second encoder 114A that retrieves original image content from a second storage device 112A and generates a second Elementary Stream (ES2). Although two encoders 114, 114A are illustrated in
Referring again to
Referring again to
The media recorder 120 may receive the TS and process the image information with a multi-media stream processing unit 122. Moreover, multi-media information might be recorded in a storage unit 124 (e.g., a memory or hard disk drive). A playback device 126 may access recorded multi-media information from the storage unit 124 to generate an output signal (e.g., to be provided to a video display and/or speakers).
In some cases, the media recorder 120 may need to determine a location or time associated with an event that has occurred in the ES, PES, and/or TS. For example, the location or time associated with the desired new GOP may need to be determined when a viewer wants to skip 30 seconds of a program. Note that a substantial amount of multi-media information may be recorded in the storage unit 124 (e.g., a recorded block might include several gigabytes of data). As a result, searching through the information to determine the location in the recorded data stream associated with a desired event may be impractical.
As before, a multi-media stream processing unit 522 processes and records multi-media information in a storage unit 524, and a playback device 526 may access the recorded multi-media information and generate an output signal (e.g., when playing recorded data for a user). To reduce the amount of recorded data and increase the maximum recording capacity, the multi-media stream processing unit 522 may extract the transport packets that belong to the program of interest and ignore (remove) all packets that are irrelevant to the selected program. This operation may be done at the transport packet level, without actually demultiplexing or interpreting the program's data.
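The packet-level filtering described above can be sketched as follows, assuming standard 188-byte MPEG-2 transport packets; the helper names and example PIDs are illustrative, not taken from any particular implementation.

```python
# Sketch of transport-packet-level filtering, assuming standard 188-byte
# MPEG-2 TS packets. Helper names and PID values are illustrative.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def packet_pid(packet: bytes) -> int:
    """Extract the 13-bit PID from bytes 1-2 of a TS packet header."""
    return ((packet[1] & 0x1F) << 8) | packet[2]

def filter_program(ts_data: bytes, wanted_pids: set) -> bytes:
    """Keep only packets whose PID belongs to the selected program,
    without demultiplexing or interpreting the payloads."""
    out = bytearray()
    for i in range(0, len(ts_data) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_data[i:i + TS_PACKET_SIZE]
        if packet[0] != SYNC_BYTE:
            continue  # skip packets that lost sync
        if packet_pid(packet) in wanted_pids:
            out += packet
    return bytes(out)
```

Because only the fixed header is examined, the cost per packet is constant regardless of how many programs share the multiplex.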
According to some embodiments, an index engine 528 is provided to detect events occurring in the ES while the ES is still encapsulated in the PES, and the PES is still encapsulated in TS packets, without explicitly demultiplexing the program. For example, the index engine may detect events as the multi-media information is being recorded. Moreover, the index engine 528 may store event information in an index storage or file 700. The playback device 526 may then use the stored event information to quickly access the required areas in the recorded stream and facilitate a later presentation of recorded multi-media content to a viewer. As a result, a TS demultiplexer with context indexing may detect, for example, picture and/or GOP markers for multiple concurrent independent programs without a complete demultiplexing of elementary streams.
At 602, a multi-media TS that encapsulates at least one PES is received, wherein the PES encapsulates at least one ES. For example, the TS might be delivered from a content provider to a media device through a network.
At 604, an event occurring within the ES is detected while the ES is encapsulated in the PES. The event might be detected, for example, by a hardware index engine. According to some embodiments, the index engine may be implemented using a programmable processor and associated firmware. Note that the event may be detected in parallel with an extraction of the ES from the PES. Moreover, the event may be detected using a first copy of the PES or TS after the ES has already been extracted from another copy of the PES or TS.
The event detected in the ES might, for example, be associated with a change in an image processing process or flow. For example, the event information might include an image processing process identifier, a GOP, or a GOP header. Other examples of events in the ES include a frame, a frame type, a frame header, a sequence, a sequence header, a slice, a slice header, a quantizer scale, a motion vector, a start of block, a picture width, a picture height, an aspect ratio, a bit rate, a picture rate, a bit pattern, a start bit pattern, and/or picture entropy parameters.
The ES event might also be associated with an encryption status change. For example, the event might indicate that an encryption protocol identifier or a decryption key has been detected.
According to some embodiments, the event may be associated with media content. In this case, the event might indicate that the ES includes a server identifier, a media content identifier, media content rating information, a program identifier, a program title, a program description, or program schedule information.
According to still other embodiments, the ES event is associated with viewer information. For example, the event might be associated with a viewer flag (e.g., a viewer might activate a button on a remote control to “bookmark” media content), a viewer preference, a viewer rule, or a viewer identifier.
A media device might, according to some embodiments, analyze information within the ES (while the ES is encapsulated in the PES) to infer the occurrence of an event. For example, heuristics might be applied to information associated with at least one motion vector or quantization coefficient to infer a scene change, a scene context, or a scene type.
The event is detected at 604 while the ES is still encapsulated in the PES. According to some embodiments, the event may be detected while the PES is still encapsulated in the TS. Moreover, according to some embodiments an event may be detected as a combined event spanning multiple encapsulation levels, such as the PES and ES levels. Examples of such events might include a Decoding Time Stamp (DTS), a Presentation Time Stamp (PTS), a stream identifier, a packet length, a PES header, or copyright information.
Similarly, according to some embodiments an event may be detected in the TS (instead of, or in addition to, the PES and/or ES). Examples of this type of event might include a Program Clock Reference (PCR), error information, a packet identifier, scrambling information, discontinuity information, priority information, splice information, or payload unit start information.
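Several of the TS-level fields listed above reside in the fixed 4-byte transport packet header. A minimal decoder, assuming the standard MPEG-2 TS header layout, might look like the following; the dictionary keys are illustrative names chosen to mirror the events mentioned above.

```python
# Sketch of TS-level event fields, following the standard 4-byte MPEG-2
# transport packet header layout. Key names are illustrative.

def parse_ts_header(packet: bytes) -> dict:
    """Decode the fixed 4-byte transport packet header."""
    return {
        "sync_ok": packet[0] == 0x47,
        "transport_error": bool(packet[1] & 0x80),     # error information
        "payload_unit_start": bool(packet[1] & 0x40),  # payload unit start
        "priority": bool(packet[1] & 0x20),            # priority information
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],  # packet identifier
        "scrambling": (packet[3] >> 6) & 0x03,         # scrambling information
        "adaptation_field": bool(packet[3] & 0x20),
        "has_payload": bool(packet[3] & 0x10),
        "continuity_counter": packet[3] & 0x0F,        # discontinuity checks
    }
```

A gap in the 4-bit continuity counter for a given PID, for example, is one way the discontinuity events mentioned above might be inferred.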
At 606, event information associated with the event is stored in an index storage. The index may be, for example, stored in a memory unit or a disk storage unit.
The table includes entries identifying events that have been detected. The table also defines fields 702, 704, 706, 708 for each of the entries. The fields specify: an event identifier 702, an event type 704, an event location 706, and event information 708. The information in the index 700 may be created and updated, for example, by the index engine 528 of
The event identifier 702 may be, for example, an alphanumeric code associated with an event that has been detected in the ES. The event type 704 might indicate a type of event that has been detected (e.g., a change in image processing or flow, an encryption-related event, or an event indicating a change in media content).
The event location 706 specifies the position where the event occurred within the ES. The event location 706 might be, for example, a time from a start of a recorded block or a time offset (e.g., from the last event). According to another embodiment, the event location 706 is a disk file pointer or offset. Similarly, an event memory location or location offset might define the location of an event (e.g., within a memory buffer).
The event information 708 might provide further information about the event that was detected. For example, the event information 708 might indicate which ES parameter was detected, that a new decryption key was received, or that a rating of a program has changed. Note that when the index 700 stores information about only one type of event, the event type 704 and event information 708 may not be required.
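As a rough sketch, an index entry carrying fields 702 through 708 might be represented as a simple record; the class and field names below are illustrative assumptions, not taken from any particular implementation.

```python
# Sketch of the index entries described above (fields 702-708). The class
# and field names are illustrative, not from any specific implementation.
from dataclasses import dataclass

@dataclass
class IndexEntry:
    event_id: str    # 702: alphanumeric event identifier
    event_type: str  # 704: e.g. "GOP_START", "KEY_CHANGE", "RATING"
    location: int    # 706: byte/disk offset (or time offset) in the recording
    info: str = ""   # 708: further event detail; optional when the index
                     #      stores only one type of event

index = []

def record_event(event_id, event_type, location, info=""):
    """Append one detected event to the index (e.g., as recording proceeds)."""
    index.append(IndexEntry(event_id, event_type, location, info))

record_event("E001", "GOP_START", 0)
record_event("E002", "KEY_CHANGE", 376, "new decryption key")
```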
At 804, event information is retrieved from the index 700, and the retrieved information is used to facilitate a presentation of the recorded multi-media information to the viewer at 806.
Consider, for example, an index that stores memory locations associated with GOP starts in the ES. When a user instructs the player to seek to a new portion of a program (e.g., by skipping ahead five minutes), a memory location containing an appropriate start of GOP might be retrieved from the index and used to quickly and efficiently construct an image to be provided to the viewer.
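A seek of this kind can be sketched as a search over (time, offset) pairs recorded in the index; the entry layout below is an assumption chosen for illustration.

```python
# Sketch of a seek helper: given index entries holding GOP-start offsets
# keyed by time, find the nearest GOP start at or before the requested time.
# The (time_seconds, byte_offset) entry layout is illustrative.
import bisect

def seek_to(gop_index, target_seconds):
    """Return the byte offset of the last GOP start at or before the target.
    gop_index is a list of (time_seconds, byte_offset) sorted by time."""
    times = [t for t, _ in gop_index]
    pos = bisect.bisect_right(times, target_seconds) - 1
    if pos < 0:
        pos = 0  # target precedes the first GOP: start from the beginning
    return gop_index[pos][1]
```

Because decoding must begin at a GOP boundary, snapping the seek target back to the preceding indexed GOP start lets the player construct a complete picture without scanning the recording.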
As another example, an index might store time values associated with encryption information. For example, different decryption keys might be required to decrypt different portions of a program. In this case, when a viewer instructs the player to seek to a different portion of a program (e.g., by rewinding thirty seconds), a time value associated with the appropriate decryption key might be retrieved from the index, and the time value may be used to quickly find the key required to descramble the content.
As another example, an index may store information associated with the media content of an ES along with viewer-introduced index information. For example, a viewer might flag a portion of a multi-media program, and a disk location associated with the nearest start of GOP for that portion might be stored in the index. When the viewer wants to return to the flagged portion, the information in the index may be retrieved and used for that purpose.
The de-multiplexer 902 may also provide Packet Identifier (PID) information to an indexing control unit 908. The PID information might include, for example, a program name. The indexing control unit 908 may also receive a current buffer pointer from the memory controller 904 and may monitor the data packets being provided to the system memory buffer 906. When the indexing control unit 908 detects that an event has occurred (e.g., based on the PID or bit patterns in the data packets being provided to the system memory buffer 906), it may store the current buffer pointer value in temporary storage 910 (e.g., to facilitate creation of an index).
Consider, for example, a case where the indexing control unit 908 is to monitor the data packets to detect when a start of picture event has occurred in the encapsulated ES. In accordance with an MPEG bit stream encoding, such an event can be detected by finding a unique sequence of bytes (the picture start code “00 00 01 00”) in the ES.
If another “00” is detected while in state 1030, the machine advances to the next state 1040 and now monitors the data packets to detect if a byte sequence of “01” is transferred. If something other than “01” is detected, the state machine returns to the initial idle state 1010. If a “01” is detected while in state 1040, the machine advances to the next state 1050 and now monitors the data packets to detect if a byte sequence of “00” is transferred. If something other than “00” is detected, the state machine returns to the idle state 1010.
If a “00” is detected while in state 1050, the hardware state machine has detected a byte sequence of “00 00 01 00,” and it generates a signal associated with the event index at state 1060 before returning to the idle state 1010 (e.g., to detect the next start of picture event). The generated signal might be associated with, for example, a mailbox, an interrupt, or some other notification process.
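The contiguous-stream case of this state machine can be sketched in software as follows, with states 1010 through 1060 corresponding to match positions 0 through 4. The fallback handling for runs of zero bytes is an addition beyond the states described above, so that a prefix such as “00 00 00 01 00” is not lost.

```python
# A minimal software sketch of the hardware state machine described above,
# scanning a contiguous byte stream for the picture start pattern.
PATTERN = bytes([0x00, 0x00, 0x01, 0x00])

def find_picture_starts(data: bytes):
    """Return the offsets at which the pattern "00 00 01 00" begins."""
    events = []
    state = 0  # 0 corresponds to the idle state 1010
    for offset, byte in enumerate(data):
        if byte == PATTERN[state]:
            state += 1
            if state == len(PATTERN):      # reached state 1060: signal event
                events.append(offset - 3)  # offset of the first pattern byte
                state = 1                  # trailing 00 may start a new match
        elif state == 2 and byte == 0x00:
            pass  # "00 00 00": still holding two zeros, stay in place
        elif byte == 0x00:
            state = 1  # a lone 00 may begin a new match
        else:
            state = 0  # return to idle (state 1010)
    return events
```

In hardware, the `events.append` step corresponds to generating the mailbox or interrupt notification at state 1060.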
In response to the generated signal, the indexing control unit 908 may store the information in temporary storage 910. The stored information might include, for example, a current buffer pointer, event time information, event location information, and/or other information associated with the detected event.
Note that in some cases, the bit sequence being detected might be encapsulated in multiple packets and therefore may not be detectable as a continuous sequence of bytes (e.g., the information might be distributed across a boundary of discontinuous PES or TS packets). For example, “00 00” might be encapsulated at the end of a first packet while “01 00” is encapsulated at the start of the next packet that belongs to the same PID. Moreover, a few packets that belong to other PIDs may arrive in between.
As before, the state machine is initially in an idle state 1110 and monitors data packets to detect when a byte sequence is transferred from the memory controller 904 to the system memory buffer 906. When a byte sequence is detected, the machine advances to the next state 1120 and determines if the byte sequence was “00.” If a “00” is not detected, the machine returns to the idle state 1110.
If a “00” is detected while in state 1120, the machine determines whether the byte sequence occurred at the end of a packet. If it did not, the machine advances to the next state 1130, stores the context for this stream, and monitors the data packets until the next packet from the same PID context arrives. The state machine then recalls the context and restarts its pattern search to determine whether the desired pattern exists separated across the packet boundary.
The described “store/recall” concept allows bit patterns to be found across packet boundaries, virtually concatenating the few bytes at the tail of a context's preceding packet with the head of the next packet. The memory required for such storage might not exceed the length of the pattern of interest (four bytes in this example) per context. Moreover, this approach may avoid the need for complete demultiplexing of the stream into a separate buffer.
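The store/recall idea can be sketched with a small per-PID matcher state; payload extraction and the matcher details below are simplified assumptions for illustration (e.g., adaptation fields are ignored).

```python
# Sketch of "store/recall": keep a small matcher state per PID so a pattern
# split across packet payloads of the same PID is still found, without
# demultiplexing into per-stream buffers. Names are illustrative.

PATTERN = bytes([0x00, 0x00, 0x01, 0x00])

class ContextMatcher:
    def __init__(self):
        self.contexts = {}  # per-PID matcher state (a few bytes each)

    def feed(self, pid: int, payload: bytes) -> bool:
        """Feed one packet's payload; return True if the pattern completes,
        even when it started in an earlier packet of the same PID."""
        state = self.contexts.get(pid, 0)  # recall this PID's context
        found = False
        for byte in payload:
            if byte == PATTERN[state]:
                state += 1
                if state == len(PATTERN):
                    found = True
                    state = 1  # trailing 00 may start the next match
            elif state == 2 and byte == 0x00:
                pass  # still holding two zeros
            elif byte == 0x00:
                state = 1
            else:
                state = 0
        self.contexts[pid] = state  # store the context for this PID
        return found
```

Note that only an integer of matcher state is stored per PID, consistent with the observation above that the storage need not exceed the pattern length per context.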
If the byte sequence did occur at the end of a packet, the machine advances to a Store, Parse, and Restore (SPR) process 1170. A similar “store/recall” approach can be used, but at the next encapsulation level: now to find patterns that may span the boundaries of the PES packets. The storage will be associated with a different context, the PES packet context.
In particular, the context of the state machine is stored (e.g., an indication that the state machine was at state 1120 before entering the SPR process 1170). The machine will then parse data and skip PES headers that encapsulate ES information. This might be done, for example, using information in a TS header that points to a PES payload and/or information in a PES header that points to an ES payload.
When the start of the next PES packet payload for the same ES is detected, the context is restored and the machine advances to state 1130. The machine will continue detecting bit patterns (and storing context, parsing data, and restoring context between states as appropriate). If the machine reaches state 1160, the bit pattern has been detected (in a single packet or encapsulated in multiple packets), and a signal associated with the event is generated.
According to some embodiments, a hardware engine is adapted to maintain more than one context. For detecting video sequence start codes, such as a GOP start, this approach might include one context for the PES level and one context for the PID level. If other events are interpreted and used for indexing, the number of contexts may increase.
For instance, to detect Closed Caption (CC) change events and GOP start events, one needs two contexts at the PES level. If one also needs to detect encryption key changes in addition to the CC and GOP events, four contexts per stream might be required: two at the TS level (one for detecting conditional access table events, one for detecting video packets) and two at the PES level.
The method may be generalized to multiple streams and multiple event types. Different contexts might be associated with, for example, different types of events and/or more than one ES. The complexity of implementing indexing may still be lower than that of full demultiplexing followed by indexing of the output streams.
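Generalizing, the contexts might be kept in a table keyed by stream, encapsulation level, and event type. The key layout below is an assumption for illustration; the start code values follow the MPEG video syntax (a simplified fallback is used, so this is a sketch rather than a complete matcher).

```python
# Sketch of generalized context bookkeeping: one small matcher state per
# (pid, level, event type) tuple, so several event detectors can run
# concurrently over the same multiplexed stream. Key layout is illustrative.

PATTERNS = {
    "GOP_START": bytes([0x00, 0x00, 0x01, 0xB8]),  # MPEG GOP start code
    "PIC_START": bytes([0x00, 0x00, 0x01, 0x00]),  # MPEG picture start code
}

contexts = {}  # (pid, level, event_type) -> number of pattern bytes matched

def feed_byte(pid, level, event_type, byte) -> bool:
    """Advance one detector's context by one byte; True when it completes.
    Uses a simplified fallback (no handling for long runs of zeros)."""
    key = (pid, level, event_type)
    pattern = PATTERNS[event_type]
    state = contexts.get(key, 0)
    if byte == pattern[state]:
        state += 1
    else:
        state = 1 if byte == 0x00 and pattern[0] == 0x00 else 0
    if state == len(pattern):
        contexts[key] = 0
        return True
    contexts[key] = state
    return False
```

Each detector costs only a few bytes of context, which is why the total complexity can remain below that of full demultiplexing even as the number of tracked event types grows.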
By creating a supplemental event information file while multi-media information is recorded, a device may efficiently facilitate the presentation of the recorded information to a viewer (e.g., letting the viewer jump ahead in, or reverse the playback of, a program). Moreover, when a hardware state machine (or a programmable processor and associated firmware) detects events in an ES while the ES is still encapsulated in a PES and/or TS, the use of Central Processing Unit (CPU) instructions to locate an event may be reduced (and a lower-cost device may be able to efficiently locate events and facilitate a presentation of multi-media content to a viewer).
According to some embodiments, the system 1200 further includes an indexing engine 1228 to detect and store information about an event in an ES while the ES is encapsulated in a PES. The playback device 1226 may then use the stored event information to facilitate a presentation of multi-media content to a viewer.
According to some embodiments, the system 1200 further includes a remote interface 1240 to facilitate multi-media content navigation by a viewer. The remote interface 1240 might, for example, let a user control the playback device 1226 via an Infra-Red (IR) receiver or a wireless communication network (e.g., to pause or fast-forward a television program).
The following illustrates various additional embodiments. These do not constitute a definition of all possible embodiments, and those skilled in the art will understand that many other embodiments are possible. Further, although the following embodiments are briefly described for clarity, those skilled in the art will understand how to make any changes, if necessary, to the above description to accommodate these and other embodiments and applications.
Although particular types of image processes and events have been described herein, embodiments may be associated with other types of image processes and/or events. Moreover, although particular data arrangements and state machines have been described as examples, other arrangements and machines may be used.
The several embodiments described herein are solely for the purpose of illustration. Persons skilled in the art will recognize from this description that other embodiments may be practiced with modifications and alterations, limited only by the claims.