The invention relates generally to representing progress in a streaming video, and more particularly to systems and methods for navigating and visualizing the progress of selectably presentable branched video content.
A video progress bar is a component in a graphical user interface that is used to visualize both the progression of downloaded video (buffering) and the played or viewed video. Sometimes, the graphic is accompanied by a textual representation of the progress in a time- or percent-based format.
Typically, progress bars use a linear function, such that the advancement of the progress bar is directly proportional to the fraction of the total video that has been played. However, varying disk, memory, processor, bandwidth, and other factors often complicate this estimate.
Further, current progress indicators fail to address the unique issues encountered when attempting to navigate and represent the progress of content that plays as a single, seamless video but is made up of multiple paths and segments selectable in real time by a viewer. Accordingly, there is a need for a visual progress indicator and accompanying navigation controls that facilitate user interaction with selectable video content.
The invention provides techniques and supporting systems for navigating and visualizing the progression of selectably presentable video content. To facilitate enhanced usability of user-facing media players, various implementations of the invention provide a progress indicator that dynamically displays the progression status of video content paths made up of selectably presentable video content segments while being viewed by a user. Aspects of the invention also provide an interactive control module that assists the user in navigating along video content paths. The progression status is graphically represented in the form of a tree structure having linked video content segments forming the content paths, and the control module is used to navigate the video within this tree structure.
Therefore, in one aspect, a system for navigating and visualizing the progression of selectably presentable video content includes a progress indicator module for dynamically displaying the progression status of video content paths, each path made up of selectably presentable video content segments. The system further includes an interactive control module that allows a viewer to navigate along the video content paths and segments.
In some embodiments, the progression status includes a visual indicator that represents the portion of video content downloaded, the portion of video content played, and/or the portion of video content remaining for download. The progression status may include visual indicators identifying decision (branching) points. These indicators represent the points in a video when a transition may be made from one video segment to a second segment, which may be chosen from a set of multiple segment options.
In another embodiment, the progression status includes time markers that indicate when the decision points occur within the content. The time markers may be measured from the beginning of the video content path currently traversed by a viewer. The progression status may also display a time interval representing an amount of time permitted, upon reaching a decision point, for a viewer to choose which video content path to continue upon. In one embodiment, the progression status displays an amount of time until a decision point is reached.
In some embodiments, the progression status includes option indicators identifying each potential video content segment option that may be selected at a decision point. The option indicators may identify the selected segments after they are chosen at decision points. The progression status may be dynamically updated after each decision point along a traveled video content path.
In one instance, the progression status is represented as a graphical tree structure. The tree structure may contain linked video content segments forming a traveled video content path prior to a current point in time, and may further include potential segment selections branching from the content path after that point. In some embodiments, the tree structure includes linked video content segments forming all of the possible video content paths. The interactive control module may facilitate a viewer's navigation of the tree structure.
In some implementations, the progression status includes a display of statistics based on user selection of the video content segments, such as selections made by previous viewers of the video.
The progression status may include a video content length indicator. In some embodiments this length indicator is a maximum video duration based on a potentially followed video path that has the longest total length of video content segments. In other embodiments, the length indicator is a minimum video duration based on a potentially followed video content path having the shortest total length of video content segments. In yet other embodiments, the length indicator is based on the average duration of all possible video content paths.
In another embodiment, the system includes a preview module for displaying video thumbnails associated with a point in time along the video content paths. The thumbnails may represent previews of different video content segments potentially viewable at the point in time. The thumbnails may be from the currently viewed content segment, or, in other cases, from potentially viewable content segments. If the thumbnail is from a potentially viewable segment, upon selection of the video thumbnail the control module may seek to and/or display the video content segment corresponding to the thumbnail.
In another aspect, a method for navigating and visualizing the progression of selectably presentable video content includes the steps of dynamically displaying the progression status of video content paths, each path made up of selectably presentable video content segments. The method further includes the step of facilitating navigation along the video content paths.
In some embodiments, the progression status includes a visual indicator that represents the portion of video content downloaded, the portion of video content played, and/or the portion of video content remaining for download. The progression status may include visual indicators identifying decision (branching) points. These indicators represent the points in a video when a transition may be made from one video segment to a second segment, which may be chosen from a set of multiple segment options.
In another embodiment, the progression status includes time markers that indicate when the decision points occur within the content. The time markers may be measured from the beginning of the video content path currently traversed by a viewer. The progression status may also display a time interval representing an amount of time permitted, upon reaching a decision point, for a viewer to choose which video content path to continue upon. In one embodiment, the progression status displays an amount of time until a decision point is reached.
In some embodiments, the progression status includes option indicators identifying each potential video content segment option that may be selected at a decision point. The option indicators may identify the selected segments after they are chosen at decision points. The progression status may be dynamically updated after each decision point along a traveled video content path.
In one instance, the progression status is represented as a graphical tree structure. The tree structure may contain linked video content segments forming a traveled video content path prior to a current point in time, and may further include potential segment selections branching from the content path after that point. In some embodiments, the tree structure includes linked video content segments forming all of the possible video content paths. The method may further include facilitating a viewer's navigation of the tree structure.
In some implementations, the progression status includes a display of statistics based on user selection of the video content segments, such as selections made by previous viewers of the video.
The progression status may include a video content length indicator. In some embodiments this length indicator is a maximum video duration based on a potentially followed video path that has the longest total length of video content segments. In other embodiments, the length indicator is a minimum video duration based on a potentially followed video content path having the shortest total length of video content segments. In yet other embodiments, the length indicator is based on the average duration of all possible video content paths.
In another embodiment, the method further includes the step of displaying video thumbnails associated with a point in time along the video content paths. The thumbnails may represent previews of different video content segments potentially viewable at the point in time. The thumbnails may be from the currently viewed content segment, or, in other cases, from potentially viewable content segments. If the thumbnail is from a potentially viewable segment, upon selection of the video thumbnail the video may seek to and/or display the video content segment corresponding to the thumbnail.
Other aspects and advantages of the invention will become apparent from the following drawings, detailed description, and claims, all of which illustrate the principles of the invention, by way of example only.
A more complete appreciation of the invention and many attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings. In the drawings, like reference characters generally refer to the same parts throughout the different views. Further, the drawings are not necessarily to scale, with emphasis instead generally being placed upon illustrating the principles of the invention.
Described herein are various embodiments of a dynamic progress indicator and interactive controls that provide for the visualization and navigation of selectably presentable video content. The progress indicator may be used in conjunction with seamlessly assembled and presented streaming video content, such as that described in U.S. patent application Ser. No. 13/033,916, entitled “System and Method for Seamless Multimedia Assembly,” and filed Feb. 24, 2011, the entirety of which is hereby incorporated by reference. Selectably presentable video content may include, for example, one or more separate video content paths and/or segments that are seamlessly presented to a viewer as a continuous video without noticeable interruptions in video and audio playback between segments. In some instances, the viewer is permitted to make choices at one or more various decision points interspersed throughout the video content, resulting in the corresponding video segment(s) and/or path(s) associated with the choices to be presented to the viewer in the selected order.
Because of the distinctive characteristics of selectably presentable video content, such as varying numbers of video segments, varying segment lengths, different audio tracks, and the like, these videos benefit from a special dynamic indicator and controller that can visually represent the progression of video play and buffering/download within multiple-path structures and enable viewer-directed navigation among the various paths of the video. The indicator may show progress information in real-time, and update frequently in order to present status correctly. Navigation may also be performed in real-time, and may be functionally integrated into the progress indicator and/or performed using separate buttons or other interactive controls.
The progress indicator may take several forms, with each representing the progress of branching video content. Although the indicators are represented and referred to herein as horizontal bars arranged in lines or trees, it is to be appreciated that this is an exemplary embodiment of the invention, and the progress indicators may take any suitable form, shape or orientation while still accomplishing the objects of the invention. Such forms may include, but are not limited to, circles, ovals, arcs, spirals, dials, gauges, and other forms suitable for representing progress and other information associated with media content.
Further, although the progress indicators are described herein with respect to video playback, the invention is applicable to streaming and non-streaming media, including audio, animation, video games, interactive media, and other forms of content usable in conjunction with the present systems and methods. Streaming media may include, for example, multimedia content that is continuously presented to a viewer while it is received from a content delivery source, such as a remote video server on the Web.
In some embodiments, the progress indicator is visually represented as a tree. This may include a tree structure form in which there are one or more starting video content segments, and at the end of each segment (and/or upon reaching a decision point), the segment branches out to one or more selectable content segments. Accordingly, as illustrated, a viewer may follow a left-to-right path containing various connected segments. One skilled in the art will appreciate that the tree structures described and depicted herein are merely exemplary embodiments, and any suitable progress status structure may be used to represent the branching media content. Such structures may include, but are not limited to, graphs, lists, flowcharts, hierarchical structures, state diagrams, and/or any combination of the foregoing.
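By way of illustration only, a branching structure of this kind can be modeled as a tree of segment nodes, each carrying a duration and the set of segments selectable at its decision point. The sketch below is a hypothetical data model; the names Segment, durationSec, and children are illustrative and not part of the described system:

```typescript
// Hypothetical model of a branching video: each node is one content segment,
// and its children are the segments selectable at the decision point reached
// at (or near) the end of that segment.
interface Segment {
  id: string;
  durationSec: number;   // length of this segment's content
  label?: string;        // option text shown to the viewer (e.g., "Tall", "Blonde")
  children: Segment[];   // selectable options; an empty array ends the path
}

// Example tree: one starting segment that branches twice.
const root: Segment = {
  id: "intro", durationSec: 20, children: [
    { id: "tall", durationSec: 30, label: "Tall", children: [
      { id: "blonde", durationSec: 25, label: "Blonde", children: [] },
      { id: "brunette", durationSec: 28, label: "Brunette", children: [] },
    ]},
    { id: "short", durationSec: 35, label: "Short", children: [] },
  ],
};

// Enumerate every root-to-leaf path, i.e., every possible version of the video.
function allPaths(node: Segment, prefix: Segment[] = []): Segment[][] {
  const path = [...prefix, node];
  if (node.children.length === 0) return [path];
  return node.children.flatMap((child) => allPaths(child, path));
}

console.log(allPaths(root).map((p) => p.map((s) => s.id).join(" -> ")));
// ["intro -> tall -> blonde", "intro -> tall -> brunette", "intro -> short"]
```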
The full tree form may include a visual representation of all video content paths.
Full tree progress bar 120 is represented in railroad form; that is, one or more possible paths are represented as single, contiguous tracks made up of their respective individual video content segments. At a decision point, a viewer may select an option that results in the video proceeding down a different path. For example, and as illustrated, at decision point 121, the viewer has made a choice (or the video player has automatically made a selection) resulting in the video that was previously playing on path 124 continuing instead on path 126.
In an additional example of the full tree form, progress bar 130 includes a number of content segments of varying length, forming numerous paths that can be taken throughout the viewing/downloading of a video. After the first decision point 131, the following video content segments each have different lengths. Thus, subsequent decision points occur at different times depending on which path is traversed. Video content segments may be shared among paths; for example, segment 135 can be reached after a decision point in both segments 132 and 134.
In the partial tree form 140, rather than displaying all possible video paths and/or segments, only a subset of the paths/segments is shown. For example, partial tree progress bar 140 shows only the segments of the path that have been played up to the current point in time 145, as well as the available choices 142, 144 after that time 145. The visual display may be limited to the available choices branching from the current video segment, or it may include all potential future segments. Past paths not followed and/or future segments not yet selected may be permanently hidden, or may be shown to a viewer by, for example, setting a preference or operating a GUI control.
In some embodiments, the progress bar structure is loaded prior to the video content loading, while in other embodiments, the progress bar and video content load in parallel. The progress indicator may be fully presented from the beginning of play, with chosen segments highlighted after each decision point. Alternatively, the bar may be shown only up to a certain point (e.g., up to the amount of video loaded (buffered), up to the amount of video played, or up to the end of the current video segment). The progress bar may dynamically add or remove video segments and/or paths from its display while the video is playing. In some embodiments, intelligent buffering of the video, audio, and/or other media content is performed as described in U.S. patent application Ser. No. 13/437,164, entitled “Systems and Methods for Loading More Than One Video Content at a Time,” and filed Apr. 2, 2012, the entirety of which is hereby incorporated by reference.
Traversal of the video content may be performed by selecting among options that appear on and/or around the video while the video is playing. The video segment that is played after the currently playing segment is determined based on the option selected. Each option may result in a different video segment being played. The transition to the next video segment may be done immediately upon selection, at the end of the current segment, or at some other predefined point. Notably, the transition between content segments may be seamless. In other words, the video may continue playing regardless of whether a segment selection is made, and no noticeable gaps appear in audio or video playback between any connecting segments. In some instances, the video continues on to a following segment after a certain amount of time if no option is chosen, or may continue playing in a loop. If a video segment is missing, corrupted, or has otherwise failed to load, the progress bar may provide an indicator that the segment is missing and enable a viewer to jump to the end of (or past) the missing segment at any time.
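As a minimal sketch of this selection logic (hypothetical names, reusing a reduced version of the illustrative Segment model above), the next segment can be resolved from the viewer's choice, with a fallback to a default option so that playback continues seamlessly when no selection is made:

```typescript
interface Segment { id: string; durationSec: number; children: Segment[]; }

// Resolve the segment to play after the current one: the viewer's selection
// if one was made during the decision window, otherwise a default option
// (here, the first child) so playback never stalls at the decision point.
function nextSegment(current: Segment, selectedId: string | null): Segment | null {
  if (current.children.length === 0) return null;               // end of this path
  const chosen = current.children.find((c) => c.id === selectedId);
  return chosen ?? current.children[0];
}
```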
The progress indicator may include various visualized information items in graphical and/or textual form. This information may include, for example, the progression of downloaded video content, the progression of played video content, and/or a percentage value representing the foregoing progressions. When displayed in a tree form, the progress indicator may show this information for each branch and/or content segment. In some embodiments, the progress indicator includes markers delineating branching points; i.e., points (or ranges) in time at which the viewer can select to transition to a new video content segment. These decision point markers may be placed at the ends of content segments (i.e., at the point of transition to the next content segment), or at any point (or range in time) in a video segment where a viewer is permitted to select a video path to traverse.
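For the case where markers are placed at the ends of segments, the marker positions on a linear bar reduce to cumulative segment durations along the traversed path. The following is a hypothetical sketch; the names are illustrative only:

```typescript
interface TraversedSegment { id: string; durationSec: number; }

// Marker times measured from the beginning of the traversed path, one marker
// at the end of each segment (i.e., at each transition to the next segment).
function decisionPointTimes(path: TraversedSegment[]): number[] {
  const times: number[] = [];
  let elapsed = 0;
  for (const seg of path) {
    elapsed += seg.durationSec;
    times.push(elapsed);
  }
  return times;
}

// Segments of 20 s, 30 s, and 25 s yield markers at 20 s, 50 s, and 75 s.
console.log(decisionPointTimes([
  { id: "intro", durationSec: 20 },
  { id: "tall", durationSec: 30 },
  { id: "blonde", durationSec: 25 },
]));
```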
In some embodiments, the progress indicator includes time-based information for presentation to a viewer.
In one example, the video associated with the progress bars 200, 205 is a music video in which the viewer selects options at the decision points 210a, 210b to determine what content will be played. For the first segment 215a, the viewer may select either a tall or short performer to sing the first verse of the song. As shown, the “Tall” option is indicated as having been selected; thus, the video associated with the tall performer was played during segment 215a. Likewise, at decision point 210a, the viewer was provided with options to have either a blonde or brunette performer sing the second verse. Segment 215b of progress bar 205 indicates that the “Blonde” option was selected, and the video playing at current time 202 is therefore associated with that selection. Decision point 210b has not yet been reached, thus no selection is indicated as having been made for segment 215c.
In some embodiments, the progress indicator includes various statistics associated with the video content and dynamically displayed and updated in real-time. The statistics associated with the video content may include those described in U.S. patent application Ser. No. 13/034,645, entitled “System and Method for Data Mining within Interactive Multimedia,” and filed Feb. 24, 2011, the entirety of which is hereby incorporated by reference. The statistical information may be textually or graphically presented, e.g., as icons, in different colors, pictures, and/or videos. The information may appear as interactive elements on the video itself, or as part of the progress indicator.
The statistics may be associated with viewer selections based on previous plays of a video by the viewer, the viewer's friends, social networking connections, and/or all other users. For example, the progress bar may display how popular a particular option is and/or how many times a particular option was chosen.
Stylized markers (e.g., stars, text, or other graphical icons) may be superimposed on or otherwise associated with the progress bar when a viewer selects the most popular option at a decision point. Previously followed video content paths may be shown in a different color or otherwise highlighted upon subsequent plays. For example, the currently traversed video path may be shown in blue, while previously followed paths are colored yellow. The statistics may also be based on the structure of the video, for example: the number of video paths available to a viewer (which may be dynamically updated after each decision point), the total number of possible paths, and/or the number of paths viewed out of the total number of possible paths.
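Two of the structural statistics mentioned above, the total number of possible paths and the number of paths still reachable from the current segment, can be computed directly from the segment tree. The sketch below is illustrative only and assumes the hypothetical Segment model from earlier:

```typescript
interface Segment { id: string; children: Segment[]; }

// A path corresponds to one root-to-leaf traversal, so the path count of a
// node is the sum of the path counts of its children (or 1 at a leaf).
function countPaths(node: Segment): number {
  if (node.children.length === 0) return 1;
  return node.children.reduce((sum, child) => sum + countPaths(child), 0);
}

// countPaths(root) gives the overall number of possible paths, while
// countPaths(currentSegment) gives the number still available to the viewer;
// the latter can be recomputed after each decision point.
```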
As described above, the progress bar may include text or graphics indicating the options available for selection at decision points and/or the options that have previously been selected. These options may be displayed differently depending on the progress indicator structure.
In some embodiments, the options are hidden and do not initially appear on the progress indicator. In these instances, a hidden option may be displayed on the indicator only after it is selected or discovered by a viewer while playing the video. In other embodiments, certain options are not shown and/or made available to all or a subset of viewers. For example, in some circumstances, the video may be played in a “passive” mode, meaning that the viewer cannot make some or all of the choices that would be available during a normal play-through of the video. Passive mode may be used to replicate a play-through using previously made or predefined selections, and may, for example, allow a viewer to share his or her version of a video (i.e., the traversed path) with other viewers. In some instances, options may be hidden from viewers based on the location they are viewing the video from. For example, some video segments may contain copyrighted material not licensed for display in certain countries; thus, any options leading to those segments may be hidden and/or replaced by other options.
Thumbnail previews may be shown for points before and/or points after the current point in time of the video. For points prior to the current time, the thumbnails displayed may include all possible paths at that time, or may only include the path traversed by the viewer. For points after the current time, the thumbnails displayed may include all possible paths at that time regardless of previous choices made, or may only include the paths available to the viewer based on selections previously made in viewing the video. The thumbnails may be presented in a grid or any suitable format. As further described below, the viewer may select one of the thumbnails to navigate to that point in the video.
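One hypothetical way to gather thumbnail candidates for a given time offset is to walk the segment tree and collect every segment that could be playing at that offset on some path; segments on paths no longer reachable can then be filtered out if only the viewer's available choices should be previewed. Names and types below are illustrative only:

```typescript
interface Segment { id: string; durationSec: number; children: Segment[]; }

// Collect all segments that could be playing at time t (measured from the
// start of the video) on any path through the tree. Each hit could supply
// one preview thumbnail.
function segmentsAtTime(node: Segment, t: number, pathStart = 0): Segment[] {
  const end = pathStart + node.durationSec;
  const hits: Segment[] = [];
  if (t >= pathStart && t < end) hits.push(node);
  if (t >= end) {
    for (const child of node.children) {
      hits.push(...segmentsAtTime(child, t, end));
    }
  }
  return hits;
}
```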
The video may be controlled by a set of navigation and playback controls. In one embodiment, there are two types of controllers: constant and content. Constant controllers do not change form significantly (or at all) during most or all of the video playback. These controllers may be disposed outside the video, e.g., adjacent to the progress bar, and may include standard button controls such as play, pause, seek, enable/disable subtitles, and the like. Some constant controls may affect audio, video, and/or an interactive layer that sits on top of the video. For example, if the user pauses a video, the audio and interactive layer may be paused as well. Content controllers may appear on the video during playback as part of the interactive layer; these controls may be associated with the video content and may control and/or interact with it. Content controllers include, for example, the selectable options that appear prior to branching points that allow a viewer to decide how the video will continue. The interactive layer may also include links, pop-ups, advertisements, and/or other content that may permit user interaction and/or direct the viewer to other websites, videos, etc. upon selection.
Constant controls on or near the progress indicator may include a play button to start or continue playing the video, audio, and/or interactive layer, and a pause button to pause playback. While a video is paused and/or stopped, buffering of the content may continue, and in some instances, the interactive layer may continue to function so that a viewer can select a content option at a decision point while playback is paused and have the resulting selection affect the video after resuming play. A stop button may be included to allow a viewer to return to the beginning of the video. In some instances, a viewer's previously selected options persist after stopping the video. Volume control and mute toggling buttons may also be included as constant controls.
The playback controls may further include seek buttons for navigating the various paths and segments of the video. Fast-forward and/or fast-backward buttons may adjust the viewer's position in the video accordingly, or may seek in fixed increments, such as three-second, five-second, or ten-second jumps. A "snap-seek" function may be provided that allows the viewer to seek to predefined points in the video, such as immediately prior to decision points. Upon seeking backwards, the viewer's previous selections may be saved or reset. Upon seeking forward past a decision point, the video player may automatically select a decision (e.g., default, last used, most common), or may use a previous selection made by the viewer, if any. If the viewer attempts to seek to a point on the video timeline that has yet to be buffered, a "loading" indicator may be displayed while the video downloads.
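A "snap-seek" of the kind described above can be reduced to picking the nearest decision-point time on the appropriate side of the current playback position. The helper below is a hypothetical sketch; the decision-point times could be derived as in the earlier marker example:

```typescript
// Return the decision-point time to snap to, or null if there is none in the
// requested direction (e.g., seeking backwards from before the first branch).
function snapSeek(decisionTimes: number[], currentTime: number, forward: boolean): number | null {
  const candidates = forward
    ? decisionTimes.filter((t) => t > currentTime)
    : decisionTimes.filter((t) => t < currentTime);
  if (candidates.length === 0) return null;
  return forward ? Math.min(...candidates) : Math.max(...candidates);
}
```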
The progress bar itself may be used for navigating within a progress status tree structure, and may include various interactive controls for doing so. For example, a viewer may jump to a specific point on the tree by selecting and/or clicking on that point. Markers on the progress bar may also be selected as specific jump points; for example, the progress bar may display predefined time stamps and/or segment identifiers that may be selected, whereby a viewer is directed to the identified time or segment. These markers may be placed, for example, shortly before branching points to allow the viewer to select a path to follow, shortly after branching points, or at any other point on a video content path. Again, a "loading" indicator may be displayed if the viewer attempts to view video that has not yet been buffered.
As described above, navigation controls may also include the ability to seek to a particular point in time or segment in the video by selecting a thumbnail from a group of one or more thumbnails that appears when hovering over a point on the progress bar. In selecting a particular segment to seek to, it may be assumed that the option(s) selected at decision point(s) prior to that segment (if any) are those that would need to have been made to reach the segment corresponding to the selected thumbnail. These assumed selections may be tracked for statistical purposes as previously described.
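The assumed selections can be reconstructed by finding the path from the current (or starting) segment to the segment the thumbnail represents; each segment entered along that path corresponds to one option treated as having been chosen. A hypothetical sketch, again using the illustrative Segment model:

```typescript
interface Segment { id: string; children: Segment[]; }

// Return the ordered list of segment ids that must be entered to reach the
// target from the given node, or null if the target is not reachable.
function pathTo(node: Segment, targetId: string, prefix: string[] = []): string[] | null {
  if (node.id === targetId) return prefix;
  for (const child of node.children) {
    const found = pathTo(child, targetId, [...prefix, child.id]);
    if (found) return found;
  }
  return null;
}

// e.g., pathTo(currentSegment, "blonde") -> ["tall", "blonde"]: the options
// assumed to have been selected, which may also be recorded for statistics.
```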
Upon seeking, the video, audio, and interactive layer remain synchronized, such that the selected segment presented to the viewer includes the same audio, video, and interactive components that would have been presented had the viewer reached the same point without seeking. In other cases, different video, audio, and/or interactive components may be presented to the viewer, such as modified or customized options at decision points.
When the progress indicator is structured in a line form, the total amount of video remaining may be uncertain, since the path ultimately traversed depends on the selections made at each decision point. The following methods may be used to estimate the total amount of time remaining in a video, and to adjust the progress display accordingly: (1) assume that default selections are made for all remaining decision points; (2) use the decision point selections from the previous playback; (3) use decision point selections previously made by the current viewer and/or other viewers; (4) use the most common selection at each remaining decision point; (5) assume the longest-duration path is traversed (i.e., the maximum total duration over all possible combinations of remaining video segments); (6) assume the shortest-duration path is traversed (i.e., the minimum total duration over all possible combinations of remaining video segments); (7) use the average duration of the possible remaining paths; and/or (8) use the average duration of all possible remaining segments. In some versions, the viewer may select which estimation methods are used. The progress status may display more than one calculated value; for example, the minimum and maximum times left to play may be shown together. In instances where the total video length is desired (not just the remaining length), the above calculations may include the durations of the already-traversed segments, or may include the durations of the maximum, minimum, and/or average segments or paths, regardless of whether or not they were traversed.
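Strategies (5) through (7) above amount to taking the maximum, minimum, or mean of the per-path remaining durations from the current segment. The following is a hypothetical sketch of that computation, using the illustrative Segment model; the names are not part of the described system:

```typescript
interface Segment { id: string; durationSec: number; children: Segment[]; }

// Remaining duration after the current segment, one entry per reachable path.
function remainingDurations(node: Segment): number[] {
  if (node.children.length === 0) return [0];
  return node.children.flatMap((child) =>
    remainingDurations(child).map((d) => d + child.durationSec)
  );
}

// Combine with the time left in the currently playing segment to obtain the
// minimum, maximum, and average estimates of total time remaining.
function estimateRemaining(current: Segment, timeLeftInCurrentSec: number) {
  const tails = remainingDurations(current);
  const min = Math.min(...tails) + timeLeftInCurrentSec;
  const max = Math.max(...tails) + timeLeftInCurrentSec;
  const avg = tails.reduce((a, b) => a + b, 0) / tails.length + timeLeftInCurrentSec;
  return { min, max, avg };
}
```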
In some instances, the video allows for an option at a decision point to return the viewer to a segment at an earlier point in the same or a different video path, essentially allowing the viewer to enter into a video loop. In response to a looping action, the progress indicator may return the playback cursor to the desired segment on the line or tree structure, or, alternatively, the cursor may remain at the current point and the length of play may be recalculated appropriately (e.g., the video length may be extended by the length of the looped part, or an alternate method of calculation as described above may be used). To accommodate the extended playback, the progress bar may extend its length or resize its existing representation of segments to fit the looped portion.
In the case of "unfolding" videos, the configuration of the progress bar, as well as its associated time markers and other information, may be updated as the video progresses.
In some embodiments, the progress indicator may support video detours; that is, interrupting the playback of the current video to view another video, and returning to the previous point in playback after the detour video completes or is otherwise terminated.
In various embodiments, the behavior of the progress indicator may dynamically change in order to influence the perception of the viewer. For example, the graphical playback progress of the video may appear to speed up for several seconds leading up to a decision point. The playback progression status may pause at various points during the video without also pausing the corresponding video and audio content. Other instances of progress acceleration and/or deceleration may be utilized for effect. In some instances, graphical effects such as pulses, ripples, coloring, and the like are applied to the progress indicator.
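The perceived speed-up before a decision point can be achieved by mapping the real playback time to a displayed position that advances faster than real time inside a short window and is clamped at the decision point, leaving the underlying video and audio untouched. This is a hypothetical sketch of one such mapping:

```typescript
// Displayed progress position: equal to the real playback time outside the
// acceleration window, advancing at `speedFactor` times real speed inside it,
// and never shown past the decision point itself.
function displayedProgress(
  currentTime: number,
  decisionTime: number,
  windowSec = 5,
  speedFactor = 1.5
): number {
  const windowStart = decisionTime - windowSec;
  if (currentTime <= windowStart || currentTime >= decisionTime) return currentTime;
  const accelerated = windowStart + (currentTime - windowStart) * speedFactor;
  return Math.min(accelerated, decisionTime);
}
```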
In some embodiments, an interactive content authoring tool is used to create the selectably presentable video content described herein. The progress bar may be included as part of the authoring tool to assist in the formation of this content.
One skilled in the art will recognize the various forms in which the systems and methods described herein may be implemented. For example, the invention may include a progress indicator module for displaying progress, status, and statistical information for video content, and an interactive control module for navigating the video content. These functions may be implemented in any appropriate hardware or software. If implemented as software, the invention may execute on a system capable of running a commercial operating system such as the Microsoft Windows® operating systems, the Apple OS X® operating systems, the Apple iOS® platform, the Google Android™ platform, the Linux® operating system and other variants of UNIX® operating systems, and the like.
The software may be implemented on such hardware as a smart or dumb terminal, network computer, personal digital assistant, wireless device, smartphone, game machine, music player, mobile telephone, laptop, palmtop, wireless telephone, information appliance, workstation, minicomputer, mainframe computer, or other computing device, that is operated as a general purpose computer or a special purpose hardware device that can execute the herein described functionality. The software may be implemented on a general purpose computing device in the form of a computer including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit.
The described systems may include a plurality of software processing modules stored in a memory and executed on a processor in the manner described herein. The program modules may be in the form of one or more suitable programming languages, which are converted to machine language or object code to allow the processor or processors to read the instructions. The software may be in the form of a standalone application, implemented in a multi-platform language/framework such as Java, .Net, Objective C, or in native processor executable code. Illustratively, a programming language used may include assembly language, Ada, APL, Basic, C, C++, C#, Objective C, COBOL, dBase, Forth, FORTRAN, Java, Modula-2, Pascal, Prolog, REXX, and/or JavaScript, for example.
Method steps of the techniques described herein can be performed by one or more programmable processors executing a computer program to perform functions of the invention by operating on input data and generating output. Method steps can also be performed by, and apparatus of the invention can be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit). Modules can refer to portions of the computer program and/or the processor/special circuitry that implements that functionality.
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in special purpose logic circuitry.
The techniques described herein can be implemented in a distributed computing system that includes a back-end component, e.g., as a data server, and/or a middleware component, e.g., an application server, and/or a front-end component, e.g., a client computer having a graphical user interface and/or a Web browser through which a user can interact with an implementation of the invention, or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet, and include both wired and wireless networks.
The system can include client and server computers. A client and server are generally remote from each other and typically interact over a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
In various embodiments, the client computers include a web browser, client software, or both. The web browser allows the client to request a web page or other downloadable program, applet, or document (e.g., from the server(s)) with a web page request. One example of a web page is a data file that includes computer executable or interpretable information, graphics, sound, text, and/or video, that can be displayed, executed, played, processed, streamed, and/or stored and that can contain links, or pointers, to other web pages. In one embodiment, a user of the client manually requests a web page from the server. Alternatively, the client automatically makes requests with the web browser. Examples of commercially available web browser software are Microsoft® Internet Explorer®, Mozilla® Firefox®, and Apple® Safari®.
In some embodiments, the client computers include client software. The client software provides functionality to the client that provides for the implementation and execution of the features described herein. The client software may be implemented in various forms, for example, it may be in the form of a web page, widget, and/or Java, JavaScript, .Net, Silverlight, Flash, and/or other applet or plug-in that is downloaded to the client and runs in conjunction with the web browser. The client software and the web browser may be part of a single client-server interface; for example, the client software can be implemented as a “plug-in” to the web browser or to another framework or operating system. Any other suitable client software architecture, including but not limited to widget frameworks and applet technology may also be employed with the client software. The client software may also be in the form of a standalone application, implemented in a multi-platform language/framework as described above.
A communications network may connect the clients with the servers. The communication may take place via any media such as standard telephone lines, LAN or WAN links (e.g., T1, T3, 56 kb, X.25), broadband connections (ISDN, Frame Relay, ATM), wireless links (802.11, Bluetooth, GSM, CDMA, etc.), and so on. The network may carry TCP/IP protocol communications, and HTTP/HTTPS requests made by a web browser, and the connection between the clients and servers can be communicated over such TCP/IP networks. The type of network is not a limitation, however, and any suitable network may be used.
In a client-server environment, the servers may be implemented on one or more server class computers that have sufficient memory, data storage, and processing power and that run a server class operating system (e.g., Oracle® Solaris®, GNU/Linux®, and the Microsoft® Windows® family of operating systems). Other types of system hardware and software than that described herein may also be used, depending on the capacity of the device and the number of users and the size of the user base.
Although internal components of the computer are not shown, those of ordinary skill in the art will appreciate that such components and the interconnections are well known. Accordingly, additional details concerning the internal construction of the computers need not be disclosed in connection with the present invention.
This application is a continuation of U.S. patent application Ser. No. 13/622,795, filed on Sep. 19, 2012, and entitled “Progress Bar for Branched Videos,” the entirety of which is incorporated by reference herein.