Video player with seamless contraction

Information

  • Patent Grant
  • Patent Number
    8,370,746
  • Date Filed
    Tuesday, October 30, 2007
  • Date Issued
    Tuesday, February 5, 2013
Abstract
A relativity controller is a scroll bar/window combination that provides a way to see data in relation to both the context of its wholeness and the salience of its contents. To accomplish this, the linear density or other appearance of the scroll bar (acting as a ruler or scale) varies with the density of the document salience (as indicated by different kinds of annotations or marks). It also provides a way to zoom between perspectives. This is usable on many different data types, including sound, video, graphics, calendars and word processors.
Description

This invention relates to a computer system, and in particular to computer tools to improve user perspectives and enhance navigation or browsing of information sources stored in or available via the computer.


BACKGROUND OF INVENTION

As computer accessing of large quantities of information increases, the ability of users to navigate large information spaces and to maintain visualization or personal perspectives thereof decreases [1] (bracketed numbers reference publications identified in Appendix A).


The need for this type of control has been expressed most recently by Furnas [2], Mills [3], Degen [4], and Chimera [5].


Furnas' solution to the problem of understanding the limited information available in a window of large information structures is to provide in the window the detailed region to be considered in the context of important preceding or succeeding parts of the large structure. For example, to edit lines in the middle of a program, the window would also display, say, declarations at the beginning of the program. No magnification of desired information or shrinkage of undesired information is employed; rather, the desired program information is normally displayed, and many parts of the program are omitted from the display.


Mills et al. addressed the issue of giving users access to video data by magnifying time through successive hierarchical extraction of increasingly detailed segments. Each expanded segment view was displayed in a separate window of the display. And each segment view, as well as the total video view, including the time lines associated therewith, was linearly arranged from a temporal standpoint.


Degen et al. moved marks on audio tape to a digitized counterpart document scroll bar, and let the user change the visual scaling of time within a single window, as well as the speed of playback. But, again, the visual representations, whether of the original size or of the zoomed expanded size, had a linear temporal structure.


Chimera, on the other hand, maintained a full display within the window but was unable to provide a zooming feature or expanded segment view of a text listing. Instead, Chimera used scroll bars that, independent of the original data's representation, indicate relative values of list attributes by respectively scaling proportions of list item indicators, according to those attributes, in the scroll bars.


Furnas shows in a single window multiple fisheye views of document segments. But Furnas does not disclose how a user can select which segments to display, the means to magnify certain segments, or the means to control the degree of magnification, nor does Furnas provide a scroll bar or its equivalent as a convenient interface for the user to manipulate the display.


SUMMARY OF INVENTION

An object of the invention is a computer system providing improved means to allow users to extract important segments of computer-displayed information in the form of video, sound, graphics or text while maintaining a general view of the information.


Another object of the invention is a computerized system and method to enable users better to navigate or visualize large information spaces.


In accordance with one aspect of the present invention, means are provided to enable a user to visibly mark points or segments of displayed information, which will enable the user to quickly navigate to the marked displays.


In another aspect of the invention, a scroll bar is displayed alongside the information display, and the visible mark or marks appear on the scroll bar at locations corresponding to the desired information.


In accordance with a further aspect of the present invention, a computerized system provides the user with means to shrink less important or less significant portions of the information displayed, with the result of magnifying the portions that the user deems significant. In accordance with this aspect, the invention can be viewed as a user-friendly relativity controller tool that enables users to specify what is important to them, and modify the portion of their perceptual space that that information takes up, in a fisheye variant.


In accordance with another aspect of the invention, the resultant information can still occupy the same window where originally displayed, but with certain segments shrunk and other segments in comparison standing out or becoming more prominent.


In accordance with still other aspects of the invention, the relativity controller of the invention is implemented by simply pointing to the screen and actuating a control device. In a preferred embodiment, a mouse button is pressed to mark the beginning and end of segments of the information to be marked. A further feature is that multiple segments can be marked in this manner. Thus, the relativity controller of the invention not only allows users to mark the scope of one or more salient segments, but also will cause the display to simultaneously shrink the non-marked portions and in effect zoom into the multiple-marked segments in a single step. The result is a non-linear display of the available information. As a further feature, simultaneously with selective zooming of the information, the display of the scroll bar is correspondingly modified to show in the context of the total information the marked and non-marked portions of the displayed information.
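
To make the relationship between marking and shrinking concrete, the following Python sketch illustrates one possible bookkeeping of marked segments and the resulting contracted playtime. It is only an illustration under stated assumptions: the names Mark and contracted_duration and the shrink factor are hypothetical and do not come from the patent, and the marks are assumed not to overlap.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Mark:
    start: float  # beginning of a salient (marked) segment, in seconds
    end: float    # end of the segment, in seconds


def contracted_duration(total: float, marks: List[Mark], shrink: float) -> float:
    """Total playtime after unmarked spans are compressed by `shrink`
    (e.g. 0.5 allots unmarked material half its normal time).
    Assumes the marks do not overlap one another."""
    marked = sum(m.end - m.start for m in marks)
    unmarked = total - marked
    return marked + unmarked * shrink


# Example: a 60 s recording with one 10 s salient segment; unmarked
# material is shrunk to half its normal time allocation.
print(contracted_duration(60.0, [Mark(20.0, 30.0)], 0.5))  # -> 35.0
```

In the audio example described later, a shrink factor of 0.5 would correspond to fast-forwarding unmarked material at twice normal speed.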


The major benefit is to allow users to quickly navigate through a large information space and to control the salience of the displayed information in the context of the full display while conserving display area, sometimes called desktop real estate. Moreover, maintaining a single window for the data and giving users the ability to visually navigate across the whole data via the scroll bar, together with the ability to select the salient segments as well as the level of zoom, all in a single step, greatly enhances the ability of the user to cope intelligently and rapidly with large information structures containing large numbers of objects.


The above and further objects, details and advantages of the present invention will become apparent from the following detailed description of preferred embodiments thereof, when read in conjunction with the accompanying drawings.


SUMMARY OF DRAWINGS


FIG. 1 is a block diagram of a typical computer system;



FIGS. 2-6 schematically illustrate one form of the invention for use with audio representations;



FIG. 7 is an enlarged view of a scroll bar in accordance with another form of the invention;



FIG. 8 is a combined screen display and scroll bar of audio information in accordance with the invention;



FIGS. 9-11 illustrate various screen displays of text information produced by one form of the computerized system of the invention;



FIG. 12 shows various screen displays of video information produced by another form of the computerized system of the invention;



FIG. 13 illustrates, schematically, various cursor shapes produced by a system of the invention;



FIG. 14 shows a screen menu that can be used with the system of the invention;



FIGS. 15-33 are flow charts for implementing one form of computerized system of the invention.







DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS


FIG. 1 shows a typical computerized system 10, comprising a console 11 containing a CPU 12, memory 13 in the form of RAM, ROM and disk, and I/O circuitry 14 connected to a monitor 15 having a display screen 16, and control devices in the form of keyboard 17 and a mouse 18. The mouse 18 typically contains button switches 19 operated by a user of the system. A cursor or pointer 20 is typically displayed on the screen 16 and its position is controllable via the mouse 18 or the keyboard 17 as is well known. A typical window 21 is displayed on the screen 16, with a title bar 22 in the window.


The following terms used herein will have the following meanings.


“Object” means any representation of information or of a data structure that can be displayed on the monitor screen, and includes one or more text characters, one or more sound representations such as a digital sample, a video representation such as a video frame, and in general any graphics element.


“Control device” means a device manipulated by users to move cursors around a screen, and includes a mouse and keyboard.


“Pointing” to an object on screen means actuating the control device to move the cursor so that it is over or adjacent the object. When the cursor is a pointer such as an arrow, it means moving the arrow tip close to the object.


“Clicking” on an object means to press and quickly release a switch on the control device, such as a button on a mouse, when the cursor is pointing to the object.


“Dragging” means to click on the object, and while holding the switch activated, to manipulate the control device to move the object to a new screen location, and then to release the switch to fix the new screen location of the object.


“Double-clicking” an object on screen means pointing to the object and clicking twice rapidly, and is often used for special control purposes.


“Shrinking” the display of objects means reducing the time or space normally allocated to display the objects, and includes shrinking them to the point where they essentially disappear from the display.


A “scroll bar” is a common control device displayed alongside a window, having, typically, at opposite ends small arrowed scroll boxes or buttons that when clicked on by the user cause the window contents to scroll.


A “button” or “box” on a scroll bar is a representation of a control device for use with a mouse.


A “thumb” is a button or box on the scroll bar, between its ends, which moves and whose location on the scroll bar corresponds to the location in the whole information of the current view.



FIGS. 2-14 illustrate several ways in which the invention may be used. While the invention will be described in connection with a Macintosh personal computer (PC), which employs a graphics user interface (GUI), it is also usable with other PCs or workstations using other operating systems with GUIs, such as UNIX with X-windows, and DOS with Windows.


The first example concerns a sound representation. As illustrated in FIG. 2 on top, a user can sample audio 24 into a computer 25 as described in the Macintosh user's manual, or alternatively record audio 24 onto tape 26 and then sample it into the computer 25. The computer processes the sound data 28 into a visual representation 29, based, for example, on Cochlear models, principal component analysis, or Fast Fourier Transforms, as shown in FIG. 3. The result is displayed on the monitor screen 16 and can also be heard 30 by the user. The typical monitor screen contains a scroll bar 32 for scrolling through the sound representation using a left arrow button 33 to scroll to the left, and a right arrow button 34 to scroll to the right. A thumb representation or button 35, which is displayed on the scroll bar, shows by its location the portion of the sound representation displayed in the context of the whole sound. In other words, if the thumb 35 is at the center of the scroll bar 32, then the sound displayed is at the middle of the recording.
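
As a rough illustration of the Fast Fourier Transform option mentioned above, the sketch below computes a simple magnitude spectrogram of sampled audio that could be rendered as a visual representation of the sound. It assumes NumPy is available; the frame size, hop length and function name are illustrative choices rather than anything specified in the patent.

```python
import numpy as np


def spectrogram(samples: np.ndarray, frame_size: int = 512, hop: int = 256) -> np.ndarray:
    """Return a matrix of spectral magnitudes (one row per analysis frame,
    one column per frequency bin), suitable for rendering as an image."""
    window = np.hanning(frame_size)
    frames = [
        np.abs(np.fft.rfft(samples[i:i + frame_size] * window))
        for i in range(0, len(samples) - frame_size, hop)
    ]
    return np.array(frames)


# Example: one second of a 440 Hz tone sampled at 8 kHz.
t = np.linspace(0, 1, 8000, endpoint=False)
image = spectrogram(np.sin(2 * np.pi * 440 * t))
print(image.shape)  # (number of frames, frame_size // 2 + 1)
```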


In a usual GUI display, a horizontal title bar 37 is located on top and a vertical menu or tool bar 38 is displayed at the left side. Clicking on any of the icons displayed in the tool bar will invoke appropriate software routines to carry out the function indicated by the icon. In this particular example, the user desires to annotate the sound representation, and the icons can represent an EDIT function, or a DRAW function including certain graphic symbols to be pasted into the sound representation.


In accordance with an aspect of the present invention, the computer has been trained or customized to recognize meaningful objects and mark them. In this particular case, a meaningful object can be any sound representation above a certain amplitude, i.e., loud sounds, but the computer can choose instead certain frequencies or ranges or certain sound sequences. Marking means adding, to the data structure representing the object, a tag bit or other data representing a marked time or space position or point. If it is desired to mark a segment, meaning a temporal sequence of objects, then one tag data can represent the beginning of the marked segment, and another data bit can represent the end of the marked segment.
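
A minimal sketch of this kind of amplitude-based marking appears below, assuming the sound is available as a list of samples; the threshold rule, the sampling-rate handling and the function name are illustrative only and are not taken from the patent.

```python
from typing import List, Tuple


def mark_loud_segments(samples: List[float], rate: float,
                       threshold: float) -> List[Tuple[float, float]]:
    """Return (start, end) times, in seconds, of runs of samples whose
    absolute amplitude exceeds `threshold`, one tag pair per segment."""
    marks = []
    start = None
    for i, s in enumerate(samples):
        if abs(s) > threshold and start is None:
            start = i / rate                 # tag the beginning of a segment
        elif abs(s) <= threshold and start is not None:
            marks.append((start, i / rate))  # tag the end of the segment
            start = None
    if start is not None:                    # segment runs to the end of the data
        marks.append((start, len(samples) / rate))
    return marks


print(mark_loud_segments([0.1, 0.9, 0.8, 0.1, 0.2, 0.95], rate=1.0, threshold=0.5))
# -> [(1.0, 3.0), (5.0, 6.0)]
```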


In accordance with another aspect of the present invention, the mark is displayed on the display. In FIG. 5 a diamond mark 40 is shown to indicate the temporal position of the large amplitude sound 41. When marks 40 are displayed at the salient points, the user can quickly fast-forward through the unmarked areas and then stop at or slowly play the marked points or segments by observing the mark or by programming the computer to automatically stop at marked points.


In accordance with another aspect of the invention, the scroll bar temporal representation is modified to display the marked points or segments. In the embodiment illustrated in FIG. 5, a density representation on the scroll bar is modified, with high density regions 42 representing unmarked segments, and low density regions 43 representing marked segments. Thus, while only a portion of the whole stored sound representation may be displayed in the window shown, the scroll bar in the window shown will show the positions of the marked segments or salient points relative to the whole set of objects stored. Thus, the user can quickly navigate to the salient points by the conventional fast forward or rewind buttons to reach and observe the annotated regions. FIG. 6 illustrates the customized annotation 44 added by the user to the sound representation. These annotations are also useful for indexing, hyper-navigation, and multi-sound catalogs. It is understood that marking 43 on the scroll bar can be used separately (FIG. 6) or together with marking 40 (FIG. 5) on the document display.
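
The density coding on the scroll bar can be pictured with the hypothetical sketch below, which simply renders unmarked regions as dense strokes and marked regions as sparse strokes in a one-line text strip. The character-based rendering and all names are illustrative assumptions, not part of the patent.

```python
from typing import List, Tuple


def scroll_bar_strip(total: float, marks: List[Tuple[float, float]],
                     width: int = 60) -> str:
    """Render a one-line scroll bar: unmarked regions as dense '|' strokes,
    marked (salient) regions as sparse '.' strokes."""
    cells = []
    for px in range(width):
        t = (px + 0.5) / width * total       # time at the centre of this pixel
        in_mark = any(s <= t < e for s, e in marks)
        cells.append('.' if in_mark else '|')
    return ''.join(cells)


# A 60 s recording with salient segments at 10-15 s and 40-50 s.
print(scroll_bar_strip(60.0, [(10.0, 15.0), (40.0, 50.0)]))
```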



FIG. 7 shows an enlarged view of the scroll bar 32 indicating how high density 42 and low density 43 appearances can indicate non-marked and marked segments, respectively.


It will also be observed that the scope or range of the marked objects is visible on the scroll bar 32 by the width of low density segments 43.



FIG. 8 shows another view of a screen window with a title bar 45 and a scroll bar 46 having scroll buttons 47 and 48, and a conventional window size button 49. An audio representation is displayed with marks 50 on the display and marks 51 on the scroll bar indicated by the arrows.


In accordance with a further feature of the invention, means can be provided to execute a relativity controller function. This can be implemented automatically whenever a marking of salient points is made, or it can be implemented by, for example, pointing to the scroll bar, clicking, and then dragging the mouse perpendicular to the scroll bar, or it can be implemented, as explained later, by clicking on a special button added to the scroll bar and then dragging the mouse. In the flow charts described later, an option key is also used when clicking on the scroll bar. When the relativity controller function is activated, the computer modifies the linear temporal representation of the sound into a non-linear representation, with the non-marked segments shrunken in time and the marked segments expanded in time into the resultant empty regions and thus magnified. This is also illustrated in FIG. 8, which displays a large portion containing marked segments 51 indicated by the arrows and unmarked segments 52. If the user then plays through that portion of recorded sound, it will play at normal speed through the marked segments 51, but will fast-forward at, say, twice the normal speed through the unmarked segments 52. In the resultant display, the marked segments 51, having been expanded in time, show actual digital samples, whereas the unmarked segments 52 condense the samples into black bars.
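
The differing play speeds described above amount to choosing a playback rate per segment. The sketch below is a hypothetical illustration of that rule; the function name is made up, and the default factor of two simply follows the example in the text.

```python
from typing import List, Tuple


def play_rate(t: float, marks: List[Tuple[float, float]],
              fast_forward: float = 2.0) -> float:
    """Play marked segments at normal speed (1.0) and unmarked
    segments at `fast_forward` times normal speed."""
    return 1.0 if any(s <= t < e for s, e in marks) else fast_forward


marks = [(51.0, 58.0)]            # one marked segment, in seconds
print(play_rate(54.0, marks))     # inside the marked segment -> 1.0
print(play_rate(30.0, marks))     # unmarked material -> 2.0
```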



FIGS. 9 and 10 indicate the effects on a text document. FIG. 9 shows a day calendar 54 with a linear representation of time events from the hours of 8 a.m.-1 a.m. FIG. 10 shows the representation obtained in the invention. In this case, a vertical scroll bar 55 is at the right containing the usual scroll boxes 56 at top and bottom. In this case, the salient points as shown are determined by the user. The remaining times, being of less importance in this example, are shrunk temporally. No magnification of the salient times has occurred but now the range of times shown has expanded to 7 a.m.-2 a.m. The resultant non-linear representation is replicated in the scroll bar 55 by the density of the horizontal lines.



FIG. 11 shows another example of application of the invention to text documents. FIG. 11 depicts one page 57 of linear spatially depicted text, which would normally be displayed as a single screen with its accompanying vertical scroll bar 58 with, in this case, a relativity controller box 60. Three text lines have been highlighted 61 as salient. The thumb 62, it will be noted, has a certain size (height), showing as before one marked salient segment on the page. When the relativity controller box 60 is clicked on and dragged rightward (toward the right side of the mouse pad), reference numeral 64 now shows the resultant screen display. Note now that the non-salient (non-highlighted) parts of the original screen 57 have been shrunk or condensed, so that now not only the original marked segment 61 is visible but also a second marked segment 65. Note further that the thumb 62a has enlarged to indicate the increased number of visible salient points. Screen 64 also shows how the unmarked “insignificant” text above and below the salient segments shrinks up and disappears in the third screen 67 shown at the right when scaling perspective, as other salient segments 68 come into view. Thus, when the second screen 64 has segments 61,65 marked, and when the controller box 60 is clicked on and again dragged to the right, more of the succeeding text can be displayed as illustrated in the third screen 67. Again note the increased size of the thumb 62b. In all three cases, the scroll bar 58 illustrates at 70 the scope of the salient segments and thus the modified text representation. The text abstract 67 generated in this process could be presented in library search systems, so that the user could then more readily scan and expand the view to give more details as desired.



FIG. 12 depicts application of the invention to video, for example, with Apple QuickTime video. QuickTime allows a user to play through a video presentation with a window just like playing a video tape on a VCR, except that QuickTime also allows editing of one or more of the frames making up the video. In this case, three successive screens 72, 73 and 74 are depicted, only one of which would appear at a time in the window. Each screen has its accompanying scroll bar 75 having a relativity controller box 76 at the left end, and a thumb 77 showing the temporal position of the video frame being displayed. Note also markings 78 displayed on the scroll bar 75 to represent marked segments. This figure and FIG. 13 also illustrate user control of the magnification. Note that the relativity controller box 76 also shows different sized scroll representations to inform the user of its function. In this example, when the user clicks on the controller box 76, a cursor 80 is displayed on the screen. While holding down the mouse button, when the user moves his or her mouse upwards, represented by image 82 (which is not actually displayed), the cursor 80 appearance changes with a larger white space region 81 to indicate higher magnification. During the movement, which is reversible (up for increased magnification, down for reduced magnification), only one cursor image 79 is displayed, the full line image, representing the selected magnification level. The other grayed images are not displayed. When the user releases the mouse button, the selection of the magnification level is completed and may be stored with the data structure representation of the video if desired. In this instance, seven discrete levels of magnification are possible, but the invention also contemplates continuous change in magnification level. In the latter case, it is preferred to display a slide control with a button, movable by the user to select the desired magnification level, such as is used in the Macintosh volume control.
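
One way to derive a discrete magnification level from the vertical mouse movement described above is sketched below. The seven-level clamp follows the example in the text, while the pixels-per-level step and the function name are purely illustrative assumptions.

```python
def magnification_level(drag_pixels: int, levels: int = 7,
                        pixels_per_level: int = 20) -> int:
    """Map an upward vertical drag distance (in pixels) to one of
    `levels` discrete magnification settings, clamped at both ends.
    Level 1 is no magnification; dragging downward moves back toward 1."""
    raw = 1 + drag_pixels // pixels_per_level
    return max(1, min(levels, raw))


print(magnification_level(0))    # -> 1 (no magnification)
print(magnification_level(65))   # -> 4
print(magnification_level(500))  # -> 7 (clamped at the highest level)
```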


Note, further, in FIG. 12 how the user's marked segments 78 of video get longer and the scroller (above and below mark) gets lighter as the spacing between marks condenses and darkens when the user scales perspective by moving the mouse upwards, toward the top of the mouse pad. Also, note how the scroll bar appearance changes to reflect the size of the marks in relation to the length of the whole video.


Marking of the video can occur in the same manner as the audio, such as pressing a mouse button when the cursor is on the video to mark the beginning of a segment and releasing the button to mark the end of the segment. The resultant marks can be displayed on the video, or in the scroll bar, or on both.


In this aspect of the invention, not only is the user allowed to select and display the scope of salient segments, but, as a further feature, the user is also allowed to vary the degree of magnification of the salient segments. It will also be understood that, besides size, other scroll bar changes can be used to represent the salient segments and/or different levels of magnification. For example, different colors can be used to represent on the scroll bar the salient and non-salient segments selected at different times or by different users, and, if desired, the intensity of the color can be used to illustrate the level of magnification.


The relativity controller application program will not interfere with the normal functions available in programs such as Apple QuickTime, while providing the additional functions described above. A listing of available functions for a preferred embodiment, which is not meant to be limiting, appears below, to be used with, for example, an EDIT menu as depicted in FIG. 14.


The edit menu allows the user to perform the normal functions on displayed information, as well as the ability to remove any marks made by the user on the screen display or the scroll bar. What appears below is a description of functions available to the user to carry out the invention. One way of implementing these functions in software is shown in the program flow charts illustrated in FIGS. 15-33. The functions included are for the video of FIGS. 12 and 13, but obviously can be modified and applied to audio or text. Also, in the description of these button functions, the relativity controller has also been referred to as the scale perspective button.

    • Adjust position in movie time
      • click or drag mouse in scroll bar
    • Play & Pause movies from/to anywhere in movie (beginning <-> end)
      • press Play/Pause button
    • Mark segments of movies (while playing or paused)
      • press mouse (and hold down for duration of mark) within movie window
    • Review individual marked segments
      • click on a mark, and press Play Segment button
    • Adjust relationship between marked and unmarked segments
      • click on Scale Perspective button and drag mouse vertically (up into movie->relative scaling; down out from movie->absolute scaling)
      • option-click on slider thumb and drag mouse vertically (one step navigation to/from specific point in time & resolution)
    • Remove 1 or all marks in movie
      • click on a mark, and select Remove Marker (cmd-R) from Edit Menu
      • click on a mark, gesture (press, drag, lift) left with the mouse within the movie window
      • select Remove All Markers from Edit Menu
    • Change current marking color
      • use the Apple ‘Color’ control panel to choose Selection Color (may facilitate collaboration of groups of people (diff. color per person))
    • Copy frames or marked segments to the Macintosh clipboard & other applications
      • select Copy Frame or Copy Segment (cmd-C) from the Edit Menu
    • Save document marks and perspective
      • select Save (cmd-S) from File Menu
    • High Speed annotation:
      • adjust Scale, press Play, then mark segments
      • can also be used, by scaling, to fast-forward and stop when a marked segment is hit.


As is conventional in the Macintosh, the left button 83 (FIG. 12) on the scroll bar represents the play button which then converts to pause during play. The right button 84 can thus be used by clicking as a play segment or play mark button.


Various features of the invention as well as modifications are also indicated below:

    • Marks can be drawn inside scroll bar to keep desktop real estate usage down
    • Marks can be colored to indicate different users or states of notation
    • Scale Perspective cursor changes (while adjusting scale) to reflect size of segments in scroller (also dynamically changing)
    • Relationship between marked and unmarked segments is reflected in:
      • speed of playback
        • (unmarked segments speed up with perspective relativity)
      • size of marks in scroll bar
        • (unmarked segments shrink in proportion to play speed)
        • (marked segments enlarge to fill the remaining scroll bar space)
      • color of scroll bar area: indicating density of scale (looks like depth of field)
        • (unmarked segments get darker in proportion to size in scroll bar & speed)
        • (marked segments get lighter in proportion to size in scroll bar)
    • At more relative scale, the user has higher resolution access to time in that area
      • (moving the scroll thumb passes through fewer frames per pixel)
    • Overlapping marked areas join to form single marks (with 1 scope & 1 color if desired; a sketch of this joining rule follows this list)
      • if new mark falls between original startTime and endTime, then
        • newColor=¼(new)+¾(orig.)
      • if new mark overlaps original startTime or endTime, then
        • newColor=(orig.+new)/2
      • if new mark overlaps both original startTime and endTime, then
        • newColor=¾(new)+¼(orig.)
    • ‘Save’ menu item is enabled when user modifies marks or scale
      • (perspective is part of the document)
    • Marks & scale are saved inside movie files as QuickTime ‘user data’
      • Mark data consists of scope (startTime, endTime) and color when chosen (RGB)
    • Gives audio feedback when removing marks from segments
    • Marked movies have unique ‘stamped movie’ icon on Desktop
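
The joining and color-blending rules listed above for overlapping marks can be sketched as follows. This is a hypothetical reading of the ¼/¾ weightings, with illustrative names (Mark, blend, join) that do not come from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

RGB = Tuple[float, float, float]


@dataclass
class Mark:
    start: float
    end: float
    color: RGB


def blend(a: RGB, b: RGB, wa: float) -> RGB:
    """Weighted mix of two RGB colors: wa parts of `a`, (1 - wa) parts of `b`."""
    return tuple(wa * ca + (1 - wa) * cb for ca, cb in zip(a, b))


def join(orig: Mark, new: Mark) -> Mark:
    """Join an overlapping new mark into an existing one, following the
    weighting rules listed above (new inside -> 1/4 new, new straddling
    one end -> 1/2 new, new covering both ends -> 3/4 new)."""
    inside = new.start >= orig.start and new.end <= orig.end
    covers = new.start <= orig.start and new.end >= orig.end
    if inside:
        color = blend(new.color, orig.color, 0.25)
    elif covers:
        color = blend(new.color, orig.color, 0.75)
    else:                      # new mark overlaps exactly one endpoint
        color = blend(new.color, orig.color, 0.5)
    return Mark(min(orig.start, new.start), max(orig.end, new.end), color)


merged = join(Mark(10.0, 20.0, (1.0, 0.0, 0.0)),   # original red mark
              Mark(12.0, 18.0, (0.0, 0.0, 1.0)))   # new blue mark inside it
print(merged)  # scope 10.0-20.0, color 1/4 blue + 3/4 red
```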


Also listed below is a summary of a few data types with examples of how the invention can be applied:













DATA TYPE:      APPLICATIONS:
Schedules       personal profile based time scaling
Sound           annotation & editing
Video           annotation & editing
Text            data retrieval & abstract searching
CAD & PICT      scaling space to dimensions of experiential perspective


Implementation of the various forms of the invention will be evident to those skilled in the art. Reference is made to “Inside Macintosh” (published by Addison-Wesley), which provides the code for developers for various kinds of interface constructs, such as scroll bars, control bars, slide controls, and boxes used therein, as well as how to display them in different colors or appearances, how to invoke program routines when a user clicks on a box or icon, and how to change the appearance of an icon when a routine is executed. See also U.S. Pat. No. 4,931,783, which describes operation of a system with the Apple Graphical User Interface, whose contents are herein incorporated by reference.


To further assist those skilled in the art, FIGS. 15-33 are flow charts of one form of program suitable to implement a user's selecting and displaying, in accordance with the invention, desired salient segments of a video presentation.


The person skilled in the art will have no trouble in understanding and implementing the flow charts illustrated. Virtually all of the statements printed in the flow chart boxes are understandable, and no need exists to repeat the text herein. However, certain statements require some explanation. The statements in the blocks indicated by double lines, such as block 85 in FIG. 17, represent calls to subroutines, as labeled, that are detailed in another of the figures. Thus, the Track Thumb routine 85 flow chart is shown in FIG. 29. In the labels, “button” refers to the mouse button; a “pressed” button, as in the Macintosh, changes its 3-d appearance to appear pressed; an “unpressed” button is the reverse. “Play segment” refers to the right button 84 in FIG. 12. The examples given are with a colored scroll bar to represent the marked segments. “Zoom” designates magnification level. “Stamp cursor” means that when the screen cursor is moved within the movie displays, the cursor shape changes to resemble a rubber hand stamp, indicating to the user that by clicking, he or she can mark (stamp) the document to indicate a salient point. “Play speed” refers to play speed of the video. “Update scroller” means to redo the scroll bar to show user selections. Tests are indicated in the boxes by question marks (?); Y or N indicates the test was or was not successful.


To summarize some important aspects of the invention:

    • Linear density of scroll bar (i.e. ruler) can be varied with document salience density;
    • Scrolling rate can be varied with document content density;
    • Amount of document in window can be varied with document salience density;
    • The zoom control function can be implemented by clicking the scale perspective button and dragging perpendicular to scroll bar to zoom between perspectives:
      • drag out from document—>absolute scaling and
      • drag towards document—>relative scaling.


As a further alternative, the user can press an option key and click on the scroll bar, which will jump the thumb to the pointer position and simultaneously allow the user to scroll by moving the mouse horizontally and to change scale or magnification by moving the mouse perpendicularly (vertically) to the scroller. These changes will be visible on the screen display as well as on the scroll bar.


Since the program of the invention runs as an application, clicking on the document display can readily be used to add to the document data structure in memory the time or spatial position of the salient marked display portion when/where the pointer rested.


Marking data structures will be evident to those skilled in the art. For text documents, adding a mark is generally similar to adding a formatting or printing code to the stored text. Marking video is similar to text marking, except that remembering character position is replaced by remembering time position and storing it in the user data portion of the movie.
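
As a hedged illustration of how a mark's scope and color might be carried with the document, the sketch below serializes marks to a small byte record and back. The JSON layout and the function names are illustrative assumptions and are not the actual QuickTime user-data format.

```python
import json
from typing import List, Tuple

Mark = Tuple[float, float, Tuple[int, int, int]]  # (startTime, endTime, RGB)


def marks_to_user_data(marks: List[Mark]) -> bytes:
    """Serialize marks as a compact record suitable for attaching to a
    movie file's user-data area (illustrative layout, not QuickTime's)."""
    return json.dumps(
        [{"start": s, "end": e, "rgb": list(rgb)} for s, e, rgb in marks]
    ).encode("utf-8")


def user_data_to_marks(blob: bytes) -> List[Mark]:
    """Inverse of marks_to_user_data."""
    return [(d["start"], d["end"], tuple(d["rgb"])) for d in json.loads(blob)]


data = marks_to_user_data([(12.0, 20.5, (255, 0, 0))])
print(user_data_to_marks(data))  # -> [(12.0, 20.5, (255, 0, 0))]
```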


As further marking alternatives, for video, the mouse button for marking can be held depressed while the video plays and released to define a marking point or segment. For text, the salient text can be highlighted and a menu dropped to select a marking function.


Although there have been described what are at present considered to be the preferred embodiments of the invention, it will be understood that the invention may be embodied in other specific forms without departing from the essential characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative, and not restrictive. The scope of the invention is indicated by the appended claims rather than by the foregoing description.

Claims
  • 1. A video player for playing a read-only video content presentation comprising a plurality of video segments associable with indicia of which segment(s) are potentially salient wherein at least one salient segment is determined to be salient and at least one non-salient segment is not determined to be salient; the video player capable of: simultaneously reducing the allocation of time for playing all non-salient segment(s) without modifying any content of the non-salient segment(s) and without materially changing the allocation of time for playing any salient segment(s); and presenting for display a time bar that indicates a shorter presentation playtime in response to such reducing the allocation of time for playing all non-salient segment(s).
  • 2. The video player of claim 1 further capable of presenting for display a user interface for user input in determining which segment(s) are salient.
  • 3. The video player of claim 1 further capable of presenting for display a user interface for user control in reducing the allocation of time for the non-salient segment(s).
  • 4. The video player of claim 1 wherein no time is allocated to the display of the non-salient segment(s).
  • 5. The video player of claim 1 wherein the non-salient segment(s) are skipped during play of the presentation.
  • 6. The video player of claim 1 wherein only the salient segment(s) appear to be played in the presentation.
  • 7. The video player of claim 1 further capable of presenting for display the time bar with indicia of the current play position in the presentation.
  • 8. The video player of claim 7 further capable of presenting for display the time bar with indicia of the location of the salient segment(s) in the presentation.
  • 9. The video player of claim 7 further capable of presenting for display the time bar with indicia of the location of the non-salient segment(s) in the presentation.
  • 10. The video player of claim 7 further capable of presenting for display the time bar with both indicia of the location of the salient segment(s) and indicia of the location of the non-salient segment(s) in the presentation.
  • 11. The video player of claim 1 further capable of reading video segment(s) or video segment associated indicia of salience from RAM memory.
  • 12. The video player of claim 1 further capable of fetching video segment(s) or video segment associated indicia of salience from disk(s).
  • 13. The video player of claim 1 further capable of receiving video segment(s) or video segment associated indicia of salience from a remote large information space.
  • 14. A video player for playing a read-only video content presentation comprising a plurality of video segments associable with indicia of which segment(s) are potentially salient wherein at least one salient segment is determined to be salient and at least one non-salient segment is not determined to be salient; the video player capable of: providing a user with an affordance to elect to reduce the allocation of time for playing all non-salient segment(s); simultaneously reducing the allocation of time for playing all non-salient segment(s) without modifying any content of the non-salient segment(s) and without materially changing the allocation of time for playing any salient segment(s); and presenting for display a time bar that indicates a shorter presentation playtime in response to such reducing the allocation of time for playing all non-salient segment(s).
  • 15. The video player of claim 14 further capable of presenting for display a user interface for user input in determining which segment(s) are salient.
  • 16. The video player of claim 14 further capable of presenting for display a user interface for user control in reducing the allocation of time for the non-salient segment(s).
  • 17. The video player of claim 14 wherein no time is allocated to the display of the non-salient segment(s).
  • 18. The video player of claim 14 wherein the non-salient segment(s) are skipped during play of the presentation.
  • 19. The video player of claim 14 wherein only the salient segment(s) appear to be played in the presentation.
  • 20. The video player of claim 14 further capable of presenting for display the time bar with indicia of the current play position in the presentation.
  • 21. The video player of claim 20 further capable of presenting for display the time bar with indicia of the location of the salient segment(s) in the presentation.
  • 22. The video player of claim 20 further capable of presenting for display the time bar with indicia of the location of the non-salient segment(s) in the presentation.
  • 23. The video player of claim 20 further capable of presenting for display the time bar with both indicia of the location of the salient segment(s) and indicia of the location of the non-salient segment(s) in the presentation.
  • 24. The video player of claim 14 further capable of reading video segment(s) or video segment associated indicia of salience from RAM memory.
  • 25. The video player of claim 14 further capable of fetching video segment(s) or video segment associated indicia of salience from disk(s).
  • 26. The video player of claim 14 further capable of receiving video segment(s) or video segment associated indicia of salience from a remote large information space.
  • 27. A video player for playing a read-only video content presentation comprising a plurality of video segments to which indicia of which segment(s) are potentially salient can be associated, wherein at least one salient segment is determined to be salient and at least one non-salient segment is not determined to be salient; the video player capable of: allowing the appearance of only the salient segment(s), whereby the video playtime is shortened without modifying any content of the non-salient segment(s), during play of the presentation; and presenting for display a time bar that indicates a shorter video playtime in response to such shortening of the video playtime.
  • 28. The video player of claim 27 further capable of presenting for display a user interface for user input in determining which segment(s) are salient.
  • 29. The video player of claim 27 wherein the non-salient segment(s) are skipped during play of the presentation.
  • 30. The video player of claim 27 further capable of presenting for display the time bar with indicia of the current play position in the presentation.
  • 31. The video player of claim 30 further capable of presenting for display the time bar with indicia of the location of the salient segment(s) in the presentation.
  • 32. The video player of claim 30 further capable of presenting for display the time bar with indicia of the location of the non-salient segment(s) in the presentation.
  • 33. The video player of claim 30 further capable of presenting for display the time bar with both indicia of the location of the salient segment(s) and indicia of the location of the non-salient segment(s) in the presentation.
  • 34. The video player of claim 27 further capable of reading video segment(s) or video segment associated indicia of salience from RAM memory.
  • 35. The video player of claim 27 further capable of fetching video segment(s) or video segment associated indicia of salience from disk(s).
  • 36. The video player of claim 27 further capable of receiving video segment(s) or video segment associated indicia of salience from a remote large information space.
Parent Case Info

This is a continuation of prior application Ser. No. 09/947,196 filed Sep. 4, 2001, which is a continuation of prior application Ser. No. 09/451,594, filed Nov. 30, 1999, now U.S. Pat. No. 6,335,730, which is a continuation of prior application Ser. No. 08/844,466 filed Apr. 18, 1997, now U.S. Pat. No. 6,177,938, which is a continuation of prior application Ser. No. 07/990,339 filed Dec. 14, 1992 now U.S. Pat. No. 5,623,588.

US Referenced Citations (186)
Number Name Date Kind
4305131 Best Dec 1981 A
4333152 Best Jun 1982 A
4415271 Mori Nov 1983 A
4445187 Best Apr 1984 A
4520404 Von Kohorn May 1985 A
4569026 Best Feb 1986 A
4591840 Curtis et al. May 1986 A
4645238 Vincent et al. Feb 1987 A
4685003 Westland Aug 1987 A
4695953 Blair et al. Sep 1987 A
4711543 Blair et al. Dec 1987 A
4754342 Duffy Jun 1988 A
4755811 Slavin et al. Jul 1988 A
4779252 Custers et al. Oct 1988 A
4780839 Hirayama Oct 1988 A
4786967 Smith, III et al. Nov 1988 A
4790028 Ramage Dec 1988 A
4800379 Yeomans Jan 1989 A
4875096 Baer et al. Oct 1989 A
4888638 Bohn Dec 1989 A
4930160 Vogel May 1990 A
4931783 Atkinson Jun 1990 A
5023727 Boyd et al. Jun 1991 A
5023851 Murray et al. Jun 1991 A
RE33662 Blair et al. Aug 1991 E
5039937 Mandt et al. Aug 1991 A
5050961 Venolia Sep 1991 A
5055924 Skutta Oct 1991 A
5076584 Openiano Dec 1991 A
5101364 Davenport et al. Mar 1992 A
5107343 Kawai Apr 1992 A
5109482 Bohrman Apr 1992 A
5122886 Tanaka Jun 1992 A
5129057 Strope et al. Jul 1992 A
5146212 Venolia Sep 1992 A
5155591 Wachob Oct 1992 A
5159668 Kaasila Oct 1992 A
5172111 Olivo, Jr. Dec 1992 A
5175631 Juri et al. Dec 1992 A
5204969 Capps et al. Apr 1993 A
5220540 Nishida et al. Jun 1993 A
5247438 Subas et al. Sep 1993 A
5261031 Saito Nov 1993 A
5313297 Fukui et al. May 1994 A
5333247 Gest et al. Jul 1994 A
5339391 Wroblewski et al. Aug 1994 A
5341466 Perlin et al. Aug 1994 A
5359712 Cohen et al. Oct 1994 A
5365360 Torres Nov 1994 A
5369570 Parad Nov 1994 A
5371532 Gelman et al. Dec 1994 A
5371846 Bates Dec 1994 A
5386493 Degen et al. Jan 1995 A
5388197 Rayner Feb 1995 A
5422468 Abecassis Jun 1995 A
5428731 Powers, III Jun 1995 A
5434678 Abecassis Jul 1995 A
5434954 Kawauchi et al. Jul 1995 A
5438356 Ushiki et al. Aug 1995 A
5442744 Piech et al. Aug 1995 A
5446833 Miller et al. Aug 1995 A
5446882 Capps et al. Aug 1995 A
5466882 Lee Nov 1995 A
5479600 Wroblewski et al. Dec 1995 A
5510808 Cina, Jr. et al. Apr 1996 A
5513306 Mills et al. Apr 1996 A
5524195 Clanton, III et al. Jun 1996 A
5524637 Erickson Jun 1996 A
5532715 Bates et al. Jul 1996 A
5537141 Harper et al. Jul 1996 A
5553221 Reimer et al. Sep 1996 A
5557724 Sampat et al. Sep 1996 A
5559949 Reimer et al. Sep 1996 A
5574567 Cookson et al. Nov 1996 A
5579463 Takano et al. Nov 1996 A
5586216 Degen et al. Dec 1996 A
5589945 Abecassis Dec 1996 A
5596705 Reimer et al. Jan 1997 A
5598276 Cookson et al. Jan 1997 A
5607356 Schwartz Mar 1997 A
5610653 Abecassis Mar 1997 A
5623588 Gould Apr 1997 A
5623589 Needham et al. Apr 1997 A
5630006 Hirayama et al. May 1997 A
5634849 Abecassis Jun 1997 A
5636036 Ashbey Jun 1997 A
5644507 Ostrover et al. Jul 1997 A
5648918 Hubbard Jul 1997 A
5664046 Abecassis Sep 1997 A
5684918 Abecassis Nov 1997 A
5692212 Roach et al. Nov 1997 A
5696869 Abecassis Dec 1997 A
5696905 Reimer et al. Dec 1997 A
5708767 Yeo et al. Jan 1998 A
5708845 Wistendahl et al. Jan 1998 A
5715400 Reimer et al. Feb 1998 A
5717814 Abecassis Feb 1998 A
5724472 Abecassis Mar 1998 A
5737479 Fujinami Apr 1998 A
5737527 Shiels et al. Apr 1998 A
5737552 Lavallee et al. Apr 1998 A
5745710 Clanton, III et al. Apr 1998 A
5751953 Shiels et al. May 1998 A
5754770 Shiels et al. May 1998 A
5771334 Yamauchi et al. Jun 1998 A
5774666 Portuesi Jun 1998 A
5781730 Reimer et al. Jul 1998 A
5781886 Tsujiuchi Jul 1998 A
5799280 Degen et al. Aug 1998 A
5805806 McArthur Sep 1998 A
5815671 Morrison Sep 1998 A
5828788 Chiang et al. Oct 1998 A
5828995 Satyamurti et al. Oct 1998 A
5841979 Schulhof et al. Nov 1998 A
5848934 Shiels et al. Dec 1998 A
5861881 Freeman et al. Jan 1999 A
5864868 Contois Jan 1999 A
5872927 Shiels et al. Feb 1999 A
5892507 Moorby et al. Apr 1999 A
5892966 Petrick et al. Apr 1999 A
5905845 Okada et al. May 1999 A
5907658 Murase et al. May 1999 A
5913013 Abecassis Jun 1999 A
5915067 Nonomura et al. Jun 1999 A
5936625 Kahl et al. Aug 1999 A
5945998 Eick Aug 1999 A
5953485 Abecassis Sep 1999 A
5973663 Bates et al. Oct 1999 A
5987211 Abecassis Nov 1999 A
5999173 Ubillos Dec 1999 A
5999696 Tsuga et al. Dec 1999 A
5999698 Nakai et al. Dec 1999 A
6002833 Abecassis Dec 1999 A
6006273 Ostrover et al. Dec 1999 A
6011895 Abecassis Jan 2000 A
6018612 Thomason et al. Jan 2000 A
6026446 Ostrover et al. Feb 2000 A
6038367 Abecassis Mar 2000 A
6061062 Venolia May 2000 A
6065042 Reimer et al. May 2000 A
6067401 Abecassis May 2000 A
6072934 Abecassis Jun 2000 A
6091886 Abecassis Jul 2000 A
6108281 Tozaki et al. Aug 2000 A
6111567 Savchenko et al. Aug 2000 A
6128712 Hunt et al. Oct 2000 A
6144375 Jain et al. Nov 2000 A
6148140 Okada et al. Nov 2000 A
6151444 Abecassis Nov 2000 A
6175840 Chen et al. Jan 2001 B1
6177938 Gould Jan 2001 B1
6181332 Salahshour et al. Jan 2001 B1
6185365 Murase et al. Feb 2001 B1
6192340 Abecassis Feb 2001 B1
6208805 Abecassis Mar 2001 B1
6215491 Gould Apr 2001 B1
6219052 Gould Apr 2001 B1
6222925 Shiels et al. Apr 2001 B1
6260194 Shiels et al. Jul 2001 B1
6269216 Abecassis Jul 2001 B1
6289165 Abecassis Sep 2001 B1
6304715 Abecassis Oct 2001 B1
6335730 Gould Jan 2002 B1
6336002 Yamauchi et al. Jan 2002 B1
6343298 Savchenko et al. Jan 2002 B1
6356707 Murase et al. Mar 2002 B1
6366732 Murase et al. Apr 2002 B1
6370199 Bock et al. Apr 2002 B1
6377996 Lumelsky et al. Apr 2002 B1
6385388 Lewis et al. May 2002 B1
6393158 Gould et al. May 2002 B1
6408128 Abecassis Jun 2002 B1
6446261 Rosser Sep 2002 B1
6457025 Judson Sep 2002 B2
6463207 Abecassis Oct 2002 B1
6490405 Speed et al. Dec 2002 B1
6501515 Iwamura Dec 2002 B1
6553178 Abecassis Apr 2003 B2
6580870 Kanazawa et al. Jun 2003 B1
6615270 Gould et al. Sep 2003 B2
6621980 Gould et al. Sep 2003 B1
6714723 Abecassis Mar 2004 B2
7054547 Abecassis May 2006 B1
7286747 Lewis et al. Oct 2007 B1
7467218 Gould et al. Dec 2008 B2
7890648 Gould et al. Feb 2011 B2
Foreign Referenced Citations (5)
Number Date Country
96122685 May 2003 CN
0346979 Dec 1989 EP
0677842 Oct 1995 EP
0814475 Dec 1997 EP
08278882 Sep 1998 JP
Non-Patent Literature Citations (66)
Entry
Gould, Eric J., “Relativity Controller, Reflecting User Perspective in Document Spaces” Inter CHI '93 Adjunct Proceedings.
IBM “Systems Application Architecture, Common User Access Advanced Interface Design Guide” IBM © 1989.
Mackinlay, Jack, “The perspective Wall: Detail and Context Smoothly Integrated” Xerox 1991.
Furnas, George, “Generalized Fisheye View” Bell Communications © 1986.
Mills, et al, “A Magnifier Tool for Video Data” ACM © 1992.
Degen, et al, “Working with Audio: Integrating Personal Tape Recorders and Desktop Computers” ACM © 1992.
Chimera, Richard, “Value Bars: An Information Visualization and Navigation Tool for Multi-Attribute Listings” CHI 1992.
“Relativity Controller: Reflecting User Perspective in Document Spaces,” Gould, Eric J., InterCHI'93 Adjunct Proceedings (1993) [01-Gould.pdf].
“IBM Systems Application Architecture: Common User Access Advanced Interface Design Guide,” IBM (1989).
“The Perspective Wall: Detail and Context Smoothly Integrated,” Mackinlay, Jack, Xerox (1991) [03-Mackinlay.pdf].
“Generalized Fisheye Views,” Furnas, George, Bell Communications (1986) [04-Furnas.pdf].
“A Magnifier Tool for Video Data,” Mills, Michael, et al., ACM (1992) [05-Mills.pdf].
“Working with Audio: Integrating Personal Tape Recorders and Desktop Computers,” Degen, Leo, et al., ACM (1992) [06-Degen.pdf].
“Value Bars: An Information Visualization and Navigation Tool for Multi-Attribute Listings,” Chimera, Richard, CHI'92 (1992) [07-Chimera.pdf].
“Philips CDV 495/CDV496 Operating Instructions” [08-CDV496—OM—PHILIPS—EN.pdf].
“Toshiba DVD Video Player SD3107 Owner's Manual,” [09-sd-3107—om—e.pdf].
“Full Service Network,” Wikipedia <http://en.wikipedia.org/wiki/Full—Service—Network> [10-FSN-01-Full Service Network—Wikipedia, the free encyclopedia.pdf].
“Full Service Network (FSN) in Orlando Florida,” <http://www.ust.hk/˜webiway/content/USA/Trial/fsn.html> [11-FSN-02-Full Service Network.pdf].
“Time Warner Cable's Full Service Network Unveils New Navigator,” Business Wire, Tuesday, Apr. 30, 1996 <http://www.allbusiness.com/media-telecommunications/telecommunications/7225702-1.html> [12-FSN-03-Time Warner Cable's Full Service Network.pdf].
“Time-Warner's Home of the 21st Century,” by Davis, Arnold, Educom Review vol. 31 No. 1, Jan./Feb. 1996, <http://net.educause.edu/apps/er/review/reviewArticles/31130.html> [13-FSN-04-Educom Review.pdf].
“Touch of a Button,” Spartanburg Herald-Journal Home Section C p. 1 and continuation page <http://news.google.com/newspapers?nid=1876&dat=19950124&id=9LMeAAAAIBAJ&sjid=Ss8EAAAAIBAJ&pg=4619,2458911> [14-FSN-05-G-SHJ-950124.pdf].
“Design Issues for Interactive Television Systems,” by B. Furht, D. Kalra, F. Kitson, A.A. Rodriguez, and W.E. Wall, IEEE Computer, vol. 28, No. 5, May 1995, pp. 25-39, <http://www.cse.fau.edu/˜borko/Paper—Computer1995.pdf> [15-Furht-Paper—Computer1995.pdf].
“Movie-maps: An application of the optical videodisc to computer graphics,” Lippman, A., Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques, Seattle, Washington, pp. 32-42 (1980) [16-MovieMaps.pdf].
“New Orleans in Transition: The Interactive Delivery of a Cinematic Case Study,” Davenport, G. (Aug. 1987) (Revised from remarks given at the International Congress for Design Planning and Theory, Park Plaza Hotel, Boston, 1987) [17-NewOrleans.pdf].
“Creating and Viewing the Elastic Charles—a Hypermedia Journal,” Brøndmo, H.; Davenport, G. (1989) (Revised from document published in the Hypertext II Conference Proceedings, York, England, Jul. 1989) [18-ElasticCharles.pdf].
“Orchestrating Digital Micromovies,” Davenport, G. et al., Leonardo, vol. 26 No. 4, pp. 283-288, (1993) [19-Micromovies.pdf].
“HyperCafe: Narrative and Aesthetic Properties of Hypervideo,” Sawhney et al., Hypertext '96 Proceedings of the Seventh ACM Conference on Hypertext, (1996) [20-HyperCafe.pdf].
“HotVideo: The Cool Way to Link,” IBM Research Magazine, vol. 3 (1997) [21-HotVideo.pdf].
“Adding Hyperlinks to Digital Television,” Bove, V. Michael, Jr. et al., Proc. SMPTE 140th Technical Conference, (1998) [22-Hyperlinks.pdf].
“Construction of Interactive Movie System for Multi-Person Participation,” Nakatsu, Ryohei, et al., ICMCS 1998: 228-232 (1998) [23-InteractiveMovie.pdf].
Application File for U.S. Appl. No. 07/990,339, filed Dec. 14, 1992, US Pat. 5,623,588 now abandoned.
Application File for U.S. Appl. No. 08/844,466, filed Apr. 16, 1997, US Pat. 6,177,938.
Application File for U.S. Appl. No. 09/451,595, filed Nov. 30, 1999, US Pat. 6,219,052.
Application File for U.S. Appl. No. 09/452,275, filed Nov. 30, 1999, US Pat. 6,215,491.
Application File for U.S. Appl. No. 09/451,594, filed Nov. 30, 1999, US Pat. 6,335,730.
Application File for U.S. Appl. No. 09/947,196, filed Sep. 4, 2001.
Application File for U.S. Appl. No. 11/978,965, filed Oct. 30, 2007.
Application File for U.S. Appl. No. 11/978,964, filed Oct. 30, 2007.
Application File for U.S. Appl. No. 11/978,945, filed Oct. 30, 2007.
Application File for U.S. Appl. No. 12/248,931, filed Oct. 10, 2008.
Application File for U.S. Appl. No. 09/298,336, filed Apr. 23, 1999, US Pat. 6,393,158.
Application File for U.S. Appl. No. 10/107,945, filed Mar. 26, 2002, US Pat. 6,615,270.
Application File for U.S. Appl. No. 10/603,581, filed Jun. 24, 2003, US Pat. 7,467,218.
Application File for U.S. Appl. No. 11/978,966, filed Oct. 30, 2007, US Pat. 7,890,648.
Application File for U.S. Appl. No. 11/978,967, filed Oct. 30, 2007, abandoned.
Application File for U.S. Appl. No. 11/978,954, filed Oct. 30, 2007, abandoned.
Application File for U.S. Appl. No. 12/941,830, filed Nov. 8, 2010.
Application File for U.S. Appl. No. 09/298,681, filed Apr. 23, 1999, US Pat. 6,621,980.
Application File for U.S. Appl. No. 09/298,586, filed Apr. 23, 1999, abandoned.
Application File for U.S. Appl. No. 10/360,271, filed Feb. 7, 2003, abandoned.
Lavallee Application File for U.S. Appl. No. 08/508,971, filed Jul. 28, 1995, US Pat. 5,737,552.
Daniels Application File for U.S. Appl. No. 08/038,240, filed Mar. 29, 1993.
Daniels Application File for U.S. Appl. No. 08/306642, filed Sep. 15, 1994.
Daniels Provisional Application File for U.S. Appl. No. 60/014,959, filed Apr. 8, 1996.
Daniels Application File for U.S. Appl. No. 08/641,517, filed May 1, 1996.
Daniels Application File for U.S. Appl. No. 08/900,417, filed Jul. 25, 1997.
Daniels Application File for U.S. Appl. No. 09/992,190, filed Nov. 16, 2001, US Pat. 6,973,669.
Daniels Application File for U.S. Appl. No. 11/250,807, filed Oct. 14, 2005, US Pat. 7,437,751.
Daniels Application File for U.S. Appl. No. 12/246,161, filed Oct. 6, 2008.
Re-Examination File for Control No. 90/011,365 filed Dec. 3, 2010.
Re-Examination File for Control No. 95/001,504 filed Dec. 9, 2010.
Re-Examination File for Control No. 95/001,506 filed Feb. 15, 2011.
Docket Sheet from Civil Action 1:10-cv-00319-SS dated May 13, 2011.
Documents from Civil Action 1:10-cv-00319-SS as of May 13, 2011.
Docket Sheet from Civil Action 1:10-cv-00533-SS dated May 13, 2011.
Documents from Civil Action 1:10-cv-00533-SS as of May 13, 2011.
Related Publications (1)
Number Date Country
20080184145 A1 Jul 2008 US
Continuations (4)
Number Date Country
Parent 09947196 Sep 2001 US
Child 11978965 US
Parent 09451594 Nov 1999 US
Child 09947196 US
Parent 08844466 Apr 1997 US
Child 09451594 US
Parent 07990339 Dec 1992 US
Child 08844466 US