1. Field of Invention
This invention is directed to systems and methods for individuals and groups to index, locate and review recently recorded events.
2. Description of Related Art
In interactions among people, conversation often ranges widely and previous topics and ideas are often forgotten. Similarly, a person performing a task can make a recording of the task to allow that person's performance to be reviewed. It is possible to make audio or video recordings as records of these interactions and actions.
However, because of the linear nature of those records, it is difficult to refer back to previously-recorded portions of a recording of an on-going activity, especially if the user would like to immediately include a portion of that recording at a later point in the recording of that same on-going activity.
Searching a traditional audio or video tape, even if it has been produced with a time code, requires a secondary index, such as meeting notes, to find and replay those recordings within the immediate context of a meeting.
Additionally, the fine-grained synchronization of ideas and activity within a group may itself prompt the replaying of some earlier recorded material. For example, someone may ask “Where were we?”, or a new participant may enter the room and join the discussion, thus requiring an explanation of the current state of the discussion.
This invention provides systems and methods for using structured representations to index recordings of activity.
This invention separately provides systems and methods that allow users to index, locate and review recorded events.
This invention separately provides systems and methods that enable a user to replay a previously-recorded portion of a recording and have that replayed portion copied or added to the recording at a later point.
The systems and methods of this invention use an index and a digital audio and/or video recording to provide access to recent digitally recorded material. For example, users initiate recordings of audio and/or video data, make an index, i.e., notes or an agenda describing the contents of the recordings, and use that index to randomly access the recordings for playback.
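By way of illustration only, the following minimal sketch shows how such an index of notes or agenda items might map onto time offsets in a recording to support random access. The names `IndexEntry` and `RecordingIndex`, and the use of plain time offsets in seconds, are assumptions for the sketch and are not part of this invention.

```python
# A minimal sketch, not the patented implementation: an index that maps
# notes or agenda items to offsets in a recording, assuming the recording
# is addressed by a time offset in seconds.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class IndexEntry:
    label: str                           # agenda item or note text
    start_offset: float                  # seconds from the start of the recording
    end_offset: Optional[float] = None   # filled in when the next item begins


@dataclass
class RecordingIndex:
    entries: List[IndexEntry] = field(default_factory=list)

    def mark(self, label: str, offset: float) -> None:
        """Close the previous entry, if still open, and start a new one at offset."""
        if self.entries and self.entries[-1].end_offset is None:
            self.entries[-1].end_offset = offset
        self.entries.append(IndexEntry(label, offset))

    def locate(self, label: str) -> Optional[IndexEntry]:
        """Return the indexed segment for a given note or agenda item, if any."""
        for entry in self.entries:
            if entry.label == label:
                return entry
        return None


if __name__ == "__main__":
    index = RecordingIndex()
    index.mark("1. Budget review", 0.0)
    index.mark("2. Schedule update", 312.5)
    print(index.locate("1. Budget review"))
```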
The systems and methods of this invention additionally provide an interface that correlates events and the corresponding notes with the recordings. This interface uses visual representations that are part of the content of meetings as the basis of marking and identifying events.
Furthermore, the systems and methods of this invention provide an environment in which a user who is recording and indexing an activity can replay a previous portion of the recording of that same activity and have the previously recorded section re-recorded into the recording at the current indexing point.
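The following simplified sketch illustrates one way such a replay could be folded back into the recording at the current point. It assumes, purely for illustration, that a recording can be treated as a flat sequence of samples and that a previously-recorded portion is identified by a (start, end) range; it is not the claimed implementation.

```python
# A simplified sketch, not the claimed implementation: replaying an earlier
# indexed segment while recording causes that segment to be copied into the
# recording at the current point. Here the "recording" is just a list of
# samples and a segment is a (start, end) sample range; all names are
# illustrative.
def replay_into_recording(recording, segment):
    """Append a copy of an earlier segment at the current end of the recording.

    recording -- mutable sequence of samples recorded so far
    segment   -- (start, end) sample indices of the previously recorded portion
    """
    start, end = segment
    replayed = list(recording[start:end])
    recording.extend(replayed)              # the replay itself becomes part of the record
    return len(recording) - len(replayed)   # offset where the copy now begins


if __name__ == "__main__":
    rec = list(range(10))                   # stand-in for recorded audio samples
    at = replay_into_recording(rec, (2, 5))
    print(rec, "copy starts at sample", at)
```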
In an exemplary scenario, a meeting has been called to begin at a predetermined time. The agenda includes five items, and the agenda has been input into a system according to this invention and is displayed to the participants. During the meeting, as each new agenda item is begun, the moderator “checks off” that new agenda item on the displayed agenda. This creates an index into the recording of the meeting, indicating when the checked agenda item was being discussed. A participant is called out of the meeting, and thus misses a couple of agenda items. Following the meeting, the participant can select the agenda items that were missed, and the portions of the recording covering those selected agenda items are replayed.
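This scenario might be rendered, for illustration only, as a mapping from checked-off agenda items to recorded intervals. The agenda labels and times below are hypothetical and are not drawn from any actual meeting.

```python
# A toy rendering of the scenario above, with hypothetical agenda labels and
# times: agenda items are checked off against the meeting clock, and a
# participant later retrieves the recorded intervals for the items missed.
checkoffs = {                      # agenda item -> (start, end) in seconds
    "1. Introductions": (0, 180),
    "2. Budget":        (180, 900),
    "3. Schedule":      (900, 1500),
    "4. Staffing":      (1500, 2100),
    "5. Wrap-up":       (2100, 2400),
}


def segments_to_replay(missed_items):
    """Return the recorded intervals for the selected (missed) agenda items."""
    return [(item, checkoffs[item]) for item in missed_items if item in checkoffs]


if __name__ == "__main__":
    print(segments_to_replay(["3. Schedule", "4. Staffing"]))
```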
While this is a simple example, the idea has a number of powerful variations. Specifically, the index or “agenda items” might be elements of diagrams or process representations. The representations used for the index, and for retrieving segments of the recording, need not contain much detail. The “agenda items” can also be used to guide participants through structured recording activities.
Continuing with the above example, as each task item on the agenda comes up for review, that agenda item is “checked.” Following the activity, the various participants have a record of what was said about the status of tasks that may be of importance to them.
By way of another example, the systems and methods of this invention could be employed by someone repairing a complex piece of equipment. This piece of equipment has very high tolerances but the service technician may be unfamiliar with the machine. As the machine is disassembled, the technician's actions are recorded. As sub-assemblies are removed, the technician clicks on the image of the sub-assembly in the documentation on a display to create an index into the recording. During re-assembly, the technician encounters a part that does not seem to fit. The technician can then click on the image in the display. The previously-recorded removal sequence performed by that technician is replayed, to show that technician how he removed the part and thus help him replace the part.
In a further exemplary scenario, an insurance claims adjuster can use the systems and methods of this invention when examining an automobile. The claims adjuster has a video camera. The system displays an image of a claim form and highlights the sections that the adjuster should record with the video camera. The adjuster clicks past the parts that are irrelevant to this particular claim and continues on to the next desired element.
The systems and methods of this invention actively mediate between the temporal recording and notes and records made using other media or systems. The systems and methods of this invention link images used in the supported processes with the recordings of those activities.
These and other features and advantages of this invention are described in or are apparent from the following detailed description of the preferred embodiments.
The preferred embodiments of this invention will be described in detail, with reference to the following figures, wherein:
Specifically, the index 105 in the object description file 100 can be a predefined document 110 which is loaded into the system prior to commencement of the activity to be recorded. Furthermore, the index 105 in the object description file 100 can be generated from user input 120. In this instance, as an activity proceeds, the moderator of that activity can index, i.e., timestamp, specific portions of the recorded/recording activity and label them with a corresponding name. Additionally, the index 105 in the object description file 100 can be extracted from an already existing document 130. In this instance, the already existing document 130 can be further annotated. Furthermore, the object description file 100 can be derived from a scanned document 140 which is input to the system.
All of these methods of input provide an object description file 100 containing an index 105, such as, for example, the agenda 115 shown in
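Since no particular storage format is prescribed here for the object description file 100, the following sketch uses JSON purely to illustrate how an index 105 built from a predefined agenda or from on-the-fly user input might be serialized. All field names and the JSON representation itself are assumptions for the sketch.

```python
# An illustrative (assumed) serialization of an object description file 100
# holding an index 105: entries label portions of the recording with
# timestamps, whether the index came from a predefined document, user input,
# an existing document, or a scanned document.
import json
import time

object_description = {
    "source": "predefined",   # or "user_input", "existing_document", "scanned"
    "index": [
        {"label": "1. Opening remarks", "timestamp": 0.0},
        {"label": "2. Budget review",   "timestamp": 245.0},
    ],
}


def add_index_entry(descr, label, recording_start):
    """Timestamp a new portion of the recording as the moderator labels it."""
    descr["index"].append(
        {"label": label, "timestamp": time.time() - recording_start}
    )


if __name__ == "__main__":
    start = time.time()
    add_index_entry(object_description, "3. New business", start)
    print(json.dumps(object_description, indent=2))
```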
The recording subsystem 210 records audio and/or video data of the activity. The editing subsystem 220 allows users to return after recording the activity to further augment the recording.
The playback subsystem 230 allows activity participants or others to view a previously-recorded activity at a later time, which may occur during the recording of the activity itself. Thus, the playback subsystem 230, in conjunction with the recording subsystem 210, allows simultaneous recording of the activity and playing back of the recording, so that activity participants may review a previously-recorded and indexed portion of the activity while that review is simultaneously being recorded by the recording subsystem 210 and indexed to the current index heading.
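A schematic sketch of this interplay, with all class and method names assumed rather than taken from this invention, might look as follows: replaying an earlier indexed segment is itself re-recorded and indexed under whatever heading is current at replay time.

```python
# A schematic sketch (names assumed): a replay of earlier indexed material is
# itself recorded and indexed under the heading current at replay time.
class ActivityRecorder:
    def __init__(self):
        self.timeline = []          # (heading, event) pairs in recording order
        self.current_heading = None

    def set_heading(self, heading):
        self.current_heading = heading

    def record(self, event):
        self.timeline.append((self.current_heading, event))

    def replay(self, heading):
        """Play back everything indexed under an earlier heading while the
        replay itself is re-recorded under the current heading."""
        earlier = [e for h, e in self.timeline if h == heading]
        for event in earlier:
            self.record(f"replayed: {event}")
        return earlier


if __name__ == "__main__":
    rec = ActivityRecorder()
    rec.set_heading("item 1")
    rec.record("discussion of item 1")
    rec.set_heading("item 4")
    rec.replay("item 1")
    print(rec.timeline)
```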
The input/output interface 250 enables communication between the activity indexing system 200 and the various types of recording and playback devices. The audio/video storage 260 is a memory device that is capable of storing audio and/or video data.
The functionality of the input/output devices in
As described in the previous examples, a user 500 could, for example, be a service technician or a meeting moderator who wishes to record the proceedings of a particular activity. Through the input/output devices, i.e., a selector 330, a microphone 255 and a video recorder 259, the user's activities are recorded in the audio/video storage 260. It should be appreciated that this set of input devices is exemplary only and that any known or later developed input device could be used instead of or in addition to those shown in
The selector 330, in response to actions of the user 500, appropriately selects an item from the agenda 115 displayed on the display 300 to which the recorded events will be indexed. Therefore, as the activity progresses, the selected index items are associated with particular portions of the recording of the activity being recorded. In particular, the recording of the activity is indexed to the index 105 and stored in the object description file 100 in conjunction with a system clock 320. As described earlier, this indexing can occur progressively, i.e., as the user steps through an agenda shown, for example, on the display 300, or “on the fly.” In the case of “on the fly” indexing, as new “headings” are encountered or generated during the activity, index markings associated with those headings are registered in the object description file 100.
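The two indexing modes might be sketched as follows, with purely illustrative names: stepping through a preloaded agenda (progressive indexing) versus registering headings as they arise (“on the fly”), with the system clock supplying the timestamps.

```python
# A sketch (names assumed) of the two indexing modes described above:
# progressive indexing steps through a preloaded agenda, while on-the-fly
# indexing registers headings as they arise; the system clock timestamps both.
import time


class Indexer:
    def __init__(self, agenda=None):
        self.agenda = list(agenda or [])   # preloaded index, if any
        self.markings = []                 # (label, clock time) pairs
        self._position = 0

    def step(self):
        """Progressive indexing: mark the next preloaded agenda item."""
        label = self.agenda[self._position]
        self._position += 1
        self.markings.append((label, time.time()))
        return label

    def mark_new(self, label):
        """On-the-fly indexing: register a heading encountered mid-activity."""
        self.markings.append((label, time.time()))


if __name__ == "__main__":
    idx = Indexer(agenda=["1. Status", "2. Budget"])
    idx.step()                        # moderator checks off the first item
    idx.mark_new("Sidebar: hiring")   # unplanned topic indexed on the fly
    print(idx.markings)
```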
However, it should be appreciated that the activity indexing system 200 is not limited to the particular embodiment shown in
In step S600, a determination is made whether another index item corresponding to the now-current location of the recording is desired. If an additional index item is to be added, control jumps back to step S400. However, if additional index items are not required, control continues to step S700. In step S700, recording ends. Control then continues to step S800, where the control sequence ends.
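Loosely translated into a loop, purely to make this control sequence concrete, the flow might look like the sketch below; the step numbers appear only in comments and the callback names are assumptions for the sketch.

```python
# An illustrative rendering of the control flow: add index items until no
# further item is desired, then end recording. Callback names are assumed.
def indexing_loop(get_next_item, add_index_item, stop_recording):
    """Add index items until none are desired, then end recording."""
    while True:
        item = get_next_item()      # S600: is another index item desired?
        if item is None:
            break
        add_index_item(item)        # back to S400: register the new item
    stop_recording()                # S700/S800: recording and control end


if __name__ == "__main__":
    items = iter(["1. Status", "2. Budget", None])
    indexing_loop(lambda: next(items),
                  lambda i: print("indexed", i),
                  lambda: print("recording stopped"))
```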
In step S1500, recording of the activity relative to this agenda item ends. Then, in step S1600, a determination is made whether another index item has been selected or added. If further indexing and recording is desired, control jumps back to step S1200. Otherwise, control continues to step S1700. In step S1700, the control sequence ends.
With reference to
At time t7, a stop command is encountered, pausing the recording of the activity. Next, at time t8, the recording of the activity begins again. At time t9, another previously-recorded agenda item is replayed as part of the fourth agenda item. Then, at time t10, the replaying of that previously-recorded agenda item stops. At time t11, the recording is indexed to the fifth agenda item, as the activity being recorded moves from the fourth agenda item to the fifth agenda item.
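This timeline can be summarized, for illustration only, as an ordered event log; the labels below simply restate the description above and do not come from any actual recording.

```python
# An ordered event log restating the timeline described above (illustrative).
timeline = [
    ("t7",  "stop command: recording paused"),
    ("t8",  "recording resumes"),
    ("t9",  "replay of earlier agenda item begins (within item 4)"),
    ("t10", "replay of earlier agenda item ends"),
    ("t11", "recording indexed to agenda item 5"),
]

for t, event in timeline:
    print(t, event)
```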
As shown in
It is, therefore, apparent that there has been provided, in accordance with the present invention, a method and apparatus for using structured representations to index recordings of activity. While this invention has been described in conjunction with preferred embodiments thereof, it is evident that many alternatives, modifications, and variations will be apparent to those skilled in the art. Accordingly, applicants intend to embrace all such alternatives, modifications, and variations as fall within the spirit and scope of this invention.