1. Field of the Invention
The present invention relates to a data management apparatus, a data management method, and a computer-readable recording medium thereof.
2. Description of the Related Art
By recording audio and video data pertaining to, for example, a conference or a lecture and enabling the data to be viewed and listened to afterward along with conference minutes, handouts, etc., it is possible for contents of the conference or the lecture to be reviewed or conveyed to absentees. Further, there is a system that enables contents of a presentation, a conference, a lecture or the like to be browsed together with material used in the presentation, the conference, the lecture or the like by using application software. The application software records contents of the presentation, the conference, the lecture or the like (video footage), stores the material used in the presentation, the conference, the lecture or the like, and generates data enabling the recorded contents to be viewed and listened to in synchronization with the material used in the presentation, the conference, the lecture or the like.
Further, Japanese Laid-Open Patent Publication No. 2005-210408 discloses an example of a method for associating visual data with printed material, in which printing contents (i.e. contents to be printed) are delivered in association with visual data. In this example, a screen extracted from video contents is stored in association with the printing contents, and the extracted screen and the printing contents can be displayed simultaneously in a case of printing out the printing contents. Thereby, the user can easily confirm the printing contents.
In general, the above-described system is configured to mainly display visual and audio data (hereinafter also simply referred to as “contents”) and additionally display material corresponding to the contents. Although the user can perform operations such as fast-forward or skipping with the system, it is, as a rule, necessary for the user to reproduce the entire contents to understand what the contents contain. Therefore, in a case where the user desires to view and listen to a portion of the contents corresponding to particular material, the user needs to manually find that portion by reproducing the contents. Finding the desired portion of the contents is difficult for the user.
Japanese Laid-Open Patent Publication No. 2005-210408 discloses a technology that facilitates usability for the user by storing visual contents in association with printing contents and making the visual contents available in a case where the visual contents are delivered in association with the printing contents. However, Japanese Laid-Open Patent Publication No. 2005-210408 is not aimed to facilitate reproduction of contents based on corresponding material.
The present invention may provide a data management apparatus, a data management method, and a computer-readable recording medium that substantially eliminate one or more of the problems caused by the limitations and disadvantages of the related art.
Features and advantages of the present invention are set forth in the description which follows, and in part will become apparent from the description and the accompanying drawings, or may be learned by practice of the invention according to the teachings provided in the description. Objects as well as other features and advantages of the present invention will be realized and attained by a data management apparatus, a data management method, and a computer-readable recording medium particularly pointed out in the specification in such full, clear, concise, and exact terms as to enable a person having ordinary skill in the art to practice the invention.
To achieve these and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, an embodiment of the present invention provides a computer-readable recording medium on which a program is recorded for causing a computer to execute a data management method, the data management method including the steps of: obtaining document identification data used for identifying a target document; obtaining page identification data used for identifying a page of the target document; obtaining document use data indicating a display time and a display location in which each page of the target document was displayed; obtaining recording data indicating a recording time and a recording location in which AV (Audio Visual) data was recorded in a case where the recording location is within a predetermined range from the display location; identifying a portion of the AV data corresponding to the display time of the page of the target document; and outputting access data that provides access to the portion of the AV data.
Other objects, features and advantages of the present invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings.
The network A is connected to a network B via a public line 8 (e.g., the Internet, public switched network). The network B is connected to a user terminal (data processing terminal) 7 operated by a user different from the user operating the user terminal 2. The user terminal 2 is connected to a web camera 6 that photographs dynamic images and inputs the images to the user terminal 2. With this configuration, the user of the user terminal 2 and other users in the vicinity of the user of the user terminal 2 can share visual data, audio data, and document material with the user of the user terminal 7 and hold a network conference with the user of the user terminal 7.
In this embodiment, the image forming apparatus 1 is a multifunction machine including functions such as a photographing function, an image forming function, and a communicating function. Thereby, the image forming apparatus 1 can be used as a printer, a facsimile machine, a scanner, and a copier. One or more applications used for holding the network conference are installed in the user terminals 2, 7. Thereby, the user terminals 2, 7 can provide a network conference function. The application server 3 is a server in which a document management application is installed.
The database 4 stores, for example, contents data (i.e. audio/visual data), data pertaining to the time at which the contents have been recorded, data pertaining to the location of the recorded contents, data pertaining to document material, and data pertaining to the actual time at which the document data has been browsed or displayed. The projector 5 obtains data pertaining to the GUI (Graphic User Interface) of the network conference of the user terminal 2 via the network A and projects the obtained data onto, for example, a screen or a whiteboard. Although not illustrated in
In addition to including the function for achieving the network conference function, the application installed in the user terminals 2, 7 also includes a function for generating data to be stored in the database 4. The document management application, which is installed in the application server 3, includes a function for reproducing a portion of contents in correspondence with a browse location of document material based on the data stored in the database 4. The function(s) of the application installed in the user terminal 2, 7, and the application server 3 are described in detail below.
Next, a hardware configuration of the image forming apparatus 1, the user terminals 2, 7, the application server 3, and the database 4 is described with reference to
As illustrated in
The CPU 10 is an arithmetic part that controls the entire operations of the user terminal 2. The RAM 20 is a volatile recording medium that can read and write data at high speed. The RAM 20 serves as a working area enabling the CPU 10 to process data. The ROM 30 is a non-volatile read-only recording medium. The ROM 30 stores programs such as firmware. The HDD 40 is also a non-volatile recording medium that can read and write data. The HDD 40 stores, for example, an OS (Operating System), various control programs, and application programs.
The I/F 50 connects the bus 80 to various hardware and networks and controls the connection between the bus and the various hardware and networks. The LCD 60 is a visual user interface for enabling the user of the user terminal 2 to confirm the status of the user terminal 2. The operation part 70 is a user interface such as a keyboard or a mouse for enabling the user to input data to the user terminal 2. In a case where the application server 3 is used as a server, user interfaces such as the LCD 60 and the operation part 70 may be omitted from the configuration of the application server 3 as illustrated in
With the above-described hardware configuration, a program (software control part) recorded to the ROM 30, the HDD 40, or a computer-readable recording medium (e.g., optical disk) 90 is loaded into the RAM 20 and executed in accordance with the controls of the CPU 10. Accordingly, with the combination of hardware and software, the functions of the user terminals 2, 7, the image forming apparatus 1, the application server 3, and the database 4 can be executed.
Next, the functions (functional parts) of the user terminal 2 according to an embodiment of the present invention are described.
The network I/F 210 is an interface for establishing communications between the user terminal 2 and other devices via a network (e.g., the networks A and B). The external I/F 220 is an interface for connecting the user terminal 2 to an external device (e.g., web camera). The external I/F 220 may be, for example, an Ethernet (registered trademark) or a USB (Universal Serial Bus) interface. The I/F 50 of
The controller 200 is a combination of hardware (e.g., integrated circuit) and software (software control part) and serves as a control part that controls the entire user terminal 2. More specifically, the functions of the controller 200 are performed by loading a program recorded to the ROM 30, the HDD 40, or a computer-readable recording medium (e.g., optical disk) 90 to a volatile memory (e.g., RAM 20) and performing calculations with the CPU 10 in accordance with the program.
The network control part 201 obtains data input from the network I/F 210 and transmits data to other devices via the network I/F 210. The I/F control part 202 controls external devices connected to the external I/F 220 and obtains data input from the external devices via the external I/F 220.
The functions of the network conference application 203 are performed by loading an application program recorded to the ROM 30, the HDD 40, or a computer-readable recording medium (e.g., optical disk) 90 to a volatile memory (e.g., RAM 20) and performing calculations with the CPU 10 in accordance with the application program. The application program is a program for realizing a network conference with other data processing terminals via a network (e.g., the network 8). One function of the network conference application 203 is a network conference function that establishes a session between the user terminal 2 and another data processing terminal having the network conference application 203 installed therein and connected to the user terminal 2 via a network (e.g., the network 8) and enables data such as presentation data and audio/visual data to be displayed to both the user terminal 2 and the other data processing terminal.
The network conference application 203 also includes, for example, an audio visual recording function, a recording data generating function, a document recording function, and a document use data recording function. The audio visual recording function is a function that records audio and visual data that have been recorded at a network conference. The recording data generating function is a function that generates data pertaining to the recording of audio and visual data in a case where audio data or visual data is recorded. The document recording function is a function that records document material displayed at a network conference. The document use data recording function is a function that records the manner in which document material has been displayed at a network conference.
The network conference application 203 generates an AV (Audio Visual) file based on audio or visual data input to the web camera 6 via the external I/F 220 at a network conference by using the audio visual recording function. The network control part 201 stores the generated AV file in the database 4 via, for example, the network A.
The network conference application 203 retains data of document material displayed at a network conference by using the document recording function. The network control part 201 stores the retained document material in the database 4 via, for example, the network A. The recording data generating function and the document use data recording function are described in detail below.
The display control part 204 instructs the LCD 60 to display the status (e.g., GUI (Graphic User Interface) of the network conference application 203) of the user terminal 2. The operation control part 205 obtains signals corresponding to the user's operations performed on the operation part 70 and inputs the signals to corresponding software (e.g., network conference application) of the user terminal 2.
Next, the recording data generating function and the document use data recording function of the network conference application 203 according to an embodiment of the present invention are described.
The “time/date data” includes data for specifying a document such as “document file name”, “URL (Uniform Resource Locator)”, “page number”, “display start time”, and “display period”. The “document file name” and “URL” are data that indicate a storage area in the database 4 in which document material is stored. That is, the “document file name” and “URL” are data indicating a file path of the database 4. The “page number”, “display start time”, and “display period” are timeline data for indicating a timeline in which a document file has been used. For example, the page of a displayed document file and the actual time and length of displaying the document file can be determined based on the data of “page number”, “display start time”, and “display period”.
Further, data of “extracted character string” is assigned to each “page No.” in the “time/date data”. The “extracted character string” is data indicating a character string included in the corresponding page. The “extracted character string” enables character data included in each page of a document file to be recognized. Thus, document material can be searched based on character data by referring to data of “extracted character string”. In a case where plural document files are displayed in a single network conference, the timeline data and the data of “extracted character string” are generated in correspondence with each document file in the time/date data.
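A minimal sketch of the document use data described above can be expressed as follows. The field names and types are illustrative assumptions drawn from the description (“document file name”, “URL”, “page number”, “display start time”, “display period”, “extracted character string”), not the actual stored schema:

```python
from dataclasses import dataclass, field

@dataclass
class PageUse:
    """One page's timeline entry within the document use data."""
    page_number: int
    display_start_time: float   # actual time at which the page was shown
    display_period: float       # length of time the page stayed displayed
    extracted_text: str = ""    # "extracted character string" used for search

@dataclass
class DocumentUseData:
    """Per-document record stored in the database 4 (hypothetical layout)."""
    file_name: str              # "document file name"
    url: str                    # file path within the database 4
    location: tuple             # location data of the terminal, e.g. (lat, lon)
    pages: list = field(default_factory=list)
```

With this layout, searching document material by character data amounts to scanning the `extracted_text` of each `PageUse` entry.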
On the other hand, “location data” includes data indicating the location in which a network conference has been held (i.e. location of a data processing terminal including the network conference application 203 that executed the network conference function). As illustrated in
The “location data” of
The “time/date data” includes data for specifying an AV file such as “AV file name”, “URL (Uniform Resource Locator)”, “recording start time”, and “recording period”. The “AV file name” and “URL” are data that indicate a storage area in the database 4 in which an AV file is stored. That is, the “AV file name” and “URL” are data indicating a file path of the database 4. The “recording start time” and the “recording period” are timeline data for indicating a timeline of the recording data. The “location data” is the same as the location data included in the document use data. As described above, the location data may not only include the location of one terminal of a network conference but may also include the location of a terminal of a counterpart of the network conference (e.g., location of the user terminal 2 and location of the user terminal 7). Accordingly, the recording data stored in the database 4 is used as contents recording data indicating the time and the location in which contents (audio/visual contents) were recorded. Accordingly, the database 4 functions as a contents recording data storage part.
As described above, the document use data and the recording data illustrated in
Next, an example of a configuration of the application server 3 is described with reference to
The document management application 302 includes a document browsing function. The document browsing function is a function that instructs the user terminal 2 or the user terminal 7 (via a network (e.g., networks A, B)) to display data of document material that is stored in the database 4 after the network conference by the user terminal 2 or the user terminal 7 is finished. The document management application 302 also includes a document use data searching function and an AV data searching function. The document use data searching function is a function that searches for the above-described document use data. The AV data searching function is a function that searches for an AV file based on the above-described recording data. By using the document use data searching function and the AV data searching function, the document management application 302 can provide the below-described function of reproducing audio/visual data based on a browse location of a document material to be browsed.
Next, an exemplary operation of the document management application 302 is described with reference to
The document identification data is, for example, data indicating a storage area in which document material is stored in the database 4. In other words, the document identification data is, for example, data indicating a file path such as a URL. Accordingly, after the document management application 302 obtains the designated document material from the database 4 based on the document identification data and transmits the data of the obtained document material to the user terminal 2, the browsing of the document material is started via the browser (Step S701). In this embodiment, the document material is obtained from the database 4 based on the document identification data obtained by the document management application 302. Alternatively, the document management application 302 may obtain data that is unique to the desired document material (unique document material data) for identifying the desired document material. In this alternative case, the user transmits the unique document material data to the application server 3 via a network by operating the user terminal 2.
When the browsing is started, the document management application 302 obtains document use data stored in the database 4 as illustrated in
When the document use data is obtained, the document management application 302 searches for and obtains recording data stored in the database 4 as illustrated in
Because the network conference system 100 of this embodiment is for enabling an AV file to be searched and viewed/listened to based on document material, the timeline illustrated in
Further, the timeline illustrated in
By separately recording and storing audio data, visual data, and data of document materials in association with the time in which the audio data, the visual data, and the data of document materials were recorded or displayed, all of the audio data, the visual data, and the data of document materials can be made to correspond to the same time axis as illustrated in
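The correspondence on a common time axis described above can be sketched as follows. This is a minimal sketch assuming that the display start time, display period, recording start time, and recording period are all expressed in seconds on the same actual-time axis; the function name is illustrative:

```python
def overlap_on_timeline(display_start, display_period,
                        recording_start, recording_period):
    """Return the (start, end) interval, in absolute time, during which a
    page was displayed while the AV recording was running, or None if the
    page's display interval does not overlap the recording interval."""
    start = max(display_start, recording_start)
    end = min(display_start + display_period,
              recording_start + recording_period)
    return (start, end) if start < end else None
```

For example, a page displayed from t=100 for 60 seconds against a recording running from t=0 for 300 seconds overlaps on (100, 160); a page whose display interval falls entirely outside the recording yields None.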
In Step S705, the document management application 302, first, functions as a reproduction location identifying part that identifies a reproduction location of an AV file corresponding to the page to be browsed based on page identification data (i.e. data that identifies the page of document material to be browsed) and data of the timeline illustrated in
The data to be accessed by the user may be, for example, data indicating a storage area in the database 4 in which a corresponding AV file is stored (i.e. file path) and a URL indicating the reproduction location of the corresponding AV file. That is, the document management application 302 generates and outputs data of a screen including, for example, a button for requesting access to the URL indicating the reproduction location of the corresponding AV file.
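The identification of the reproduction location and the generation of the access data can be sketched as follows. The offset computation follows the description above; the “#t=” media-fragment form of the URL is an assumption for illustration, not a format the embodiment specifies:

```python
def reproduction_offset(display_start, recording_start):
    """Offset, in seconds, into the AV file at which the page's display
    began; clamped to zero if the page was shown before recording started."""
    return max(0.0, display_start - recording_start)

def access_url(av_url, offset_seconds):
    """Build access data (a URL) pointing at the reproduction location,
    using a temporal media fragment as one possible encoding."""
    return f"{av_url}#t={int(offset_seconds)}"
```

A button in the generated GUI screen would then request this URL to start reproduction of the AV file from the identified location.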
At the time when the browsing of document material is started, the first page is always displayed. Therefore, the document management application 302 generates and outputs data of a GUI for displaying a button to be used in reproducing the recorded location of audio/video data corresponding to the first page. That is, before Step S701, the document management application 302 functions as a page identification data obtaining part that obtains page identification data used for identifying a page to be displayed (i.e. data identifying the first page).
After data of the GUI is output in Step S705, the browser using the document browsing function of the document management application 302 displays a page of document material designated to be browsed along with a button for reproducing a corresponding recorded portion of audio/video data as illustrated in
The document browsing GUI illustrated in
For example, in a case of displaying “page 4” of “material 1” with the document browsing GUI, “page 4” of “material 1” will be displayed in the “browsing page display space” of
When an instruction to reproduce an AV file is input to the document management application 302 via a network by clicking a reproduction button in the screen illustrated in
In addition to the process of streaming, the document management application 302 may add data designating the reproduction location (reproduction location designation data) for starting reproduction to the obtained AV data and transmit the AV data together with the reproduction location designation data to the browser (i.e. user terminal 2) in Step S708. Accordingly, the browser can start reproduction of the AV data from the reproduction location designated by the reproduction location designation data.
Then, in a case where the user operating the browser changes the page of the document material being browsed (Yes in Step S709), the document management application 302 obtains page identification data (i.e. data that identifies the page of document material to be browsed) via the network and repeats the processes performed in Steps S705-S708. In a case where the page of the document material is not changed and browsing of the document material is finished (Yes in Step S710), the document management application 302 terminates the operation illustrated in
Hence, in a case of generating an AV file containing, for example, audio data and visual data recorded in a network conference or the like by using the document management system according to the above-described embodiment of the present invention, recording data (including, for example, data pertaining to the time and date of the recording and data pertaining to the location of the recording as illustrated in
The document management application 302 associates document material data, audio data, and visual data that are stored separately based on “time/date data” and “location data” included in the recording data and the document use data and determines whether the document material data, the audio data, and the visual data were generated in the same network conference or the like. In other words, in a case where the document management application 302 determines that document material data, audio data, visual data indicate the same location data or a location within a predetermined range according to “document use data” and “recording data”, the document management application 302 determines that the audio data and visual data, which were recorded during the period when the document material was displayed, contain explanations or discussion pertaining to the document material. Accordingly, the document management application 302 generates a link to the audio data and the visual data in correspondence with each page of the document material.
Accordingly, in a case where a user browsing document material having plural pages desires to further understand a particular page of the document material and seeks visual data and/or audio data that explains the particular page, the user can immediately start reproduction of the audio data and/or video data corresponding to the particular page. Thereby, the user can easily reproduce contents corresponding to a particular portion (e.g., a page) of the document material.
In the process of obtaining recording data in Step S703 of
As described above, the “location data” of the recording data and the document use data may not only include the location data of one of the terminals of the network conference but also the location data of a terminal of a counterpart(s) of the network conference (e.g., user terminals 2, 7). Therefore, in this case, the document management application 302 can obtain recording data of both terminals of the network conference in Step S703 of
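The “predetermined range” test on location data can be sketched as follows. It assumes the location data are (latitude, longitude) pairs and uses the great-circle (haversine) distance, which is one plausible choice that the embodiment does not mandate:

```python
import math

def within_range(loc_a, loc_b, range_m=100.0):
    """Return True if two (latitude, longitude) locations lie within
    range_m meters of each other on the Earth's surface (haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*loc_a, *loc_b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    distance_m = 6_371_000 * 2 * math.asin(math.sqrt(a))  # mean Earth radius
    return distance_m <= range_m
```

Recording data whose location passes this test against the browse location of the document material would be treated as belonging to the same network conference.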
Next, a function of printing (outputting) a page of a document material via the document management application 302 is described in a state where the document material is being browsed.
The processes performed in Steps S1001-S1005 of
Then, the document management application 302 generates data of a link that enables the identified reproduction location (e.g., reproduction location identified in a URL format) of the AV file to be reproduced. Then, the document management application 302 converts the data of the link into an encoded data format that can be visually read out (Step S1008). In this example, the data of the link is converted into a QR code (registered trademark). Then, the document management application 302 assigns the QR code (registered trademark) to a blank space of the target page of the document material and outputs image data of the target page including the assigned QR code (registered trademark) to a terminal (e.g., user terminal 2) having a browser operated by the user (Step S1009). Thereby, the user terminal 2 can generate a printing job based on the image data output from the document management application 302 and print the target page of the document material.
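The link generation that precedes the QR code (registered trademark) conversion can be sketched as follows. The URL layout and query parameter names are illustrative assumptions; the resulting string would be handed to any off-the-shelf QR encoder and composited into the blank space of the target page image:

```python
from urllib.parse import urlencode

def build_reproduction_link(base_url, av_file, offset_seconds):
    """Compose the link data of Steps S1007-S1008: a URL that designates
    the AV file and the reproduction location within it."""
    query = urlencode({"file": av_file, "t": int(offset_seconds)})
    return f"{base_url}?{query}"
```

For example, `build_reproduction_link("http://server.example/play", "conf1.avi", 130)` yields a URL requesting reproduction of `conf1.avi` from 130 seconds in.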
Accordingly, by photographing the QR code (registered trademark) printed on the printed target page with a camera of a mobile terminal (e.g., mobile phone) having a dedicated application or a web camera connected to a data processing terminal (e.g., PC), the user can access the database 4 with the mobile terminal or the data processing terminal and listen to the audio data or view the visual data corresponding to the printed target page.
Hence, with the network conference system 100 according to the above-described embodiment, because document material can be associated with recorded audio/visual data (contents) with respect to actual time and location, the user can easily reproduce the audio/visual data (contents) corresponding to a portion of a printed document material.
In the operations described above with reference to
Similar to Step S704 of
In Step S1204, the document management application 302 performs the processes of identifying the reproduction location of the AV file corresponding to a target page, generates access data corresponding to the identified reproduction location, and outputs the generated access data in a manner similar to the processes performed in Step S705 of
After link data of all of the pages of the document material are generated, the document management application 302 adds the link data in correspondence with “page number” of the document use data illustrated in
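The step of adding the link data in correspondence with each “page number” of the document use data can be sketched as follows; the dictionary layout is an illustrative assumption for how the pre-generated links might be attached so that later browsing can skip the search steps:

```python
def attach_link_data(document_use_data, links_by_page):
    """Attach pre-generated access links to the document use data.
    links_by_page maps a page number to its access URL; pages without a
    generated link receive None."""
    for page in document_use_data["pages"]:
        page["link"] = links_by_page.get(page["page_number"])
    return document_use_data
```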
Accordingly, in a case where the user of, for example, the user terminal 2 accesses the document management application 302 with the browser of the user terminal 2 and browses a document material stored in the database 4, the document management application 302 can proceed to the process of Step S705 after obtaining the document use data in Step S702. Thereby, the processes performed in Steps S703 and S704 of
In the operation illustrated in
The operation of
Although the processes of the above-described embodiments are performed in a case where the network conference application 203 is installed in the user terminal 2 and the user terminal 7, the same advantages can be attained even in a case where an application (e.g., document management application 302) is installed in a server and operated via a browser.
According to the above-described embodiments of the present invention, in a case where document material and a page of the document material are designated, an AV file corresponding to the designated document material and the page of the document material can be identified by using time/date data (i.e. data indicating the time/date at which the audio data and visual data were stored) and location data (i.e. data indicating the location in which the document material was browsed) as a key. Thereby, the AV file corresponding to the designated document material and the page of the document material can be viewed and listened to by the user.
Therefore, the configurations of the document use data and the recording data are not limited to those illustrated in
In the above-described embodiments, an AV file recorded in a network conference is searched for with respect to each page of document material displayed in the network conference. The network conference is merely an example. The above-described embodiments may be applied to other systems that associate AV data and document material and use the associated AV data and document material. For example, the above-described embodiments may be applied to a system used for, for example, an audio chat, a video chat, or an online lecture.
The above-described embodiments may also be applied to an ordinary lecture that is not systemized as long as the time in which a page of document material (e.g., a handout for students of the lecture) is displayed in association with the actual time of the lecture and the AV file (e.g., audio/visual data of a lecturer or a student) is recorded in association with the actual time of the lecture. Thus, the same advantages can also be attained for the ordinary lecture by applying the above-described embodiments. In other words, the document management function of the application server 3 can be achieved as long as data such as document use data and recording data are stored in the database 4 regardless of whether data are recorded in the database 4 by the network conference functions of the user terminals 2, 7. In this case, the document use data of
Although the document management application 302 installed in the application server 3 is used to perform document management according to the above-described embodiments, document management may also be performed with a device other than the application server 3 (e.g., image forming apparatus 1, projector 5) as long as the device is connected to a network (e.g., networks A, B).
According to the above-described embodiments, the network conference application 203 installed in the user terminal 2 is used to record document use data and recording data in the database 4. Alternatively, the projector 5 may record document use data and recording data in the database 4. In this alternative case, the projector 5 may be provided with a unique function for generating document use data and recording data based on data input to be projected by the projector 5. Alternatively, the network conference application may be installed in the projector 5 for recording document use data and recording data in the database 4.
The present invention is not limited to the specifically disclosed embodiments, and variations and modifications may be made without departing from the scope of the present invention.
The present application is based on Japanese Priority Application No. 2010-243572 filed on Oct. 29, 2010, the entire contents of which are hereby incorporated herein by reference.