This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2007-269076 filed Oct. 16, 2007.
1. Technical Field
The present invention relates to an information processing apparatus, a storing method, and a computer readable recording medium, and in particular to an information processing apparatus that is connected to an image capture device capturing an object and to plural external terminals inputting annotation information to an image captured by the image capture device, a storing method for storing an image, and a computer readable recording medium storing a program that causes a computer to execute a process.
2. Related Art
Remote indication systems are known, each including a server (e.g., a computer) connected to a video camera and a projector, and a remote client (e.g., a computer) connected to the server via a network.
According to an aspect of the invention, there is provided an information processing apparatus that is connected to an image capture device and to a plurality of external terminals that input annotation information to an image captured by the image capture device, the information processing apparatus including: an acquiring portion that acquires input information from the plurality of external terminals; a storing portion that stores a synthesis image in which the captured image and the annotation information are synthesized; and a controlling portion that causes the storing portion to store the synthesis image when the acquisition of the input information by the acquiring portion is not executed for a predetermined time period.
A description will now be given, with reference to the accompanying drawings, of an exemplary embodiment of the present invention.
A video camera 5 (an image capture device) is connected to the PC 1. The video camera 5 captures a reflected image of a screen 10 including an object 8, and outputs a captured image to the PC 1.
The PC 1 outputs the image captured by the video camera 5 to the PC 2 and the PC 2′ via the network 3. The PC 2 is connected to a display 21 (a display portion) and an input unit 14, which is composed of a mouse or the like. The display 21 displays the captured image (specifically, the captured image of the screen 10 including the object 8) on a display area 12 in a window 205.
A group of buttons such as a pen button, a text button, and an erase button, and icons defined by lines and colors are displayed on the window 205 (or 205′).
For example, the user of the PC 2 (or 2′) selects the pen button with a mouse pointer that moves in the window 205 (or 205′) in response to movement of the input unit 14 (or 14′) (e.g., the mouse), and then draws a figure or the like on the object 8 in the display area 12 by moving the mouse pointer. The information about the figure (specifically, the coordinates (x, y) representing the figure in the display area 12) is then output from the PC 2 to the annotation information distribution unit 11 connected to the network 3. Here, the figure drawn on the object includes an image of any type, such as a line, a character, a symbol, a figure, a color, or a font.
The annotation information distribution unit 11 outputs the information about the figure to the PC 1, the PC 2, and the PC 2′. The PC 1 stores the information about the figure input from the annotation information distribution unit 11. Further, the PC 1 synthesizes the information about the figure and the image captured by the video camera 5 at predetermined timing, and stores the synthesized image in an image storing unit 13. The PC 2 (or 2′) synthesizes the information about the figure input from the annotation information distribution unit 11 with the image that is captured by the video camera 5 and transmitted from the PC 1, and displays the synthesized image on the display area 12 in the window 205 (or 205′).
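To make this distribution flow concrete, the following is a minimal sketch in Python. The class names, field names, and callback interface are hypothetical illustrations, not part of the embodiment; the description above specifies only that a figure is represented by coordinates (x, y) in the display area 12 and that the same information is forwarded to the PC 1, the PC 2, and the PC 2′.

    from dataclasses import dataclass

    @dataclass
    class AnnotationInfo:
        """Hypothetical representation of the information about a figure.
        Only the (x, y) coordinates are specified by the description above;
        the other fields are illustrative."""
        source: str   # which terminal drew the figure, e.g. "PC2"
        kind: str     # line, character, symbol, figure, ...
        points: list  # [(x, y), ...] in the display area 12

    class AnnotationDistributionUnit:
        """Models the annotation information distribution unit 11: every
        piece of annotation information is forwarded to all connected PCs."""
        def __init__(self):
            self.subscribers = []  # callbacks registered by PC 1, PC 2, PC 2'

        def publish(self, info: AnnotationInfo):
            for deliver in self.subscribers:
                deliver(info)

    # Usage: the PC 1 stores the information; the PC 2 (or 2') redraws it.
    unit = AnnotationDistributionUnit()
    unit.subscribers.append(lambda info: print("PC 1 stores", info))
    unit.subscribers.append(lambda info: print("PC 2 redraws", info))
    unit.publish(AnnotationInfo("PC2", "line", [(120, 80), (180, 140)]))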
The PC 2 (or 2′) outputs control commands to the PC 1 so as to control operations of the video camera 5 (e.g., the capture angle, the brightness, and the like of images captured by the video camera 5).
A block diagram shows the functional structures of the PC 1 and the PC 2 (or 2′).
The annotation information storing unit 101 of the PC 1 receives the annotation information transmitted from the annotation information distribution unit 11 via the network 3, and stores the received annotation information.
The PC 2 (or 2′) includes an annotation information receiving unit 107, an image receiving unit 108, a display image generating unit 109, and an annotation information obtaining unit 110.
The annotation information receiving unit 107 receives the annotation information transmitted from the annotation information distribution unit 11 via the network 3.
A block diagram shows the structure of the storing timing determination unit 104, which includes an annotation information acceptance unit 131, an annotation information selection unit 132, a FILO 133, a timer 134, and a timing determination unit 135.
Here, the operation of each unit composing the storing timing determination unit 104 will be described with reference to a flowchart.
In step S10, the annotation information acceptance unit 131 determines whether the annotation information has been received.
Here, the annotation information includes "annotation beginning information" which is information showing the beginning of the annotation, "annotation end information" which is information showing the end of the annotation, and "other information" which is information showing the continuance of the annotation. For example, when the input unit 14 (or 14′) of the PC 2 (or 2′) is a mouse, the annotation beginning information represents "the depression of a left button of the mouse (in a state where a mouse pointer is included in the display area)", and the annotation end information represents "the release of the left button of the mouse". Further, when the input unit 14 (or 14′) is a tablet or a touch panel, the annotation beginning information represents "pushing down the pen", and the annotation end information represents "lifting the pen". When a device recognizing a position of the pen from an image is used, a state where the pen appears in the image may be assumed to be the annotation beginning information, and a state where the pen disappears from the image may be assumed to be the annotation end information.
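The mapping from raw input events to these three kinds of annotation information can be summarized as follows. This is a minimal sketch with hypothetical event names chosen for illustration; the embodiment itself does not prescribe any particular event naming.

    # The three kinds of annotation information described above.
    BEGIN, END, OTHER = "annotation beginning", "annotation end", "other"

    EVENT_KIND = {
        # mouse as the input unit 14 (or 14')
        "mouse_left_down": BEGIN,   # left button pressed inside the display area
        "mouse_left_up":   END,     # left button released
        "mouse_move":      OTHER,   # continuance of the annotation (dragging)
        # tablet / touch panel
        "pen_down":        BEGIN,
        "pen_up":          END,
        "pen_move":        OTHER,
        # pen-position recognition from an image
        "pen_appears":     BEGIN,
        "pen_disappears":  END,
    }

    def classify(event_name: str) -> str:
        # Events not listed above are treated as continuance of the annotation.
        return EVENT_KIND.get(event_name, OTHER)

    assert classify("pen_down") == BEGIN
    assert classify("mouse_left_up") == END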
Then, the annotation information acceptance unit 131 receives the annotation information, and when the answer to the determination of step S10 is "YES", the procedure proceeds to step S12. In step S12, the annotation information selection unit 132 determines whether the received annotation information is the annotation end information. When the answer to the determination of step S12 is "NO", the procedure proceeds to step S14, in which whether the received annotation information is the annotation beginning information is determined.
Then, the annotation information acceptance unit 131 receives the annotation beginning information, and when the answer to the determination of step S14 is "YES", the procedure proceeds to step S16. In step S16, the annotation information selection unit 132 adds writing information to the FILO 133. In this case, the writing information means information showing that writing has been done, and includes, for example, information showing which of the PC 2 and the PC 2′ has written the annotation and the time when the annotation was written. However, this is not limitative; the writing information may merely be data showing that writing has been done.
When the procedure of step S16 is finished, the procedure proceeds to step S18. In step S18, the timer 134 is reset, and the procedure returns to step S10. In the exemplary embodiment of the present invention, pieces of the annotation beginning information are sequentially input from the PC 2 and the PC 2′, respectively, so that the procedures of steps S10, S12, S14, S16, and S18 are repeated twice and two pieces of the writing information are stored in the FILO 133. Hereafter, a description will be given on this assumption.
When the answers to the determinations of steps S10 and S12 are "YES" (i.e., when the received annotation information is the annotation end information), the procedure proceeds to step S20. In step S20, the annotation information selection unit 132 extracts the writing information from the FILO 133 by a FILO (First In Last Out) process.
In the next step S22, the annotation information selection unit 132 determines whether the FILO 133 is empty. At this point, one piece of the writing information still remains, and therefore the answer to the determination of step S22 is "NO". The procedure returns to step S10. When the annotation end information is input again, the answers to the determinations of steps S10 and S12 are "YES", and the procedure proceeds to step S20. In step S20, the annotation information selection unit 132 extracts the writing information from the FILO 133, and then the procedure proceeds to step S22. In this case, since the FILO 133 is empty, the answer to the determination of step S22 is "YES", and the procedure proceeds to step S24. In step S24, the timer 134 starts measurement of time.
Then, a process in which the annotation information acceptance unit 131 determines in step S26 whether the annotation beginning information has been received, and a process in which the timing determination unit 135 determines in step S28 whether the predetermined time period has elapsed, are repeated.
When the answer to the determination of step S26 is "YES" during the above-mentioned repetition, the procedure proceeds to step S16. On the other hand, when the answer to the determination of step S28 is "YES" (i.e., when no annotation has been written for the predetermined time period while the FILO 133 is empty), the procedure proceeds to step S30. In step S30, the timing determination unit 135 notifies the storing image generating unit 103 of the storing timing, and the storing image generating unit 103 stores the synthetic image, in which the captured image and the annotation information are synthesized, in the image storing unit 13.
Then, the procedure proceeds to step S32. In step S32, the annotation information selection unit 132 determines whether the PC 1, the PC 2, or the PC 2′ has executed a completion process. When the answer to the determination of step S32 is "NO", the procedure proceeds to step S18. In step S18, the timer 134 is reset, and then the procedure returns to step S10. The above-mentioned procedures and determinations are repeated. When the answer to the determination of step S32 is "YES", all the procedures are ended.
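Taken together, steps S10 through S32 amount to the following logic: push writing information onto the FILO 133 on each annotation beginning, pop on each annotation end, and start the timer only when the FILO becomes empty. The sketch below illustrates this under simplifying assumptions (a Python list stands in for the FILO 133, a timestamp for the timer 134, and a single event-handling method replaces the units 131 through 135); it is an illustration, not the embodiment's implementation.

    import time

    class StoringTimingDeterminationUnit:
        """Illustrative sketch of the flowchart (steps S10-S32)."""

        def __init__(self, predetermined_time, store_callback):
            self.filo = []                   # FILO 133: pending writing information
            self.timer_start = None          # timer 134: None while annotations are open
            self.predetermined_time = predetermined_time
            self.store = store_callback      # notifies the storing image generating unit 103

        def on_annotation_info(self, kind, source):
            if kind == "begin":              # steps S10, S14, S16, S18
                self.filo.append((source, time.time()))  # add writing information
                self.timer_start = None      # timer reset
            elif kind == "end":              # steps S10, S12, S20, S22, S24
                if self.filo:
                    self.filo.pop()          # First In Last Out extraction
                if not self.filo:            # FILO empty: start measuring time
                    self.timer_start = time.time()

        def tick(self):                      # steps S26 / S28, polled periodically
            if (self.timer_start is not None
                    and time.time() - self.timer_start >= self.predetermined_time):
                self.store()                 # step S30: store the synthetic image
                self.timer_start = None      # step S18: timer reset

    # Usage: two terminals write, both finish, and after the predetermined
    # time period the synthetic image is stored once.
    unit = StoringTimingDeterminationUnit(0.1, lambda: print("store synthetic image"))
    unit.on_annotation_info("begin", "PC2")
    unit.on_annotation_info("begin", "PC2'")
    unit.on_annotation_info("end", "PC2'")
    unit.on_annotation_info("end", "PC2")
    time.sleep(0.15)
    unit.tick()   # prints: store synthetic image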
As described above, a user can hold a meeting using the remote indication system 100 and then produce minutes of the meeting from the synthetic images stored in the image storing unit 13.
As described in detail above, according to the exemplary embodiment, when the acquisition of input information from the plural external terminals (i.e., the PC 2 and the PC 2′) is not executed for the predetermined time period, the storing timing determination unit 104 causes the storing image generating unit 103 to store the synthetic image (i.e., the image in which the captured image and the annotation image are synthesized) in the image storing unit 13. Thus, even when plural external terminals (and plural users) exist, it is possible to store the synthetic image at appropriate timing (e.g., at a break between discussions in the meeting for which the plural external terminals are used). Further, the existence or nonexistence of a change in the synthetic image can be determined without image processing or the like, and it is therefore possible to prevent the apparatus and processes from becoming complicated.
According to the exemplary embodiment, when pieces of annotation end information equal in number to the acquired pieces of annotation beginning information have been acquired, as tracked with the FILO 133 and the timer 134, and the predetermined time period has then elapsed, the storing timing determination unit 104 causes the storing image generating unit 103 to store the synthetic image in the image storing unit 13. It is therefore possible to store the synthetic image at appropriate timing with a simple arrangement.
In the above-mentioned exemplary embodiment, although the synthesis of the captured image and the annotation image is executed by the image process, the exemplary embodiment is not limited to this. For example, an annotation image may be projected onto the screen 10 by a projector, and the video camera 5 may capture the screen 10 together with the projected annotation image, so that the synthesized image is obtained without the image process.
In the above-mentioned exemplary embodiment, although the PC 1 does not include a display, an input unit and the like, the exemplary embodiment is not limited to this. For example, the PC 1 may include the display displaying the captured image, and the input unit composed of a mouse, a keyboard, and the like. In this case, in addition to the above arrangements, the PC 1 may further include an annotation information obtaining unit 110 in common with the PC 2.
In the above-mentioned exemplary embodiment, although the annotation information distribution unit 11 is disposed on the network 3, the exemplary embodiment is not limited to this. For example, the PC 1 may realize the functions of the annotation information distribution unit 11. Further, the functions of the annotation information distribution unit 11 may be realized by not only the PC 1 but also the PC 2 or the PC 2′.
Although the remote indication system 100 in accordance with the above-mentioned exemplary embodiment includes two clients (i.e., the PCs 2 and 2′), the exemplary embodiment is not limited to this, and the remote indication system 100 may include three or more clients.
Alternatively, a recording medium having the software program for realizing the functions of the PC 1, the PC 2, and the PC 2′ recorded thereon may be provided to each PC, and the CPU of each PC may read and execute the program recorded on the recording medium. In this manner, the same effects as those of the above-described exemplary embodiment can also be achieved. The recording medium for supplying the program may be a CD-ROM, a DVD, an SD card, or the like.
Also, the CPU of each PC may execute the software program for realizing the functions of each PC. In this manner, the same effects as those of the above-described exemplary embodiment can also be achieved.
Although the remote indication system 100 in accordance with the above-mentioned exemplary embodiment uses the FILO as a memory, the exemplary embodiment is not limited to this, and the remote indication system 100 may use a FIFO (First In First Out) type memory. In this case, information is acquired from the memory in a FIFO manner.
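Because the determination in step S22 asks only whether the memory is empty, the order of extraction does not affect the storing timing. The following minimal sketch, using Python's collections.deque purely as an illustration, shows that both extraction orders leave the memory empty after matching beginnings and ends.

    from collections import deque

    pending = deque()
    pending.append("writing info from PC 2")    # annotation beginning
    pending.append("writing info from PC 2'")   # annotation beginning

    pending.pop()       # FILO (stack-like) extraction, as in the exemplary embodiment
    pending.popleft()   # FIFO extraction, as in this modification

    assert not pending  # either way, only the emptiness of the memory matters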
It should be understood that the present invention is not limited to the above-described exemplary embodiment, and various modifications may be made to them without departing from the scope of the invention.