Exemplary embodiments of the present invention will be described in detail based on the following figures, wherein:
A description will now be given, with reference to the accompanying drawings, of exemplary embodiments of the present invention.
The normal camera 20 is composed of, for example, a CCD camera or the like, and is capable of recording the target TG, for example, a whiteboard, at a first resolution. The recorded image data is imported into the image processing apparatus 50.
The high-definition camera 30 is composed of, for example, a CCD camera or the like, and is capable of recording the target TG, for example, a whiteboard, at a second resolution higher than the first resolution. The recorded image data is imported into the image processing apparatus 50. When image data of an identical region is recorded, the size of the image data obtained by the high-definition camera 30 is greater than that obtained by the normal camera 20 because of its higher definition. Here, the high-definition camera 30 is provided so as to record a region substantially identical to, or sharing a common region with, the region recorded by the normal camera 20.
The projector 40 is composed of a liquid crystal projector or the like, and projects the image data obtained from the image processing apparatus 50 onto the target TG. The projector 40 is capable of projecting light of the image data onto the region substantially identical to, or common with, the regions recorded by the high-definition camera 30 and the normal camera 20.
The computer 60 is connected to a display apparatus 70, an input device such as a mouse, or the like, as shown in
The computer 100 is connected to a display apparatus 110 such as a liquid crystal display apparatus, a CRT, or the like, and an input device such as a mouse 130, and the like. The display apparatus 110 displays image data on a screen for editing the images recorded at the target TG side by the normal camera 20 and the high-definition camera 30, or for editing the annotation image. The mouse 130 is used for operating various buttons provided on the editing screen, when an instruction related to, for example, the annotation image to be projected onto the target TG is created. By use of the terminal apparatus made up of the computer 100 and the like, a user is able to draw an annotation image, with which an instruction is given to the image, while watching the image of the target TG or the like on the screen of the display apparatus 110.
The operations by the user to create the annotation image with the mouse 130, the display apparatus 110, and the computer 100 may be represented as vector graphics data, such as the SVG (Scalable Vector Graphics) format, in the image processing apparatus 50 and the computers 100 and 60.
The annotation image may then be transmitted between the computer 100 and the image processing apparatus 50 through the network 300 in vector graphics form rather than pixel form, reducing its data size.
The image processing apparatus 50 is capable of converting the vector graphics data into pixel data so that the image data can be shown by the projector 40.
Both of the computers 100 and 60 are also able to convert the vector graphics data into pixel data so as to display the image data on the display apparatuses 110 and 70, respectively.
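The size advantage of the vector form can be sketched as follows. This is a minimal illustrative example, not part of the described apparatus; the function names `make_svg_annotation` and `rasterize` are hypothetical, and the SVG produced is deliberately minimal.

```python
# Hypothetical sketch: a free-hand annotation is encoded as SVG text
# (vector form) and rasterized only at the receiving side.

def make_svg_annotation(points, width=640, height=480):
    """Encode a stroke as a minimal SVG polyline document."""
    pts = " ".join(f"{x},{y}" for x, y in points)
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">'
            f'<polyline points="{pts}" fill="none" stroke="red"/></svg>')

def rasterize(points, width=640, height=480):
    """Naive rasterization: mark each stroke point in a pixel grid."""
    grid = [[0] * width for _ in range(height)]
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            grid[y][x] = 1
    return grid

stroke = [(10, 10), (20, 15), (30, 25), (40, 40)]
svg = make_svg_annotation(stroke)       # a short text document
raster = rasterize(stroke)              # a full 640x480 pixel grid

svg_bytes = len(svg.encode("utf-8"))
raster_bits = 640 * 480                 # even a 1-bit-per-pixel frame
```

Even against a 1-bit-per-pixel bitmap of the frame, the vector encoding is orders of magnitude smaller, which is the motivation for transmitting annotations in SVG form over the network 300.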
In addition, the vector graphics data may be represented in one of the CAD (Computer-Aided Design) formats in compliance with the ISO 10303 STEP/AP202 standard, or in another format used in a commercial CAD system.
Referring now to
The controller 501 is composed of a commonly used Central Processing Unit (CPU); an internal memory; and the like, and controls: the memory 502 of the image processing apparatus 50; the image inputting portion 503; the high-definition image obtaining portion 504; the normal image obtaining portion 505; the annotation image creating portion 506; the projection image creating portion 507; the communication portion 508; the time management portion 509; the internal bus 510; and various data.
The memory 502 is composed of a commonly used semiconductor memory; a disk device; and the like, and retains, accumulates, and stores the data processed in the image processing apparatus 50. Also, the image data retained, accumulated, or stored in the memory 502 can be output to the projector 40, as needed.
The image inputting portion 503 is composed of a commonly used semiconductor memory or the like, and stores the image data after the image data is input from the computer 60. The aforementioned image data can be created by commonly used application software or the like operating on the computer 60.
The high-definition image obtaining portion 504 is composed of a commonly used semiconductor memory or the like, and acquires the image recorded by the high-definition camera 30.
The high-definition camera 30 may obtain digital image data at higher resolution than that of the normal camera 20.
The normal image obtaining portion 505 is composed of a commonly used semiconductor memory or the like, and acquires the image recorded by the normal camera 20.
The normal camera 20 may obtain digital image data at lower resolution than that of the high-definition camera 30.
The annotation image creating portion 506 is composed of a commonly used CPU; an internal memory; and the like, and creates an annotation image by decoding a draw command relating to the annotation image given from a terminal apparatus such as the computer 100 or the like.
The annotation image creating portion 506 is capable of converting the annotation image data from SVG form into pixel form. That is, the annotation image creating portion 506 is capable of interpreting SVG data and rendering or creating graphics data in pixel form.
The projection image creating portion 507 creates an image to be projected from the projector 40. Specifically, the projection image creating portion 507 creates an image to be projected, by use of the image supplied from the image inputting portion 503, the image supplied from the annotation image creating portion 506, the image stored in the memory 502, and the like, as necessary.
The communication portion 508 is composed of: a CPU; a communication circuit; and the like, and exchanges various data, including the image data and the annotation data, with the computer 100, which is a terminal apparatus, via the network 300.
The time management portion 509 is composed of: an internal system clock; a counter; a timer; and the like, and controls process timings and times of: the controller 501; the memory 502; the image inputting portion 503; the high-definition image obtaining portion 504; the normal image obtaining portion 505; the annotation image creating portion 506; the projection image creating portion 507; the communication portion 508; and the internal bus 510.
The internal bus 510 is composed of: a control bus for control; and a data bus for data, and transmits: control data; image data; graphic data; the high-definition image data; and the like.
Next, a description will be given, with reference to
Firstly, a description will be given of the communication processing between the image processing apparatus 50 and the computer 100 through the network 300. The communication processing routine in
Next, the image processing apparatus 50 determines whether or not a command is received from the computer 100 (step ST2). The command may be composed of, for example: a draw command to draw annotation image data; a select command to select a desired region from which to obtain high-definition image data; and a move command to move the annotation image. Other commands, such as 'delete', 'copy', and 'paste', may be transmitted and performed instead of the move command. Here, the annotation image data is image data for giving an instruction, an explanation, or additional information, and for sharing information between remote sites by use of the image data, and includes any image data such as a graphic image, a text image, and the like.
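The command handling at steps ST2 through ST12 can be sketched as a simple dispatcher. This is a hypothetical illustration only: the command dictionary layout and the function `handle_command` are assumptions, not part of the described apparatus, and the handler bodies stand in for the full projection and capture processing.

```python
# Hypothetical sketch of the command dispatch at steps ST2-ST12.
def handle_command(cmd, state):
    """Dispatch one command received from the terminal apparatus (ST2)."""
    kind = cmd["type"]
    if kind == "draw":
        # ST3-ST4: add the annotation so it will be projected
        state["annotations"].append(dict(cmd["annotation"]))
    elif kind == "select":
        # ST5-ST9: remember the region to capture at high definition
        state["selected_region"] = cmd["region"]
    elif kind == "move":
        # ST12: move an annotation that is being projected
        state["annotations"][cmd["index"]]["pos"] = cmd["pos"]
    # 'delete', 'copy', and 'paste' could be handled similarly
    return state

state = {"annotations": [], "selected_region": None}
handle_command({"type": "draw",
                "annotation": {"shape": "star", "pos": (10, 10)}}, state)
handle_command({"type": "move", "index": 0, "pos": (40, 25)}, state)
```

Each received command thus mutates a shared projection state, which the projection image creating portion 507 would then render.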
If the draw command is received, a process is performed to project the annotation image data corresponding to the draw command onto the target TG or the whiteboard (step ST3 and step ST4).
In
That is, a user gives a draw command of the star mark with the computer 100.
The image processing apparatus 50 has a function of calibrating the positioning or layout between the annotation image data AN and the target TG. For example, the image processing apparatus 50 calibrates the layout between the areas to be recorded by the normal camera 20 and the high-definition camera 30 and the area to be projected by the projector 40. This calibration may be done by a geometrical transformation, such as an affine transformation, used in image processing.
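An affine calibration of this kind maps a point in camera coordinates into projector coordinates with a 2×3 matrix. The sketch below is purely illustrative, with made-up matrix values; the function name `affine_map` is an assumption, and a real calibration would estimate the matrix from corresponding point pairs.

```python
# Hypothetical sketch of the affine calibration: map a camera-space
# point (x, y) into projector space with a 2x3 affine matrix m.
def affine_map(point, m):
    """Apply the affine transform m (2x3, row-major) to (x, y)."""
    x, y = point
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

# Example matrix: scale by 2 and translate by (5, -3)
M = [[2.0, 0.0, 5.0],
     [0.0, 2.0, -3.0]]

projected = affine_map((10, 20), M)   # camera point -> projector point
```

In practice, three (or more) camera/projector point correspondences suffice to solve for the six matrix entries, which is how the recorded and projected regions are brought into registration.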
If the draw command is not received, it is determined whether the select command is received (step ST5). If the select command is received, it is determined whether the annotation image data is projected onto the target TG (step ST6). Then, if the annotation image data is projected onto the target TG, the annotation image data is temporarily deleted (turned off) (step ST7). The annotation image data is temporarily deleted in this way because the image processing apparatus 50 does not need to send the annotation image designated at the computer 100 back from the image processing apparatus 50 to the computer 100; the computer 100 is capable of retaining the annotation image data therein.
In addition, when the high-definition camera 30 records image data, the annotation image data on the target TG might act as noise. The annotation image data is therefore temporarily deleted so that, as noise, it does not affect the recording of image data at high resolution. In other words, the high-definition camera 30 is able to record image data of the target TG without the annotation image data, in order to obtain image data of better quality.
It is easy to compose the image data from the high-definition camera 30 with the annotation image data on the image processing apparatus 50, or on the computer 100 and the computer 60.
Next, the image data recorded by the high-definition camera 30, that is, the image corresponding to the region selected by the computer 100 as described later, is acquired (step ST8). The recorded image is sent to the computer 100 (step ST9). When the annotation image has been temporarily turned off at step ST7, the annotation image is projected again (step ST11).
At step ST5, if the command is not the select command, the command is determined to be the move command and the annotation image data being projected is moved (step ST12).
Other commands, such as 'delete', 'copy', and 'paste', may be processed at step ST5 instead of the move command.
Next, a description will now be given of an example of image processing performed by the image processing apparatus 50. The image process routine in
If the draw command is not sent, the image obtained from the image data supplied from the computer 60 is projected (step ST25). For example, referring to
If the draw command has been sent, the image data supplied from the computer 60 and the annotation image data supplied from the computer 100 are combined (step ST23). Such combined image data is projected from the projector 40 (step ST24). For example, when the draw command of the annotation image data is received in the state of
Next, a description will now be given of a process example at the computer 100. On receiving the recorded image data from the image processing apparatus 50, the computer 100 outputs the recorded image data to the display apparatus 110. The image data related to the white board shown in
A user at the computer 100 side performs an input operation as needed, while watching the display shown in
In
When a user operates the various buttons BT or the like on the screen and the draw command is input, the annotation image data AN is drawn on the screen of the display apparatus 110, as shown in
If the command is not the draw command at step ST42, it is determined whether or not the command is a select command (step ST46). If the command is the select command, the select process is performed to correspond to the select command. Specifically, if the user cannot recognize the characters written in the calendar CAL in the image data IM on the display apparatus 110, each of which is represented as an asterisk ‘*’ in
Here, a description will be given of a process example of the computer 100 at the time of sending the select command to the image processing apparatus 50. Referring to
As another example, as shown in
A wavelet transform, as used in JPEG 2000 or MPEG-4 systems, can be used to obtain image data at a lower resolution from original image data at a higher resolution. From image data transformed or encoded with the wavelet transform, a part of the image data can be extracted at a lower resolution.
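The reason a wavelet-coded image yields a lower-resolution version directly can be sketched with one 2-D Haar analysis step: the low-low (LL) subband stores 2×2 block averages, which already constitute the half-resolution image, so a decoder can stop after reading just that subband. The sketch below uses a toy 4×4 grayscale image; the function name `haar_ll` is an assumption for illustration.

```python
# Hypothetical sketch: one 2-D Haar analysis step. The LL subband
# (2x2 block averages) IS the half-resolution image, which is why a
# wavelet codestream lets a decoder extract a low-resolution picture
# without decoding the full-resolution data.
def haar_ll(img):
    """Return the LL subband of an even-sized grayscale image."""
    h, w = len(img), len(img[0])
    return [[(img[r][c] + img[r][c + 1]
              + img[r + 1][c] + img[r + 1][c + 1]) / 4
             for c in range(0, w, 2)]
            for r in range(0, h, 2)]

full = [[10, 10, 50, 50],
        [10, 10, 50, 50],
        [90, 90, 30, 30],
        [90, 90, 30, 30]]
low = haar_ll(full)   # half-resolution (2x2) version of the 4x4 image
```

Applying the step repeatedly gives a pyramid of resolutions, which matches the way the normal-resolution view could be served from the same codestream as the high-definition data.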
In accordance with an exemplary embodiment previously described, the annotation image data is forcibly turned off when the high-definition image data is obtained. However, the present invention is not limited to this. For example, a period of time while the projector 40 is not projecting the annotation image data can be controlled by use of the time management portion 509, so that the high-definition image data may be obtained during the period.
The horizontal axis represents time. In
Meanwhile, in the state of DUR5, the high-definition camera 30 or the camera 20A records image data at high resolution. The high-definition camera 30 or 20A does not record image data in the states DUR4 and DUR6.
Control of the status of the projection of the annotation image data and the status of the high-definition camera 30 or the camera 20A may be repeated. For example, DUR1 (ON) and DUR2 (OFF) for the projection of the annotation image data, and DUR4 (OFF) and DUR5 (ON) for the recording by the high-definition camera 30 or the camera 20A, may be repeated. Then DUR3 (ON) may be thought of as a repeat of DUR1. Also, DUR6 (OFF) may be regarded as a repeat of DUR4.
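The alternating schedule above can be sketched as follows. The durations and the function name `schedule` are illustrative assumptions; the essential point is only that the camera records exactly while the projection is off, as the time management portion 509 would arrange.

```python
# Hypothetical sketch of the repeated ON/OFF schedule: the projector
# shows the annotation during the ON phases (DUR1, DUR3, ...) and the
# camera records only during the OFF phases (DUR2/DUR5, ...).
def schedule(n_cycles, on_ms=100, off_ms=30):
    """Return (projector_state, camera_records, duration_ms) phases."""
    phases = []
    for _ in range(n_cycles):
        phases.append(("ON", False, on_ms))    # annotation projected
        phases.append(("OFF", True, off_ms))   # high-definition capture
    return phases

phases = schedule(2)   # two full ON/OFF cycles
```

If the OFF phases are kept short relative to the ON phases, the annotation appears continuously projected to a human observer while the camera still obtains annotation-free frames.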
The above control may be done electronically by the image processing apparatus 50, especially with the time management portion 509.
In addition, the above control can also be done physically or mechanically by the time management portion 509 and a specific camera and projector unit in
The time management portion 509 of the image processing apparatus 50 controls a rotation speed of the mirror unit 203 so that the camera unit 201 obtains the high-definition image data during DUR5 in
The centers of the optical paths of both the light projected from the projector unit 401 and the light captured by the camera unit 201 coincide exactly with each other, so that no parallax occurs in
In the above-described embodiments, the normal image data and the high-definition image data are selectively sent to the computer at a remote site. The normal image data means image data at normal resolution, that is, at a lower resolution than the high-definition image data. However, the present invention is not limited to this. For example, a configuration may be employed such that the normal camera 20 and the high-definition camera 30 are controlled on a time-division basis, and the image data recorded by the normal camera 20 and that recorded by the high-definition camera 30 are acquired all the time so as to be sent to the computer 100 at the remote site. In this case, the transmission frame rate of the high-definition image is made lower than that of the image data having lower resolution than the high-definition image data, for example, 60 frames per second. For example, 10 frames are sent every second, thereby controlling the quality thereof. Also, when the high-definition image data is transmitted, the high-definition image and the normal image data may be multiplexed at different frame rates, or may be sent simultaneously on different bands.
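The frame-rate reduction in the example above amounts to forwarding one high-definition frame out of every six captured at 60 frames per second, yielding 10 frames per second. A minimal sketch, with the function name `decimate` as an illustrative assumption:

```python
# Hypothetical sketch: decimate a 60 fps capture to a 10 fps stream by
# forwarding only every 6th frame, as in the example above.
def decimate(frames, keep_every):
    """Keep one frame out of every keep_every captured frames."""
    return [f for i, f in enumerate(frames) if i % keep_every == 0]

captured = list(range(60))   # one second of 60 fps frame indices
sent = decimate(captured, 6) # 10 high-definition frames sent per second
```

The normal-resolution stream could be forwarded at the full rate while the high-definition stream is decimated in this way, keeping the combined bandwidth under control.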
As described above, the normal image data and the high-definition image data can be composed or superimposed or multiplexed.
In the above-described embodiments, a description has been given of the case where the normal image data and the high-definition image data are displayed on the common display apparatus 110. However, the display apparatus for the normal image data and that for the high-definition image data may be connected to the computer 100 and may be displayed independently. The normal image data and the high-definition image data may be transmitted over different communication lines, may be multiplexed and transmitted, or may be transmitted on different bands. For example, the normal image data may be transmitted by wireless and the high-definition image data may be transmitted over a (an optical) cable.
In addition, for example, the image data at normal resolution may be assigned 100 kilobits per second and the high-definition image data may be assigned 100 megabits per second for transmission, so that the communication quality may be controlled. In a similar manner, the so-called frame rate, or record time interval, of the image data at normal resolution may be 30 frames per second and that of the high-definition image data may be one frame per second, so that the image quality or the communication quality may be controlled. The transmission system of the normal image data and that of the high-definition image data may use an identical protocol or different ones. For example, the normal image data may be transmitted by means of the so-called HTTP protocol, and the high-definition image data by means of the so-called FTP protocol.
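The figures above imply very different per-frame bit budgets for the two streams, which can be checked with simple arithmetic. This is a back-of-envelope sketch only; `bits_per_frame` is an illustrative name.

```python
# Hypothetical back-of-envelope check of the budgets above:
# normal stream: 100 kbit/s at 30 fps; high-definition: 100 Mbit/s at 1 fps.
def bits_per_frame(bitrate_bps, fps):
    """Average number of bits available for each transmitted frame."""
    return bitrate_bps / fps

normal_bpf = bits_per_frame(100_000, 30)      # ~3.3 kbit per normal frame
hidef_bpf = bits_per_frame(100_000_000, 1)    # 100 Mbit per hi-def frame
```

The roughly 30,000-fold difference in per-frame budget is what allows the rarely sent high-definition frames to carry far more detail than the frequently sent normal frames.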
In the above-described embodiments, the computer 100 or the image processing apparatus 50 may delete the annotation image data and draw it again on the display apparatus 110, which shows the image data from the normal camera 20, in order to prevent confusion between the projected annotation data, which is captured by the normal camera 20 and transmitted to the computer 100, and the original drawings that the user draws on the display apparatus 110.
The computer 60 may have the same annotation-giving function as the computer 100.
The user may also provide image data from a digital camera or from application software on the computer 100, and the computer 100 may send the image data to the image processing apparatus 50 so that it is projected through the projector 40.
If the user does not need to watch the image data at the image processing apparatus 50, the computer 60, the display apparatus 70, and the mouse 80 need not be provided to implement this invention.
An image processing method employed according to an aspect of the present invention is performed with a Central Processing Unit (CPU), Read Only Memory (ROM), Random Access Memory (RAM), and the like, by installing a program from a portable memory device or a storage device such as a hard disc device, CD-ROM, DVD, or flexible disc, or by downloading the program through a communications line. The steps of the program are then executed as the CPU runs the program.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The exemplary embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2006-251992 filed Sep. 19, 2006.