This patent application is based on and claims priority under 35 U.S.C. §119 to Japanese Patent Application Nos. 2010-044013, filed on Mar. 1, 2010, and 2010-285919, filed on Dec. 22, 2010, in the Japanese Patent Office, the entire disclosures of which are hereby incorporated herein by reference.
The present invention generally relates to sharing of drawing data, and more specifically to an apparatus, system, and method of sharing drawing data among a plurality of remotely located sites.
Recent technology allows image data to be shared among a plurality of remotely located sites to facilitate communication among the plurality of sites. For example, Japanese Patent Application Publication No. 2005-203886 describes a system, which includes two projectors that are located at different places, and a server that transmits an image to each of the projectors for projection onto a screen through each projector. Each projector captures an image drawn onto the whiteboard, and transmits the captured image to the server. The server combines an original image with the captured image received from one projector, and sends the combined image to the other projector for projection onto the screen at the other place.
In the system described in Japanese Patent Application Publication No. 2005-203886, projection of the combined image onto the screen is interrupted at a predetermined interval, as the system needs to capture the image drawn onto the whiteboard to generate the combined image for projection.
In view of the above, one aspect of the present invention provides a system, apparatus, and method of sharing drawing data among a plurality of remotely located sites, each capable of generating an image to be projected with improved visibility.
Another aspect of the present invention provides a system, apparatus, and method of sharing drawing data among a plurality of remotely located sites, each capable of generating an image to be projected more seamlessly.
A more complete appreciation of the disclosure and many of the attendant advantages and features thereof can be readily obtained and understood from the following detailed description with reference to the accompanying drawings, wherein:
The accompanying drawings are intended to depict example embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “includes” and/or “including”, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In describing example embodiments shown in the drawings, specific terminology is employed for the sake of clarity. However, the present disclosure is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that operate in a similar manner.
The site 110a is provided with a drawing object (“object”) 112a, a projection device 114a, an image capturing device 116a, and a drawing image sharing apparatus 118a. The site 110b is provided with a drawing object (“object”) 112b, a projection device 114b, an image capturing device 116b, and a drawing image sharing apparatus 118b, which are similar in function and structure to the drawing object (“object”) 112a, the projection device 114a, the image capturing device 116a, and the drawing image sharing apparatus 118a, which are each provided at the site 110a.
The object 112a allows the user at the site 110a to draw thereon any desired drawings, while displaying thereon an image that is projected by the projection device 114a. For example, the object 112a may be implemented by any desired material on which the user at the site 110a can write any drawings such as characters or figures, such as a white board, a black board, or a paper sheet. In
The projection device 114a is any device that projects an image onto the object 112a, such as the drawing image of the site 110b that is received from the site 110b through the network 120. For example, the projection device 114a may be implemented by a projector. In
The image capturing device 116a is any device that captures an image being displayed onto the object 112a as a captured image. The captured image includes a drawing image that reflects drawings drawn by the user onto the object 112a and a projection image that is projected by the projection device 114a based on the drawing image that reflects drawings drawn onto the object 112b by the user at the site 110b. For example, the image capturing device 116a may be implemented by a digital camera, digital video camera, web camera, etc., which is capable of capturing an image being displayed onto the object 112a. In this example, the image capturing device 116a captures the image being displayed onto the object 112a, according to an instruction received from the drawing image sharing apparatus 118a, when the drawing image sharing apparatus 118a receives the drawing image drawn onto the object 112b from the site 110b. Alternatively, the image capturing device 116a may capture the image being displayed onto the object 112a at a predetermined time interval under control of the drawing image sharing apparatus 118a.
The drawing image sharing apparatus 118a extracts the drawing image that reflects drawings drawn onto the object 112a from the captured image of the object 112a that is captured by the image capturing device 116a, and sends the extracted drawing image of the site 110a to the site 110b. The drawing image sharing apparatus 118a further causes the projection device 114a to display a drawing image that reflects drawings drawn onto the object 112b and received from the site 110b, onto the object 112a. In this manner, the drawing image that reflects drawings drawn by each user is shared among a plurality of sites. For example, the drawing image sharing apparatus 118a may be implemented by any desired computer such as a notebook computer or a desktop computer.
The drawing image sharing apparatus 118a includes a processor such as a central processing unit (CPU), and a memory. The processor may be implemented by a PENTIUM processor or a PENTIUM-compatible processor. The memory may be implemented by a volatile memory such as a random access memory (RAM), and a nonvolatile memory such as a read only memory (ROM) or a hard disk device (HDD). More specifically, the ROM or HDD stores various programs and data to be used by the processor. The RAM functions as a work memory of the processor to allow the processor to execute the program read out from the ROM or HDD. In this example, the memory of the drawing image sharing apparatus 118a stores therein a drawing image sharing program, which is executed by the processor under an operating system, such as a Windows-series OS, MAC OS, UNIX OS, or LINUX OS. The drawing image sharing program may be written in any desired programming language such as assembler, C, C++, Java, JavaScript, PERL, RUBY, PYTHON, etc. Upon execution of the drawing image sharing program, the processor causes a computer to operate or function as the drawing image sharing apparatus 118a. The drawing image sharing apparatus 118a further includes an interface, such as a peripheral device interface for connection with a peripheral device such as the projection device 114a or the image capturing device 116a, and a network interface for connection with the network 120. Through the interface, the drawing image sharing apparatus 118a communicates with the peripheral devices to exchange various data, such as the image to be projected, which is transmitted to the projection device 114a, and the captured image, which is received from the image capturing device 116a. The drawing image sharing apparatus 118a communicates with the drawing image sharing apparatus 118b at the site 110b to transmit or receive various data such as the drawing image to or from the drawing image sharing apparatus 118b through the network 120.
Any one of the operations or functions of the drawing image sharing apparatus 118a may be performed according to a plurality of instructions that are written in the form of a computer program using any desired programming language and is installable onto any computer. The drawing image sharing program may be stored in any desired memory such as a hard disk device, or a removable medium such as a CD-ROM, MO, flexible disk, EEPROM, or EPROM, for example, for distribution. Further, the drawing image sharing program may be distributed over the network in a form readable by a device such as a computer.
Although not described above, the devices or apparatuses provided at the site 110b operate in a substantially similar manner as the devices or apparatuses provided at the site 110a.
Further, a structure of the drawing image sharing system 100 is not limited to this example shown in
The image receive 210 receives the drawing image that is sent from a remotely located site such as the site 110b through the network 120. The image receive 210 stores the drawing image of the remotely located site in the RAM, and notifies the shared image obtainer 212 that the drawing image is received.
The shared image obtainer 212 obtains an image to be shared with the remotely located site, which includes an image being displayed onto an area to be shared with the remotely located site. More specifically, in this example, the shared image obtainer 212 causes the image capturing device 116a to capture an image of the object 112a including the drawing image drawn onto the object 112a and the projection image projected by the projection device 114a as a captured image. The shared image obtainer 212 performs perspective transformation on the captured image obtained by the image capturing device 116a, for example, by applying image correction to the captured image such as keystone correction, aspect ratio correction to correct the length-to-width ratio, and/or rotation correction.
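For illustration only, the following is a minimal sketch in Python of the perspective transformation (keystone, aspect ratio, and rotation correction) described above, assuming the OpenCV and NumPy libraries are available; the function name correct_perspective and the way the corner coordinates are supplied are assumptions for illustration and are not part of the specification.

```python
# Minimal sketch of the perspective (keystone/rotation) correction applied to
# the captured image, assuming OpenCV and NumPy. The corner coordinates and
# output size are supplied by the caller and are illustrative only.
import cv2
import numpy as np

def correct_perspective(captured, corners, width, height):
    """Warp the captured image so the shared area becomes an upright rectangle.

    captured : image obtained from the image capturing device 116a
    corners  : four (x, y) corner points of the shared area in the captured
               image, ordered top-left, top-right, bottom-right, bottom-left
    width, height : size of the corrected output, chosen to restore the
               length-to-width ratio of the shared area
    """
    src = np.float32(corners)
    dst = np.float32([[0, 0], [width - 1, 0],
                      [width - 1, height - 1], [0, height - 1]])
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(captured, matrix, (width, height))
```

The four source corners would correspond to the coordinate values of the corners of the shared area that the shared image obtainer 212 specifies as described below.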
As illustrated in
As an alternative to the above-described example, the shared image obtainer 212 may specify the shared area of the object 112a and calculate the size of the shared area, without using the markers 1120 to 1126 of the marker image. In such case, the image supplier 214 causes the projection device 114a to project a white color image, which is obtained from the memory through the data buffer 220, onto the object 112a. The shared image obtainer 212 may specify the coordinate values of the corners of the shared area, based on the difference in brightness between an area where the white color image is projected and the area where the white color image is not projected. Once the coordinate values of the corners of the shared area are specified, the shared image obtainer 212 calculates the size of the shared area using the corner coordinate values of the shared area. In this example, it is assumed that at least one of the marker image having the markers 1120 to 1126 and the white color image having a predetermined size is previously stored in any desired memory of the drawing image sharing apparatus 118a and is accessible through the data buffer 220. For descriptive purposes, the marker image and the white color image, or any image that may be used for specifying the area to be shared, may be collectively referred to as a reference image.
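As a non-limiting illustration of the white-image alternative, the following Python sketch locates the projected white region from the brightness difference in a captured frame; the threshold value, the function name, and the use of OpenCV contour functions are assumptions for illustration.

```python
# Illustrative sketch of locating the shared area from a projected white color
# image, based on the brightness difference described above. The threshold
# value and function name are assumptions, not taken from the specification.
import cv2

def find_shared_area(captured_with_white, brightness_threshold=180):
    """Return corner coordinates and size of the brightest (projected) region."""
    gray = cv2.cvtColor(captured_with_white, cv2.COLOR_BGR2GRAY)
    _, bright = cv2.threshold(gray, brightness_threshold, 255, cv2.THRESH_BINARY)
    # [-2] keeps this working with both the 2-value and 3-value return styles
    contours = cv2.findContours(bright, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    area = max(contours, key=cv2.contourArea)    # largest bright region assumed
    x, y, w, h = cv2.boundingRect(area)          # to be the projected white image
    corners = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]
    return corners, (w, h)
```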
When the shared image obtainer 212 receives notification from the image receive 210 that the drawing image is received from the remotely located site, the shared image obtainer 212 causes the image capturing device 116a to capture an image of the object 112a to obtain the captured image from the image capturing device 116a. The shared image obtainer 212 applies perspective transformation to the captured image, and extracts an image of the shared area from the captured image using the coordinate values of the corners of the shared area and/or the size of the shared area that are respectively read out from the memory. The extracted image of the shared area is stored in the memory such as the RAM or the HDD of the drawing image sharing apparatus 118a. Once the image of the shared area is stored, the shared image obtainer 212 sends notification to the image supplier 214 that processing of the captured image is completed. Alternatively, the shared image obtainer 212 may send notification to the image supplier 214 when the shared image obtainer 212 sends an instruction for capturing the image of the object 112a to the image capturing device 116a. When the notification from the shared image obtainer 212 is received, the image supplier 214 supplies the image of the shared area to the projection device 114a for projection onto the object 112a.
In the above-described example, the shared image obtainer 212 of the drawing image sharing apparatus 118a applies perspective transformation to the captured image. Alternatively, the drawing image sharing system 100 may be provided with a perspective transformation circuit that applies perspective transformation to the captured image captured by the image capturing device 116a according to an instruction received from the shared image obtainer 212 of the drawing image sharing apparatus 118a.
Still referring to
As described above, the image supplier 214 causes the projection device 114a to display the marker image onto the object 112a. The marker image may be previously stored in the nonvolatile memory of the drawing image sharing apparatus 118a such as the HDD. The image supplier 214 reads out the marker image from the nonvolatile memory through the data buffer 220 to display the marker image through the projection device 114a. In this example, the image supplier 214 causes the projection device 114a to display the marker image onto the object 112a when the drawing image sharing apparatus 118a is powered on. Alternatively, the image supplier 214 may cause the projection device 114a to display the marker image at any time, for example, according to an instruction received from the shared image obtainer 212 or at a predetermined time interval that is previously set by the user or by default.
In this example, the marker image is defined by the markers of blue color; however, any other color may be used in place of blue. Further, in this example, the markers defining the marker image each have a rectangular shape; however, any other shape such as a triangular or quadrant shape may be used in place of the rectangular shape. Further, in this example, the shared area is specified by the markers each corresponding to a corner of the shared area; however, the shared area may be specified in various other ways. For example, each of the markers may be arranged at the middle of a border line of the shared area to specify the shared area.
Still referring to
The differential image generator 216 erodes or dilates the boundaries of the drawing image in the previous projection image obtained through the data buffer 220, using a filter such as a morphological filter that applies, for example, erosion or dilation operations, or a smoothing filter. This suppresses the negative influences that may otherwise be caused by a positional shift of the drawing image in the previous projection image with respect to the captured image.
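A minimal sketch of such boundary processing, assuming OpenCV and NumPy, is given below; the kernel size is an arbitrary illustrative choice, and grayscale erosion is used because it widens dark strokes on a bright background, which has the effect of dilating the boundaries of the drawing image.

```python
# Minimal sketch of the boundary processing described above, assuming OpenCV
# and NumPy; the kernel size is an arbitrary illustrative choice.
import cv2
import numpy as np

def widen_drawing_boundaries(previous_projection, kernel_size=3):
    """Widen the drawing strokes of the previous projection image so that a
    small positional shift with respect to the captured image does not leave
    spurious differences in the differential image."""
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    # grayscale erosion takes the local minimum, which widens dark strokes
    return cv2.erode(previous_projection, kernel)
```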
The differential image generator 216 respectively segments the previous projection image and the captured image into color components of red (R), green (G), and blue (B) to obtain the images of R, G, and B color components for the previous projection image and the captured image.
In one example, the differential image generator 216 generates the differential image that reflects the difference between the previous projection image and the captured image, by calculating the average ratio value of each pixel respectively in the previous projection image and the captured image for each of the R, G, and B color component images. The average ratio value of a target pixel is a brightness value of the target pixel, which is calculated by multiplying the brightness value of the background color by the ratio of the brightness value of the target pixel to the average value of the brightness values of the pixels surrounding the target pixel. The average ratio value of a target pixel having the coordinate value (x, y) is calculated as follows.
First, the differential image generator 216 specifies a rectangle having the length m and the width n, which surrounds the target pixel (x, y). In this example, m and n are each previously defined as an arbitrary integer.
Second, the differential image generator 216 calculates the average value AVE of the brightness values of the pixels that are included in the m×n rectangle, using an integral image of the captured image for a selected color component.
Third, the differential image generator 216 multiplies the brightness value of the background color of the captured image by the ratio of the brightness value of the target pixel (x, y) to the average value AVE of the brightness values of the pixels included in the m×n rectangle, to obtain the average ratio value of the target pixel (x, y). Assuming that the object 112a is a white board, the brightness value of the background color of the captured image is 200. When the ratio is greater than 1, the average ratio value of the target pixel (x, y) is assumed to be the same as the brightness value of the background color. Further, the brightness value of the background color may be previously defined by the user.
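The following Python sketch computes the average ratio value for every pixel of one color component, using an integral image for the m×n rectangle sums as in the second step above; the window size, the handling at the image edges, and the function name are illustrative assumptions, and the background brightness of 200 follows the white-board example.

```python
# Sketch of the average ratio value for one color component, using an integral
# image for the m x n rectangle sums; window size, edge handling, and the
# function name are illustrative assumptions.
import cv2
import numpy as np

def average_ratio_image(channel, background=200, m=15, n=15):
    """Average ratio value of every pixel of one color component (R, G, or B).

    Each output pixel is background * (pixel / local average), clamped to the
    background brightness when the ratio exceeds 1."""
    h, w = channel.shape
    integral = cv2.integral(channel)            # (h + 1, w + 1) summed-area table
    half_m, half_n = m // 2, n // 2
    out = np.empty((h, w), dtype=np.float32)
    for y in range(h):
        y0, y1 = max(0, y - half_m), min(h, y + half_m + 1)
        for x in range(w):
            x0, x1 = max(0, x - half_n), min(w, x + half_n + 1)
            total = (integral[y1, x1] - integral[y0, x1]
                     - integral[y1, x0] + integral[y0, x0])
            ave = total / ((y1 - y0) * (x1 - x0))      # average value AVE
            ratio = channel[y, x] / ave if ave > 0 else 1.0
            out[y, x] = background * min(ratio, 1.0)   # clamp when ratio > 1
    return out
```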
In another example, the differential image generator 216 generates the differential image by calculating the average differential value of each pixel respectively in the previous projection image and the captured image for each of the R, G, and B color component images. The average differential value of a target pixel is a brightness value of the target pixel, which is calculated by subtracting the average value of the brightness values of pixels surrounding the target pixel from the brightness value of the target pixel. The average differential value of a target pixel having the coordinate value (x, y) is calculated as follows.
First, the differential image generator 216 specifies a rectangle having the length m and the width n, which surrounds the target pixel (x, y). In this example, m and n are each previously defined as an arbitrary integer.
Second, the differential image generator 216 calculates the average value AVE of the brightness values of the pixels that are included in the m×n rectangle, using an integral image of the captured image for a selected color component.
Third, the differential image generator 216 subtracts the average value AVE of the brightness values of the pixels included in the m×n rectangle, from the brightness value of the target pixel (x,y).
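A corresponding sketch for the average differential value is given below, under the same assumptions as the previous sketch (illustrative window size and simplified handling at the image edges).

```python
# Corresponding sketch for the average differential value of one color
# component, computed with an integral image.
import cv2
import numpy as np

def average_differential_image(channel, m=15, n=15):
    """Brightness of each pixel minus the average brightness AVE of the
    surrounding m x n rectangle, computed with an integral image."""
    h, w = channel.shape
    integral = cv2.integral(channel)
    half_m, half_n = m // 2, n // 2
    out = np.empty((h, w), dtype=np.float32)
    for y in range(h):
        y0, y1 = max(0, y - half_m), min(h, y + half_m + 1)
        for x in range(w):
            x0, x1 = max(0, x - half_n), min(w, x + half_n + 1)
            total = (integral[y1, x1] - integral[y0, x1]
                     - integral[y1, x0] + integral[y0, x0])
            ave = total / ((y1 - y0) * (x1 - x0))
            out[y, x] = float(channel[y, x]) - ave
    return out
```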
In one example, the differential image generator 216 calculates the difference in pixel values between the average ratio value of the target pixel in the captured image and the brightness value of the target pixel in the previous projection image to obtain a differential value of the target pixel between the captured image and the previous projection image. In another example, the differential image generator 216 calculates the difference in pixel values between the average differential value of the target pixel in the captured image and the average differential value of the target pixel in the previous projection image to obtain a differential value of the target pixel between the captured image and the previous projection image. The differential image generator 216 obtains the differential value between the captured image and the previous projection image for all of the pixels, respectively, for the R, G, and B color component images, and generates R, G, and B color components respectively having pixels with the obtained differential values. The differential image generator 216 further combines the R, G, and B color components to obtain the differential image.
The differential image generator 216 may further apply a filtering process to the differential image to cut the yellow color component of the differential image. The differential image generator 216 compares the differential value of the brightness value between the background color and each of the red color component and the green color component with the differential value of the brightness value between the background color and the blue color component. More specifically, the differential image generator 216 obtains the differential value of the brightness value between the background color and the red color component (“red differential value”), the differential value of the brightness value between the background color and the green color component (“green differential value”), and the differential value of the brightness value between the background color and the blue color component (“blue differential value”). When the blue differential value is greater than the greater one of the red differential value and the green differential value, the differential image generator 216 sets the blue differential value to a value obtained by multiplying the greater one of the red differential value and the green differential value by a constant. When the blue differential value is equal to or less than the greater one of the red differential value and the green differential value, the differential image generator 216 keeps the blue differential value as it is.
For example, assuming that the brightness value of the background color is 200, the brightness value of the red color component is 190, the brightness value of the green color component is 180, and the brightness value of the blue color component is 140, the differential values with respect to the background color are 10 for the red color component, 20 for the green color component, and 60 for the blue color component. In such case, the blue differential value of 60 is greater than both the red differential value of 10 and the green differential value of 20. Accordingly, the blue differential value is set to 40, which is obtained by multiplying the greater one of the red differential value and the green differential value, which is 20, by a constant such as “2”. The brightness value of the blue color component is thus 160, which is obtained by subtracting the newly set blue differential value of 40 from the background brightness value of 200. In this example, the constant may be any arbitrary number. In order to sufficiently reduce the level of the yellow color component caused by a light emitted from the projection device 114a, the constant may preferably be set to an arbitrary number that ranges between 1 and 2.
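The worked example above can be expressed per pixel as follows; the function and variable names are illustrative, and the constant of 2 matches the example.

```python
# Per-pixel expression of the yellow-component filtering worked example above.
def suppress_yellow(r, g, b, background=200, constant=2.0):
    """Clamp the blue differential value so that a yellowish cast caused by the
    projector light does not remain in the differential image."""
    red_diff = background - r
    green_diff = background - g
    blue_diff = background - b
    limit = max(red_diff, green_diff)           # greater of red/green differentials
    if blue_diff > limit:
        blue_diff = limit * constant            # e.g. 20 * 2 = 40
    return r, g, background - blue_diff         # blue becomes 200 - 40 = 160

# Example values from the text: R = 190, G = 180, B = 140
print(suppress_yellow(190, 180, 140))           # (190, 180, 160.0)
```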
Once the differential image is generated, the differential image generator 216 stores the projection image that the image supplier 214 just transmitted to the projection device 114a in the memory such as the HDD through the data buffer 220, as the previous projected image. The differential image generator 216 further notifies the image transmit 218 that generation of the differential image is completed.
The image transmit 218 sends the differential image generated by the differential image generator 216 to the remotely located site, such as the drawing image sharing apparatus 118b at the site 110b, through the network 120. When the image transmit 218 receives notification from the differential image generator 216 indicating generation of the differential image, the image transmit 218 sends the differential image, which is the drawing image of the site 110a that is not reflected in the previous projected image, to the drawing image sharing apparatus 118b at the site 110b through the network 120.
As described above, the drawing image sharing apparatus 118a generates the differential image, which reflects the drawing image of the site 110a, and transmits the differential image to the drawing image sharing apparatus 118b that is remotely located.
In the case of the system described in Japanese Patent Application Publication No. 2005-203886, the drawing image of the site 110a and the drawing image of the site 110b are combined before projection, such that the resultant combined image may suffer from echo due to the drawing images being superimposed one above the other. In contrast, according to the above-described example of the present invention, since the drawing image of one site is transmitted to the other site for projection without being combined with the drawing image of the other site, the resultant projected image is not affected by the echo phenomenon, thus improving visibility of the projection image.
Referring now to
At S301, the image receive 210 determines whether the drawing image is received from the remotely located site through the network 120. When it is determined that the drawing image is not received (“NO” at S301), the operation repeats S301. When it is determined that the drawing image is received (“YES” at S301), the image receive 210 notifies the shared image obtainer 212 that the drawing image is received, and the operation proceeds to S302.
At S302, the shared image obtainer 212 causes the image capturing device 116a to capture an image of the object 112a to obtain the captured image, and notifies the image supplier 214 that the captured image is obtained.
At S303, the shared image obtainer 212 extracts a shared area from the captured image to obtain an image to be shared (“shared image”), and applies image correction such as perspective transformation to the shared image. When processing of the captured image is completed, the shared image obtainer 212 notifies the image supplier 214 that processing of the captured image is completed.
At S304, the image supplier 214 sends the drawing image of the remotely located site, which is received at the image receive 210, to the projection device 114a to cause the projection device 114a to project the drawing image onto the object 112a.
At S305, the differential image generator 216 generates a differential image, which reflects the difference between the previous projection image previously projected and stored in the memory, and the captured image.
At S306, the differential image generator 216 stores the projection image that is sent by the image supplier 214 to the projection device 114a at S304, in the memory through the data buffer 220, as the previous projection image to be used for processing the shared image for the next time. The differential image generator 216 further sends an instruction to the image transmit 218 to cause the image transmit 218 to send the differential image generated at S305.
At S307, the image transmit 218 sends the differential image to the drawing image sharing apparatus 118b at the remotely located site according to the instruction received from the differential image generator 216, and the operation returns to S301 to determine whether another drawing image is received from the remotely located site.
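For illustration, the S301 to S307 flow may be sketched as a single loop as follows; the injected callables (receive_remote, capture, and so on) are hypothetical stand-ins for the image receive 210, the shared image obtainer 212, the image supplier 214, the differential image generator 216, and the image transmit 218, and are not an actual interface of the apparatus.

```python
# High-level sketch of the S301-S307 flow with hypothetical injected callables.
def share_drawing_loop(receive_remote, capture, extract_shared,
                       project, generate_diff, send_diff):
    previous_projection = None                    # previous projection image
    while True:
        remote_image = receive_remote()           # S301: wait for a remote drawing image
        captured = capture()                      # S302: capture the object 112a
        shared = extract_shared(captured)         # S303: crop shared area, correct perspective
        project(remote_image)                     # S304: project the remote drawing image
        # S305: the capture at S302 happened before S304, so the captured image
        # still shows the previous projection plus any new local drawings
        diff = generate_diff(previous_projection, shared)
        previous_projection = remote_image        # S306: store as previous projection image
        send_diff(diff)                           # S307: transmit local drawing to remote site
```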
As described above, in this example, the drawing data sharing system of
At S401, the differential image generator 216 obtains the previous projection image from the memory such as the HDD through the data buffer 220, and applies image processing to the previous projection image to dilate the boundaries of the drawing image in the previous projection image.
At S402, the differential image generator 216 obtains the captured image, from the RAM through the data buffer 220. The differential image generator 216 segments the previous projection image that is obtained at S401 and the captured image into color components of R, G, and B, respectively.
At S403, the differential image generator 216 selects one of the color components of R, G, and B for processing.
At S404, the differential image generator 216 generates an integral image of the captured image for the selected one of the R, G, and B color components.
At S405, the differential image generator 216 selects a target pixel, respectively, in the previous projection image and the captured image, for processing.
At S406, the differential image generator 216 performs calculation of an average ratio value of the target pixel in the previous projection image and the captured image for the selected one of R, G, and B color components as described below referring to
At S407, the differential image generator 216 calculates a differential value of the target pixel, as described below referring to
At S408, the differential image generator 216 determines whether all pixels in the previous projection image and the captured image have been processed. When it is determined that all pixels in the previous projection image and the captured image are processed (“YES” at S408), the operation proceeds to S409. When it is determined that all pixels in the previous projection image and the captured image are not processed (“NO” at S408), the operation returns to S405 to select another target pixel for processing.
At S409, the differential image generator 216 determines whether all color components have been processed. When it is determined that all color components have been processed (“YES” at S409), the operation proceeds to S410. When it is determined that all color components are not processed (“NO” at S409), the operation returns to S403 to select another color component for processing.
At S410, the differential image generator 216 generates differential images for R, G, and B color components, by combining the pixels each having the brightness value, or the differential value, calculated at S407. The differential image generator 216 then combines the images of R, G, and B color components to generate a differential image.
At S411, the differential image generator 216 removes yellow color components from the differential image, for example, by applying filtering processing to the differential image, and the operation ends.
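As an illustrative condensation of S401 to S411, the following Python sketch folds the per-pixel steps into vectorized operations; a box filter stands in for the integral-image averaging of S404, the thresholding and intensity correction of S701 to S705 (described below) are folded into one line, the yellow-component filtering of S411 is omitted for brevity, and the window size, threshold, and constant are illustrative assumptions.

```python
# Condensed, vectorized sketch of S401-S410 (average differential variant),
# assuming OpenCV and NumPy.
import cv2
import numpy as np

def generate_differential_image(previous_projection, captured, background=200,
                                threshold=-5.0, constant=1.5, window=15):
    prev = cv2.erode(previous_projection, np.ones((3, 3), np.uint8))  # S401: widen strokes
    out_channels = []
    for prev_c, cap_c in zip(cv2.split(prev), cv2.split(captured)):   # S402-S409 per component
        cap_f = cap_c.astype(np.float32)
        prev_f = prev_c.astype(np.float32)
        cap_val = cap_f - cv2.blur(cap_f, (window, window))           # S406: average differential
        prev_val = prev_f - cv2.blur(prev_f, (window, window))
        diff = cap_val - prev_val                                     # S407: per-pixel difference
        diff = np.where(diff > threshold, 0.0, diff * constant)       # noise cut / intensity boost
        out_channels.append(np.clip(background + diff, 0, 255).astype(np.uint8))
    return cv2.merge(out_channels)                                    # S410: combine components
```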
The operation of
At S501, the differential image generator 216 specifies a rectangular area having the m×n size that surrounds the target pixel (x,y).
At S502, the differential image generator 216 calculates an average value AVE of the brightness values of pixels included in the rectangular area specified at S501, using the integral image of the captured image for the selected one of the color components obtained at S404.
At S503, the differential image generator 216 calculates an average ratio value of the target pixel (x,y) using the brightness value of the background color of the captured image, the brightness value of the target pixel (x,y), and the average value AVE of the brightness values of pixels included in the rectangular area obtained at S502, and the operation ends.
At S601, the differential image generator 216 specifies a rectangular area having the m×n size that surrounds the target pixel (x,y).
At S602, the differential image generator 216 calculates an average value AVE of the brightness values of pixels included in the rectangular area specified at S601, using the integral image of the captured image for the selected one of color components obtained at S404.
At S603, the differential image generator 216 subtracts the average value AVE of the brightness values of pixels included in the rectangular area, from the brightness value of the target pixel (x,y), to calculate the average differential value of the target pixel (x,y), and the operation ends.
At S701, the differential image generator 216 calculates a differential value of the target pixel, which is the difference between a value of the target pixel in the captured image and a value of the target pixel in the previous projection image.
At S702, the differential image generator 216 determines whether the differential value of the target pixel, which is calculated at S701, is greater than a threshold. When it is determined that the differential value of the target pixel is greater than the threshold (“YES” at S702), the operation proceeds to S703. When it is determined that the differential value of the target pixel is equal to or less than the threshold (“NO” at S702), the operation proceeds to S704. The threshold is any arbitrary value, which is previously determined. In this example, the threshold is set to −5.
At S703, the differential image generator 216 sets the differential value of the target pixel to 0.
At S704, the differential image generator 216 applies correction processing to the target pixel to improve the color intensity of the target pixel, by multiplying the differential value of the target pixel by a constant. In this example, the constant is any number that ranges between 1 and 2. This suppresses the negative influences on the color intensities of the captured image due to capturing, thus improving the visibility of the differential image.
As described above, the brightness value of the target pixel in the differential image is corrected. When the captured image is brighter in pixel values than the projection image, the differential image generator 216 suppresses the negative influences due to capturing, such as reflection light or the influence of a lighting device. When the differential value between the captured image and the projection image is relatively small, it is assumed that the differential value is caused by noise. In such case, the negative influences of noise are eliminated from the differential image.
At S705, the differential image generator 216 adds the differential value of the target pixel to the brightness value of the background color, and the operation ends.
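The S701 to S705 processing of a single target pixel may be sketched as follows, with the example threshold of −5 and an illustrative correction constant; the function name is an assumption.

```python
# Sketch of the S701-S705 processing of a single target pixel.
def pixel_differential(captured_value, previous_value,
                       background=200, threshold=-5, constant=1.5):
    """Differential value of one target pixel between the captured image and
    the previous projection image, mapped back onto the background brightness."""
    diff = captured_value - previous_value        # S701
    if diff > threshold:                          # S702: brighter, or too small a change
        diff = 0                                  # S703: treat as reflection light or noise
    else:
        diff = diff * constant                    # S704: boost washed-out color intensity
    return max(0, min(255, background + diff))    # S705: add to background brightness

# Example: a stroke captured at 150 where the previous projection value was 200
print(pixel_differential(150, 200))               # 200 + (-50 * 1.5) = 125
```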
Further, in this example, as illustrated in
In another example, the data buffer 220 of the drawing image sharing apparatus 118a may function as the shared data buffer 240 that stores the drawing image of the site 110a and the drawing image of the site 110b.
In another example, the shared data buffer 240 may be provided in the drawing image sharing apparatus 118a such that the shared data buffer 240 stores the drawing image received from the drawing image sharing apparatus 118b in a specific memory address of the shared data buffer 240. In such case, the image receive 210 reads the drawing image received from the remotely located site from the shared data buffer 240.
Referring to
When it is determined that the predetermined time period has elapsed (“YES” at S801), the operation proceeds to S802.
At S802, the drawing image sharing apparatus 118a reads out, from the storage device through the shared data buffer 240, the drawing image that is received from the remotely located site and stored at S811. The drawing image being read may be stored in the local memory, such as the RAM.
The drawing image sharing apparatus 118a performs S803 and S804 in a substantially similar manner as described above referring to S302 and S303 of
At S805, the image supplier 214 of the drawing image sharing apparatus 118a causes the projection device 114a to project the most updated drawing image of the remotely located site, which is read out from the storage device through the shared data buffer 240 at S802, onto the object 112a. The drawing image sharing apparatus 118a obtains the time at which the projection image is projected, for example, using the timer function of the OS, and stores the obtained time in the memory such as the RAM.
At S806, the differential image generator 216 generates a differential image, which reflects the difference between the previous projection image previously projected and stored in the memory, and the captured image.
At S807, the differential image generator 216 stores the projection image that is sent by the image supplier 214 to the projection device 114a at S805, in the memory through the data buffer 220, as the previous projection image to be used for processing the shared image for the next time. The differential image generator 216 further sends an instruction to the image transmit 218 to cause the image transmit 218 to send the differential image generated at S806.
At S808, the image transmit 218 sends the differential image to the shared data buffer 240 for storage in the storage device, and the operation returns to S801 to determine whether the predetermined time period has elapsed.
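For illustration, the timer-driven S801 to S808 flow may be sketched as follows; the shared data buffer is modeled as an object with hypothetical get_latest and put methods, the other callables are stand-ins as in the earlier loop sketch, and the interval value and waiting logic are simplifying assumptions.

```python
# Sketch of the timer-driven S801-S808 flow with hypothetical stand-ins.
import time

def timer_driven_loop(shared_buffer, capture, extract_shared,
                      project, generate_diff, interval=1.0):
    previous_projection = None
    last_projection_time = 0.0
    while True:
        wait = interval - (time.time() - last_projection_time)   # S801: predetermined period
        if wait > 0:
            time.sleep(wait)
        remote_image = shared_buffer.get_latest()   # S802: newest remote drawing image
        captured = capture()                        # S803: capture the object
        shared = extract_shared(captured)           # S804: crop shared area, correct perspective
        project(remote_image)                       # S805: project the remote drawing image
        last_projection_time = time.time()          # remember the projection time
        diff = generate_diff(previous_projection, shared)   # S806: differential image
        previous_projection = remote_image          # S807: store as previous projection image
        shared_buffer.put(diff)                     # S808: publish the local drawing image
```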
The drawing image sharing apparatus 118b of the system 100 performs operation of
The operation of S810 to S811 of
More specifically, the server may additionally include an image receive that is similar in function to the image receive 210. At S810, the image receive of the server determines whether the drawing image is received from any one of the drawing image sharing apparatus 118. When it is determined that the drawing image is not received (“NO” at S810), the operation repeats S810. When it is determined that the drawing image is received (“YES” at S810), the operation proceeds to S811. At S811, the image receive of the server stores the drawing image in the storage device through the shared data buffer 240, and operation returns to S810.
The operation of
In such case, the operation of
As described above, operation of processing the drawing image of
Referring to
When it is determined that the predetermined time period has elapsed at S901 (“YES” at S901), the operation proceeds to S902.
At S902, the drawing image sharing apparatus 118a reads out the previous projection image, which is stored in the memory such as the HDD through the data buffer 220 at the time when the projection image is projected.
The drawing image sharing apparatus 118a performs S903 to S904, in a substantially similar manner as described above referring to S302 to S303 of
At S905, the drawing image sharing apparatus 118a generates a differential image, which reflects the difference between the projection image obtained at S902 and the captured image obtained at S904.
At S906, the image transmit 218 of the drawing image sharing apparatus 118a sends the differential image generated at S905 to the storage device through the shared data buffer 240, and the operation returns to S901 to determine whether the predetermined time period has elapsed.
The drawing image sharing apparatus 118b of the system 100 performs operation of
The operation of S908 to S910 of
At S908, the image receive 210 of the drawing image sharing apparatus 118a determines whether the drawing image is received from the remotely located site through the shared data buffer 240. When it is determined that the drawing image is not received (“NO” at S908), the operation repeats S908. When it is determined that the drawing image is received (“YES” at S908), the image receive 210 notifies the image supplier 214 that the drawing image is received, and the operation proceeds to S909.
At S909, the image supplier 214 sends the drawing image of the remotely located site to the projection device 114a as the projection image to be projected onto the object 112a. The drawing image sharing apparatus 118a further obtains a current time, for example, using the timer function of the OS to obtain the time at which the projection image is projected, and stores the obtained time in the memory such as the RAM.
At S910, the drawing image sharing apparatus 118a stores the projection image in the storage device through the shared data buffer 240, and the operation returns to S908.
The operation of
In such case, at S906, the differential image may be sent to the remotely located site, such as the drawing image sharing apparatus 118b, for storage in the memory of the drawing image sharing apparatus 118b through the shared data buffer 240.
As described above, operation of capturing the image to be shared and operation of projecting the image to be shared are performed independently from each other. Since the drawing image sharing apparatus 118a does not have to wait for the drawing image to be received from the remotely located site before sending the drawing image of the site 110a, the drawing image sharing apparatus 118a is able to transmit the drawing image of the site 110a to the remotely located site more efficiently. This increases a transmit frame rate, thus improving the real-time capability of the drawing data sharing system 100.
Numerous additional modifications and variations are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure of the present invention may be practiced otherwise than as specifically described herein.
With some embodiments of the present invention having thus been described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the present invention, and all such modifications are intended to be included within the scope of the present invention.
For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of this disclosure and appended claims.
Further, as described above, any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored in any kind of storage medium. Examples of storage media include, but are not limited to, flexible disks, hard disks, optical discs, magneto-optical discs, magnetic tapes, nonvolatile memory cards, ROM (read-only memory), etc.
Alternatively, any one of the above-described and other methods of the present invention may be implemented by an ASIC, prepared by interconnecting an appropriate network of conventional component circuits, or by a combination thereof with one or more conventional general purpose microprocessors and/or signal processors programmed accordingly.
In one example, the present invention may reside in a drawing image sharing apparatus including: means for obtaining a drawing image that reflects an image drawn onto a shared area of a drawing object, the drawing image to be shared with a remotely located site; means for displaying a drawing image received from the remotely located site onto the drawing object; a data buffer to store the drawing image received from the remotely located site; means for generating a differential image reflecting the difference between the drawing image stored by the data buffer and the drawing image obtained by the means for obtaining; and means for transmitting the differential image to the remotely located site. The means for obtaining obtains the drawing image drawn onto the shared area before the means for displaying displays the drawing image received from the remotely located site onto the drawing object.
In the above-described drawing image sharing apparatus, in one example, the data buffer stores the drawing image received from the remotely located site after the means for generating generates the differential image. The means for generating generates the differential image based on the drawing image stored by the data buffer and the drawing image obtained by the means for obtaining.
In another example, the present invention may reside in a drawing image sharing apparatus including: means for obtaining a drawing image being displayed onto a shared area to be shared with a remotely located site; means for displaying a drawing image received from the remotely located site onto the drawing object; a data buffer to store the drawing image received from the remotely located site; means for generating a differential image indicating the difference between the drawing image stored in the data buffer and the drawing image obtained by the means for obtaining; and means for transmitting the differential image to the remotely located site. The means for obtaining obtains the drawing image being displayed onto the shared area of the drawing object after a predetermined time period elapses since a time at which the means for displaying displays the drawing image received from the remotely located site.
In another example, the present invention may reside in a drawing image sharing apparatus including: means for supplying a drawing image received from a remotely located site to a display device to cause the display device to display the drawing image; means for obtaining a drawing image being displayed onto a shared area of the drawing object after a predetermined period has elapsed since a time at which the means for supplying causes the display device to display the drawing image; a data buffer to store the drawing image received from the remotely located site; means for generating a differential image indicating the difference between the drawing image stored in the data buffer and the drawing image obtained by the means for obtaining; and means for transmitting the differential image to the remotely located site, wherein the means for generating, and the means for transmitting are concurrently performed with the means for obtaining.
In any one of the above-described examples, the means for generating calculates, for each of color components of red, green, and blue, an average differential value of a target pixel of the drawing image obtained by the means for obtaining, and an average differential value of the target pixel of the drawing image stored in the data buffer, to obtain a differential value of the target pixel based on the difference in average differential value between the drawing image obtained by the means for obtaining and the drawing image stored in the data buffer. The means for generating further combines a pixel having the differential value for each of color components of red, green, and blue to generate the differential image. The average differential value of the target pixel is a brightness value calculated by subtracting an average value of brightness values of pixels surrounding the target pixel from a brightness value of the target pixel.
In any one of the above-described examples, the means for generating calculates, for each of color components of red, green, and blue, an average ratio value of a target pixel of the drawing image obtained by the means for obtaining, and a differential value of a brightness value of the target pixel of the drawing image stored in the data buffer with respect to the average ratio value of the target pixel of the drawing image obtained by the means for obtaining; and combines a pixel having the differential value for each of color components of red, green, and blue to generate the differential image. The average ratio value of the target pixel is a brightness value calculated by multiplying a ratio of the brightness value of the target pixel with respect to the average value of the brightness values of pixels surrounding the target pixel of the drawing image obtained by the means for obtaining, with a brightness value of a background color of the drawing image obtained by the means for obtaining.
In any one of the above-described examples, the means for obtaining causes an image capturing device to capture the drawing image being displayed onto the shared area of the drawing object.
In any one of the above-described examples, the means for supplying supplies the drawing image received from the remotely located site to a projection device to cause the projection device to project the drawing image onto the drawing object for display.
In any one of the above-described examples, the means for supplying causes the projection device to project a marker image onto the drawing object, the marker image being used to specify the shared area of the drawing object. The means for obtaining causes the image capturing device to capture an image being displayed onto the drawing object including the marker image being projected by the projection device, and specifies the shared area of the drawing object based on coordinate values of markers included in the marker image included in the captured image.
In the drawing image sharing apparatus of any one of the above-described examples, the means for supplying causes the projection device to project a white color image onto the drawing object. The means for obtaining causes the image capturing device to capture an image being displayed onto the drawing object including the white color image being projected by the projection device, and specifies the shared area of the drawing object based on the difference in brightness value between the white color image included in the captured image and an area other than the white color image of the captured image.
In the drawing image sharing apparatus of any one of the above-described examples, the means for obtaining applies perspective transformation to the captured image.
In the drawing image sharing apparatus of any one of the above-described examples, the means for generating applies expansion processing to the projection image, and generates the differential image based on the expanded projection image.
In the drawing image sharing apparatus of any one of the above-described examples, the means for generating applies filtering processing to the differential image to reduce a yellow color component of the differential image.
In the drawing image sharing apparatus of any one of the above-described examples, the means for generating applies correction processing to the differential image to improve the color intensity of the differential image.
In one example, the present invention may reside in a data processing method performed by a drawing image sharing apparatus. The method includes: obtaining a drawing image being displayed onto a shared area of a drawing object as an image to be shared with a remotely located site; displaying a drawing image received from the remotely located site onto the drawing object through a display device; generating a differential image indicating the difference between the drawing image received from the remotely located site and the drawing image being displayed onto the shared area of the drawing object; storing the drawing image received from the remotely located site in a data buffer; and transmitting the differential image to the remotely located site.
In one example, the present invention may reside in a data processing method performed by a drawing image sharing apparatus. The method includes: obtaining a drawing image being displayed onto a shared area of a drawing object as an image to be shared with a remotely located site; displaying a drawing image received from the remotely located site onto the drawing object through a display device; generating a differential image indicating the difference between the drawing image received from the remotely located site and the drawing image being displayed onto the shared area of the drawing object; storing the drawing image received from the remotely located site in a data buffer; and transmitting the differential image to the remotely located site, wherein the step of obtaining the drawing image being displayed onto the shared area of the drawing object is performed after a predetermined time period elapses since the step of displaying the drawing image received from the remotely located site is performed.
In one example, the present invention may reside in a data processing method performed by a drawing image sharing apparatus. The method includes: displaying a drawing image received from a remotely located site onto a drawing object; obtaining a drawing image being displayed onto a shared area of the drawing object as an image to be shared with the remotely located site after a predetermined time period elapses after the drawing image received from the remotely located site is displayed onto the drawing object; generating a differential image indicating the difference between the drawing image received from the remotely located site and the drawing image being displayed onto the shared area of the drawing object; and transmitting the differential image to the remotely located site. The step of obtaining, the step of generating, the step of transmitting, and the step of displaying are performed concurrently.
As described above, the drawing image sharing apparatus 118 of
In this manner, the drawing image sharing apparatus 118 of
In one example, the present invention may reside in: a drawing image sharing apparatus including: an image receive to receive a drawing image that reflects drawings drawn onto a second drawing object located at a remotely located site that is remote from a site at which the first drawing object is located; an image obtainer to cause an image capturing device to obtain an image being displayed onto a first drawing object as a captured image; an image supplier to cause a projection device to project the drawing image of the second drawing object onto the first drawing object as a projection image; a differential image generator to obtain a previous projection image that is previously projected onto the first drawing object from a memory, and to generate a differential image that reflects the difference between the previous projection image and the captured image obtained by the image obtainer; a storage to store the projection image being projected onto the first drawing object in the memory as a previous projected image to be used for next processing; and an image transmit to transmit the differential image to the remotely located site for projection onto the second drawing object.