The present application claims priority under 35 U.S.C. §119 to Japanese Patent Application No. 2015-209561, filed Oct. 26, 2015, the contents of which are incorporated herein by reference in their entirety.
1. Field of the Invention
The present invention relates generally to an information processing apparatus, an image projection system, and a computer program product.
2. Description of the Related Art
Multi-image projection is a conventional technique for providing a large screen by splitting a content image into split images and projecting the split images from a plurality of projectors.
To make the screen image into which the images projected by the plurality of projectors are joined appear free from warping, the split images typically must be corrected in advance. Therefore, in multi-image projection, it is typically required to project a calibration pattern from each of the plurality of projectors and to analyze an image obtained by capturing the projected calibration patterns, thereby calculating calibration parameters for use in the correction.
Accordingly, conventional multi-image projection techniques disadvantageously require image capture to be performed at least twice: once to detect the positions of the projectors and once to calculate the calibration parameters.
According to one aspect of the present invention, an information processing apparatus includes a captured-image receiver, a calibration-image extractor, an identification-information extractor, a projection-position locator, a calibration-parameter calculator, and an image corrector. The captured-image receiver is configured to receive a captured image that includes projected images of calibration images projected by a plurality of image projection apparatuses. Each calibration image includes a calibration pattern in which identification information of a corresponding image projection apparatus is embedded. The calibration-image extractor is configured to extract the calibration images from the captured image. The identification-information extractor is configured to extract the identification information from the extracted calibration images. The projection-position locator is configured to identify the image projection apparatuses having projected the calibration images and locate projection positions of the image projection apparatuses based on the identification information extracted from the calibration images and positions of the calibration images in the captured image. The calibration-parameter calculator is configured to calculate calibration parameters of the respective image projection apparatuses having projected the calibration images, based on the extracted calibration images. The image corrector is configured to split a content image into a plurality of split images and correct the split images based on the number of the identified image projection apparatuses, the projection positions of the image projection apparatuses, and the calibration parameters of the image projection apparatuses.
The accompanying drawings are intended to depict exemplary embodiments of the present invention and should not be interpreted to limit the scope thereof. Identical or similar reference numerals designate identical or similar components throughout the various drawings.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In describing preferred embodiments illustrated in the drawings, specific terminology may be employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected, and it is to be understood that each specific element includes all technical equivalents that have the same function, operate in a similar manner, and achieve a similar result.
Exemplary embodiments of the present invention are described below. It should be understood that the embodiments described below are not intended to limit the scope of the present invention. Elements common between the drawings referred to in the description may retain the same numerical designation, and repeated description of the elements is omitted as appropriate.
An object of an embodiment is to provide an information processing apparatus, an image projection system, and a computer program product capable of detecting positions of projectors and calculating calibration parameters of the projectors by performing image capture a single time.
The projector 10 is an image projection apparatus that projects an image onto a projection surface, such as a screen. To perform multi-image projection, it is typically necessary for the sender, from which the images are transmitted to the projectors, to be aware of the locations of the projectors so that the split content images are allocated to the projectors correctly. In the example described below, the image projection system 1000 includes three projectors 10 (projectors 10a, 10b, and 10c) connected to an information processing apparatus 100 via a network 30.
The information processing apparatus 100 transmits an image to each of the projectors 10a, 10b, and 10c and causes the image to be projected by that projector. The information processing apparatus 100 can be, for example, a personal computer (PC). While a PC is used as an example in the description below, the information processing apparatus 100 is not limited to a PC.
In a preferred embodiment, the image projection system 1000 further includes a camera 20. The camera 20 is an image capture device for capturing the projected images projected by the plurality of projectors 10. The camera 20 is a digital camera including a CCD (charge-coupled device) image sensor or a CMOS (complementary metal-oxide-semiconductor) image sensor as an imaging device. While a standalone digital camera is described as an example, any image capture device capable of collectively capturing the projected images in a single image may be used.
The system configuration of the image projection system 1000 of the embodiment has been described above. A functional configuration of the projectors 10 and the PC 100 is described below with reference to a functional block diagram.
The projector 10 includes a calibration-image generation unit 12, an image receiving unit 14, and an image projection unit 16.
The calibration-image generation unit 12 is a functional unit for generating a calibration image by embedding identification information about the apparatus (the projector 10), to which the calibration-image generation unit 12 belongs, in a calibration pattern (which will be described later).
The image receiving unit 14 is a functional unit for receiving an image from the PC 100 via the network 30.
The image projection unit 16 is a functional unit for controlling image projection. The image projection unit 16 projects the calibration image generated by the calibration-image generation unit 12 onto a projection surface and also projects the image received by the image receiving unit 14 from the PC 100 onto the projection surface.
The PC 100 includes a captured-image input unit (captured-image receiver) 102, a content-image input unit (content-image receiver) 103, a calibration-image extraction unit 104, an identification-information extraction unit 105, a projection-position locating unit 106, a calibration-parameter calculation unit 107, an image correction unit 108, and an image transmission unit 109.
The captured-image input unit 102 is a functional unit for receiving a captured image, in which projected images of the calibration images projected simultaneously by the plurality of projectors 10 are collectively captured.
The content-image input unit 103 is a functional unit for receiving a source content image to be projected onto the projection surface to form a large-screen image.
The calibration-image extraction unit 104 is a functional unit for extracting the calibration images from the captured image.
The identification-information extraction unit 105 is a functional unit for extracting the identification information about the projectors 10 from the extracted calibration images.
The projection-position locating unit 106 is a functional unit for identifying the projectors 10 that have projected the calibration images and locating their projection positions.
The calibration-parameter calculation unit 107 is a functional unit for calculating calibration parameters of the respective projectors 10 from the plurality of extracted calibration images.
The image correction unit 108 is a functional unit for splitting the content image into a plurality of split images and correcting each of the split images based on the corresponding calculated calibration parameter.
The image transmission unit 109 is a functional unit for transmitting each of the corrected split images to a corresponding one of the projectors 10.
The functional configuration of the projector 10 and the PC 100 has been described above. Processing executed first by each of the projectors 10 to perform multi-image projection using the image projection system 1000 is described below with reference to a flowchart.
The projector 10 of the embodiment performs the processing described below.
At step S101, the calibration-image generation unit 12 reads out a calibration pattern from a predetermined storage area 18.
The calibration pattern is, for example, a pattern in which dots are arranged in a matrix, from which warping of a projected image can be measured.
Thereafter, the calibration-image generation unit 12 reads identification information about the projector 10 from the predetermined storage area 18 (step S102), and generates a calibration image by embedding the identification information in the calibration pattern read out at step S101 (step S103).
In the present embodiment, the identification information is the device ID of the projector 10 (for example, “PJ001”), and the calibration-image generation unit 12 embeds it in the calibration pattern as a digital watermark.
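By way of illustration, the following Python sketch generates a dot-matrix calibration pattern and embeds a device ID in its least-significant-bit plane. The dot-matrix layout, image dimensions, and the LSB watermarking scheme are assumptions for illustration only; the embodiment does not prescribe a particular watermarking algorithm, and a camera-robust watermark would be needed in practice.

```python
# Minimal sketch of calibration-image generation (steps S101-S103).
# Assumptions: a white dot matrix on black, and the device ID embedded
# as a least-significant-bit (LSB) watermark purely for illustration.
import numpy as np

def make_dot_pattern(width=640, height=480, step=40, radius=5):
    """Render a dot-matrix calibration pattern as an 8-bit grayscale image."""
    img = np.zeros((height, width), dtype=np.uint8)
    ys, xs = np.mgrid[0:height, 0:width]
    for cy in range(step // 2, height, step):
        for cx in range(step // 2, width, step):
            mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2
            img[mask] = 255
    return img

def embed_id(pattern, device_id):
    """Embed the device ID (e.g. 'PJ001') into the pattern's LSB plane."""
    bits = np.unpackbits(np.frombuffer(device_id.encode("ascii"), dtype=np.uint8))
    out = pattern.copy().reshape(-1)
    out[: bits.size] = (out[: bits.size] & 0xFE) | bits  # overwrite LSBs
    return out.reshape(pattern.shape)

calibration_image = embed_id(make_dot_pattern(), "PJ001")
```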
Lastly, the image projection unit 16 projects the calibration image generated at step S103 onto a projection surface (step S104). Then, processing ends.
In the embodiment, at this point, a user performs image capture of the projection surface S using the camera 20 such that the three projected images 80 (80a, 80b, and 80c) are collectively captured in a single image. Thereafter, the captured image obtained by the camera 20 is provided to the PC 100 by an appropriate method. The captured-image input unit 102 of the PC 100 receives an input of the captured image provided by the camera 20 and stores the captured image in a storage area 110.
Processing executed by the PC 100 when image capture with the camera 20 is completed is described below with reference to a flowchart.
At step S201, the calibration-image extraction unit 104 reads out the captured image (the captured image in which the three projected images 80a, 80b, and 80c are collectively captured) from the storage area 110.
At the next step, S202, the calibration-image extraction unit 104 performs image analysis using a predetermined algorithm, thereby extracting, from the captured image, image areas 90 (90a, 90b, and 90c) in which the projected images 80 of the calibration images 70 are captured. In the description below, the extracted image areas 90 are referred to as the calibration images 90.
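As one hypothetical realization of the “predetermined algorithm” at step S202, the following sketch uses OpenCV thresholding and contour analysis to isolate the three bright projected regions; the specific functions and threshold choices are assumptions, not taken from the embodiment.

```python
# Hypothetical sketch of step S202: locating the projected calibration
# images in the captured photograph via Otsu thresholding and contours.
import cv2
import numpy as np

def extract_calibration_areas(captured_bgr, expected=3):
    gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep the largest bright regions: one per projected calibration image.
    contours = sorted(contours, key=cv2.contourArea, reverse=True)[:expected]
    areas = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        areas.append(((x, y, w, h), captured_bgr[y:y + h, x:x + w]))
    # Order left to right by the x coordinate of each bounding box.
    return sorted(areas, key=lambda a: a[0][0])
```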
At the next step, S203, the identification-information extraction unit 105 extracts the identification information about the projectors 10, embedded as a digital watermark, from each of the three calibration images (90a, 90b, and 90c) extracted at step S202. Specifically, the identification-information extraction unit 105 extracts the device ID “PJ001” of the projector 10a from the calibration image 90a, the device ID “PJ002” of the projector 10b from the calibration image 90b, and the device ID “PJ003” of the projector 10c from the calibration image 90c.
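Continuing the illustrative LSB scheme from the earlier sketch, the identification information can be recovered by reading back the least-significant bits; again, this stands in for whatever digital-watermark decoder the embodiment actually pairs with its encoder.

```python
# Counterpart of the embedding sketch above (step S203): recovering the
# device ID from an extracted calibration image. Valid only for the
# illustrative LSB scheme; a real system needs a camera-robust watermark.
import numpy as np

def extract_id(calibration_image, id_length=5):
    bits = calibration_image.reshape(-1)[: id_length * 8] & 1
    return np.packbits(bits).tobytes().decode("ascii")

# extract_id(calibration_image) -> 'PJ001'
```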
At the next step, S204, the projection-position locating unit 106 identifies, for each of the calibration images 90, the projector 10 that projected it, and locates the projection position of that projector. Specifically, the projection-position locating unit 106 identifies the projector 10a associated with the identification information “PJ001” extracted from the calibration image 90a as the projection source of the calibration image 90a, and locates the projection position of the projector 10a as “left”, which is the position of the calibration image 90a in the captured image. Similarly, the projection-position locating unit 106 identifies the projector 10b associated with the identification information “PJ002” as the projection source of the calibration image 90b and locates its projection position as “center”, and identifies the projector 10c associated with the identification information “PJ003” as the projection source of the calibration image 90c and locates its projection position as “right”.
The projection-position locating unit 106 places the above-described identifying-and-locating result in a device-information management table 500 stored in the storage area 110.
Before the process at step S204 described above is performed, the device-information management table 500 holds, for each of the projectors 10, the device ID of the projector in association with its IP address. At step S204, the projection-position locating unit 106 writes the located projection position (“left”, “center”, or “right”) of each projector into the table in association with the corresponding device ID.
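A minimal sketch of the device-information management table 500 and the position-locating step follows; the dictionary layout and the IP addresses are illustrative, while the fields (device ID, IP address, projection position, calibration parameter) mirror those described for fields 502 and 504.

```python
# Sketch of step S204 and an in-memory stand-in for table 500.
# The IP addresses and the dict layout are illustrative assumptions.
device_table = {
    "PJ001": {"ip": "192.168.0.11", "position": None, "calib": None},
    "PJ002": {"ip": "192.168.0.12", "position": None, "calib": None},
    "PJ003": {"ip": "192.168.0.13", "position": None, "calib": None},
}

def locate_positions(extracted):
    """extracted: list of (device_id, bounding_box) pairs already ordered
    left to right by the x coordinate of each calibration image."""
    labels = ["left", "center", "right"]
    for label, (device_id, _box) in zip(labels, extracted):
        device_table[device_id]["position"] = label
```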
At the next step, S205, the calibration-parameter calculation unit 107 calculates calibration parameters of the projectors 10, which are the projection sources of the respective calibration images 90, based on the calibration patterns of the plurality of calibration images 90 extracted at step S202. Specifically, the calibration-parameter calculation unit 107 calculates a calibration parameter of the projector 10a, which is the projection source of the calibration image 90a, based on the calibration pattern contained in the calibration image 90a, calculates a calibration parameter of the projector 10b, which is the projection source of the calibration image 90b, based on the calibration pattern contained in the calibration image 90b, and calculates a calibration parameter of the projector 10c, which is the projection source of the calibration image 90c, based on the calibration pattern contained in the calibration image 90c. Because methods for calculating a calibration parameter for dewarping a projected image from a dot matrix of a calibration pattern are known, description of such a method is omitted.
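As one such known method, the warp of each projected pattern can be modeled as a homography fitted between the ideal dot-grid coordinates and the dot centroids detected in the captured image; the sketch below assumes OpenCV's findHomography and is only one possible choice of calibration parameter.

```python
# Hedged sketch of step S205: fit a homography per projector as the
# calibration parameter, from detected dot centroids to the ideal grid.
import cv2
import numpy as np

def calc_calibration_parameter(ideal_pts, detected_pts):
    """Both arguments: N x 2 arrays of corresponding dot positions.
    Returns a 3x3 homography usable to dewarp the projector's output."""
    H, _mask = cv2.findHomography(np.asarray(detected_pts, dtype=np.float32),
                                  np.asarray(ideal_pts, dtype=np.float32),
                                  cv2.RANSAC)
    return H
```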
The calibration-parameter calculation unit 107 places the calculated calibration parameters of the projectors 10a, 10b, and 10c in the field 504 of the device-information management table 500 in association with the corresponding device IDs.
At the next step, S206, the image correction unit 108 reads out a content image, which is a source image to be projected onto the projection surface S to form a large-screen image, from the storage area 110. The content image to be projected may be fed from an auxiliary storage device of the PC 100 or via an interface for external inputs of the PC 100, received at the content-image input unit 103, and stored in the storage area 110.
At the next step, S207, the image correction unit 108 splits the content image read out at step S206 into the same number of split images as the number of identified projectors 10 and corrects each of the split images based on the number of the identified projectors 10, the projection positions of the respective projectors 10, and the calibration parameters of the respective projectors 10.
Specifically, first, the image correction unit 108 determines projection available areas A, B, and C of the projectors 10a, 10b, and 10c, which are the projection sources of the respective calibration images 90, and determines a projection available area X formed by joining the projection available areas A, B, and C together.
Next, the image correction unit 108 applies geometric correction to the content image so that the projection available area X contains the content image read out at step S206 in its maximum size with the aspect ratio of the content image maintained, and thereafter maps the corrected content image onto the projection available area X.
Next, the image correction unit 108 splits the mapped content image into a split image A′ corresponding to the projection available area A, a split image B′ corresponding to the projection available area B, and a split image C′ corresponding to the projection available area C, and allocates the split images A′, B′, and C′ to the projectors 10a, 10b, and 10c, respectively.
Next, the image correction unit 108 reads out the calibration parameters of the projectors 10, to each of which a corresponding one of the split images is allocated, from the field 504 in the device-information management table 500 and corrects the split images using the read-out calibration parameters. Specifically, the image correction unit 108 corrects the split image A′ using the calibration parameter of the projector 10a, corrects the split image B′ using the calibration parameter of the projector 10b, and corrects the split image C′ using the calibration parameter of the projector 10c.
At this time, the image correction unit 108 determines the areas where two adjacent projection available areas overlap, based on the projection available areas A, B, and C of the projectors 10a, 10b, and 10c, and corrects the brightness of the overlapping areas in the split images. This prevents the seams between the images from being visually noticeable, which can otherwise occur because the overlapping areas are unnaturally brighter than the other areas.
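The following sketch illustrates the splitting, dewarping, and overlap-brightness correction of steps S206 to S207 in simplified form: the content is cut into equal strips with a fixed overlap, each strip is warped by its projector's homography, and brightness is ramped linearly across the overlaps. The fixed overlap width, the linear ramp, and the equal-strip split are assumptions; the embodiment derives the split from the projection available areas A, B, and C.

```python
# Simplified sketch of steps S206-S207: split, blend overlaps, dewarp.
import cv2
import numpy as np

def split_correct_blend(content, homographies, out_size=(640, 480), overlap=64):
    """content: H x W x 3 uint8 image already mapped onto area X.
    homographies: 3x3 dewarping matrices, ordered left to right."""
    h, w = content.shape[:2]
    n = len(homographies)
    strip_w = (w + (n - 1) * overlap) // n       # strip width incl. overlap
    ramp = np.linspace(0.0, 1.0, overlap, dtype=np.float32)[:, None]
    corrected = []
    for i, H in enumerate(homographies):
        x0 = i * (strip_w - overlap)
        strip = content[:, x0:x0 + strip_w].astype(np.float32)
        if i > 0:                                # fade in over left overlap
            strip[:, :overlap] *= ramp
        if i < n - 1:                            # fade out over right overlap
            strip[:, -overlap:] *= ramp[::-1]
        # Dewarp the blended strip with this projector's calibration parameter.
        corrected.append(cv2.warpPerspective(strip.astype(np.uint8), H, out_size))
    return corrected
```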
At the next step, S208, the image transmission unit 109 transmits the corrected split images respectively to the corresponding projectors 10. Specifically, the image transmission unit 109 reads out the IP addresses of the projectors 10, to which the split images are allocated, from the field 502 in the device-information management table 500 and transmits the corrected split images to the read-out IP addresses.
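A minimal sketch of the transmission at step S208 follows, assuming each projector accepts a length-prefixed, PNG-encoded split image over a plain TCP socket on port 9000; the transport protocol, framing, and port are assumptions, as the embodiment specifies only delivery to the projectors' IP addresses.

```python
# Illustrative sketch of step S208: send each corrected split image to the
# IP address held in the device-information table. Framing is assumed.
import socket
import cv2

def send_split_images(device_table, images_by_id, port=9000):
    for device_id, img in images_by_id.items():
        ok, png = cv2.imencode(".png", img)
        assert ok
        ip = device_table[device_id]["ip"]
        with socket.create_connection((ip, port)) as s:
            s.sendall(len(png).to_bytes(8, "big"))  # 8-byte length prefix
            s.sendall(png.tobytes())                # PNG payload
```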
Each split image transmitted from the PC 100 is received by the image receiving unit 14 of the corresponding projector 10, and the image projection unit 16 projects the received split image onto the projection surface S.
As described above, according to the embodiment, the projection positions of the placed projectors can be located and the calibration parameters of the respective projectors can be calculated from a single image capture result, so the preparatory work for multi-image projection is simplified.
While the present invention has been described above with reference to the embodiments, it should be understood that the embodiments are not intended to limit the scope of the present invention, and various design modifications can be made.
For example, in the description given above, a calibration image is generated by embedding, as identification information about the projector 10, the device ID of the projector 10 in a calibration pattern. Alternatively, a calibration image may be generated by embedding the IP address of the projector 10 in the calibration pattern.
The calibration image is not necessarily generated by the projector 10. Alternatively, a calibration image in which the identification information is embedded in the calibration pattern in advance may be stored in the storage area 18 and read out as required.
A hardware configuration of the above-described projector 10 (image projection apparatus) and that of the PC 100 (information processing apparatus) are described below. Each of the projector 10 and the PC 100 includes a processor that executes program instructions stored in memory to implement the above-described functional units, together with dedicated circuitry as appropriate.
The above-described functions of the embodiments can be implemented by computer-executable program instructions written in C, C++, C#, or Java (registered trademark), for example. The program instructions of the embodiments may be distributed stored in a computer-readable recording medium, which may be provided as a computer program product, such as a CD-ROM, an MO, a DVD, a flexible disk, an EEPROM, or an EPROM. The instructions can also be transmitted over a network in a form usable by other apparatuses.
As described above, an aspect of the present invention provides a novel image projection system capable of detecting positions of projectors and calculating calibration parameters of the projectors by performing image capture a single time.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, at least one element of the different illustrative and exemplary embodiments herein may be combined with or substituted for another within the scope of this disclosure and the appended claims. Further, features of components of the embodiments, such as their number, position, and shape, are not limited to those of the embodiments and may be set as appropriate. It is therefore to be understood that, within the scope of the appended claims, the disclosure of the present invention may be practiced otherwise than as specifically described herein.
The method steps, processes, or operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance or clearly identified through the context. It is also to be understood that additional or alternative steps may be employed.
Further, any of the above-described apparatus, devices or units can be implemented as a hardware apparatus, such as a special-purpose circuit or device, or as a hardware/software combination, such as a processor executing a software program.
Further, as described above, any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored in any kind of storage medium. Examples of storage media include, but are not limited to, flexible disks, hard disks, optical discs, magneto-optical discs, magnetic tapes, nonvolatile memory, semiconductor memory, and read-only memory (ROM).
Alternatively, any one of the above-described and other methods of the present invention may be implemented by an application specific integrated circuit (ASIC), a digital signal processor (DSP) or a field programmable gate array (FPGA), prepared by interconnecting an appropriate network of conventional component circuits or by a combination thereof with one or more conventional general purpose microprocessors or signal processors programmed accordingly.
It should be noted that a person skilled in the field of information processing technology may implement the present invention using application-specific integrated circuits (ASICs) or an apparatus in which circuit modules are connected.
Further, each of the functions (units) may be implemented by one or more circuits.
It should be noted that, in this specification, a circuit may include a processor programmed by software to execute the corresponding functions as well as hardware designed to execute those functions, such as an ASIC or a circuit module.