This application is a U.S. National Phase of International Patent Application No. PCT/JP2019/002390 filed on Jan. 25, 2019, which claims priority benefit of Japanese Patent Application No. JP 2018-020696 filed in the Japan Patent Office on Feb. 8, 2018. Each of the above-referenced applications is hereby incorporated herein by reference in its entirety.
The present technology relates to an image processing device, an image processing method, a program, and a projection system, and in particular, to an image processing device, an image processing method, a program, and a projection system that facilitate the adjustment of the position and tilt of a camera.
An image projected on a screen with the use of a projector looks distorted depending on the shape of the screen and the orientation of the projector.
As a typical distortion, there is a phenomenon in which an image projected by a projector installed at an elevation or depression angle with respect to a screen is distorted to be a trapezoid. To make an image distorted to be a trapezoid look like a correct shape, keystone correction is necessary.
[PTL 1]
Japanese Patent Laid-open No. 2014-238601
In a case where images are projected with the use of a plurality of projectors, there are various installation conditions.
For example, it is necessary to match images in an overlapping region in which projection light beams from the plurality of projectors overlap each other, and match the brightness of the overlapping region with the brightness of the surroundings. Correction for matching images in an overlapping region is called warping (geometric correction), and correction for making brightness uniform is called blending (optical correction).
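As a minimal illustration of the blending side, the following sketch computes per-column weights for a horizontal overlap between two projection images; the resolution, the overlap width, and the linear ramp are assumptions of this sketch, not details taken from the embodiment.

```python
import numpy as np

def blend_weights(width: int, overlap: int) -> np.ndarray:
    """Per-column weights for the left projector when the two projection
    images overlap by `overlap` pixels at its right edge."""
    w = np.ones(width, dtype=np.float32)
    w[width - overlap:] = np.linspace(1.0, 0.0, overlap, dtype=np.float32)
    return w

left = blend_weights(3840, overlap=400)
right = left[::-1]  # the right projector ramps up where the left ramps down
# Inside the overlap the two weights sum to 1, keeping brightness uniform:
assert np.allclose(left[-400:] + right[:400], 1.0)
```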
Further, in a case where images are projected on a dome screen, since the screen is not flat, correct images may not be projected without geometric correction. As a method of preparing parameters for geometric correction, there is a method that includes installing a measurement camera at a predetermined position and analyzing a taken image.
The present technology has been made in view of such circumstances, and facilitates the adjustment of the position and tilt of a camera.
According to one aspect of the present technology, there is provided an image processing device including an estimation unit configured to estimate, on the basis of a taken image taken by a camera installed at a predetermined tilt, the tilt of the camera, the camera being configured to photograph a projection surface of a screen on which a pattern image that includes an image having a predetermined pattern has been projected from a projector, and a display controlling unit configured to perform control to display the taken image rotated depending on the tilt of the camera.
In one aspect of the present technology, on the basis of a taken image taken by the camera installed at a predetermined tilt, the tilt of the camera is estimated, the camera being configured to photograph the projection surface of the screen on which a pattern image that includes an image having a predetermined pattern has been projected from the projector, and the taken image rotated depending on the tilt of the camera is displayed.
According to the present technology, it is possible to easily adjust the position and tilt of the camera.
Note that, the effect described herein is not necessarily limited and may be any effect described in the present disclosure.
Now, a mode for embodying the present technology is described. The description is made in the following order.
1. Configuration of Multi-Projection System
2. Arrangement Example of Projector and Camera
3. Example of Angle of View of Camera
4. Example of Projection Image
5. Global Adjustment
6. Detailed Adjustment
7. Configuration of Image Processing Device
8. Operation of Image Processing Device
9. Modified Example
<Configuration of Multi-Projection System>
A multi-projection system 1 of
As illustrated in
Further, the multi-projection system 1 includes projectors 13L and 13R, a surround speaker 14, a woofer 15, cameras 16L and 16R, and an image processing device 21. The projectors 13L and 13R, the surround speaker 14, the woofer 15, and the cameras 16L and 16R are connected to the image processing device 21 via wired or wireless communication.
The projectors 13L and 13R are mounted on the left and right of the dome screen 11 with their projection units facing the dome screen 11. For example, the projectors 13L and 13R are fixed to the installation stand 12 with metallic members.
The cameras 16L and 16R are also mounted on the left and right of the dome screen 11 with their lenses facing the dome screen 11. For example, the cameras 16L and 16R are mounted on the edge of the dome screen 11 through predetermined members such that the positions and tilts of the cameras are adjustable. An administrator of the multi-projection system 1 can adjust the photography ranges by moving the cabinets of the cameras 16L and 16R with his/her hands. Instead of being mounted on the edge of the dome screen 11, the cameras 16L and 16R may be mounted on the installation stand 12 with the use of predetermined members.
As illustrated in
In
The projectors 13L and 13R project images assigned thereto to display a content image on the entire projection surface 11A, to thereby present the content image to the user. An image from each projector is generated on the basis of a content image such that one image can be watched with no distortion from the point of view of the user. When content is reproduced, images assigned to the respective projectors 13L and 13R are supplied from the image processing device 21.
As illustrated in
The camera 16L is used to photograph the projection surface 11A on which images have been projected from the projector 13L and the projector 13R.
On the basis of an image taken by the camera 16L, for example, a preview image is generated in the image processing device 21 to be displayed on a display connected to the image processing device 21. The preview image is an image that is displayed in the adjustment of the position and tilt of the camera 16L and represents the state of the projection surface 11A.
The camera 16R is used to photograph the projection surface 11A on which images have been projected from the projector 13L and the projector 13R.
On the basis of an image taken by the camera 16R, for example, a preview image is generated in the image processing device 21 to be displayed on the display connected to the image processing device 21. The preview image is an image that is displayed in the adjustment of the position and tilt of the camera 16R and represents the state of the projection surface 11A.
The user adjusts the positions and tilts of the cameras 16L and 16R while watching preview images at a predetermined timing such as when the multi-projection system 1 is installed.
An image taken by the camera 16L, the position and tilt of which have been adjusted, is used to generate parameters that are used in geometric correction for an image to be projected from the projector 13L, for example. Further, an image taken by the adjusted camera 16R is used to generate parameters that are used in geometric correction for an image to be projected from the projector 13R, for example.
Now, in a case where there is no need to distinguish between the projector 13L and the projector 13R, they are collectively referred to as a "projector 13" as appropriate. Further, in a case where there is no need to distinguish between the camera 16L and the camera 16R, they are collectively referred to as a "camera 16."
Returning to the description of
The image processing device 21 reproduces content and generates, on the basis of each frame of the moving image of the content, a projection image that is projected from the projector 13. The image processing device 21 outputs the projection image to the projector 13 and controls the projector 13 to project the projection image on the projection surface 11A.
Further, the image processing device 21 outputs sound data obtained by reproducing content to the surround speaker 14 and the woofer 15 and controls the surround speaker 14 and the woofer 15 to output the sound of the content.
The image processing device 21 is, for example, a PC. The image processing device 21 may include a plurality of PCs instead of one PC. Further, the image processing device 21 may not be provided near the dome screen 11 as illustrated in
Note that, the two projectors are provided in the example of
The user sitting on the chair in front of the dome screen 11 looks up a little as indicated by the broken line to watch images projected on the projection surface 11A with the viewpoint position being a position P1 in the vicinity of the center of the sphere of the projection surface 11A that is a sphere surface. The position of the innermost part of the projection surface 11A (dome zenith), which is indicated by the broken line arrow of
As illustrated in
For example, moving image content including entire celestial sphere images is provided. Other moving image content such as movies, television programs, and games, and still image content such as landscape photographs may be provided.
<Arrangement Example of Projector and Camera>
In such a way, in the multi-projection system 1, high-resolution and wide-viewing-angle images can be projected with the use of the plurality of projectors each of which includes the fisheye lens as the lens of the projection unit. The projector 13L and the projector 13R are, for example, projectors capable of projecting 4K images such as 3840×2160 pixel images.
In the example of
As described above, the dome screen 11 is mounted downward at a predetermined angle. In
In the case where the multi-projection system 1 is viewed from the left, as illustrated in
Specifically, as illustrated in
With the projector 13L mounted at the 90-degree vertical position, the horizontal direction of an image projected from the projector 13L corresponds to the vertical direction of
As indicated by the broken line circle, the projection unit of the projector 13L is provided on the cabinet front surface that is the other side of the back surface 13L-2. The bottom surface, left-side surface, and right-side surface of the cabinet face the dome screen 11, downward, and upward, respectively, in
In the case where the multi-projection system 1 is viewed from the left, similarly, the camera 16L has an upper surface 16L-1 facing the user and a back surface 16L-2 appearing on the drawing sheet. In the example of
The horizontal direction of an image taken by the camera 16L mounted with the upper surface facing the user similarly to the projector 13L corresponds to the vertical direction of
In the example of
In the case where the multi-projection system 1 is viewed from the right, as illustrated in
Specifically, as illustrated in
With the projector 13R mounted at the 90-degree vertical position, the horizontal direction of an image projected from the projector 13R corresponds to the vertical direction of
As indicated by the broken line circle, the projection unit of the projector 13R is provided on the cabinet front surface that is the other side of the back surface 13R-2. The bottom surface, left-side surface, and right-side surface of the cabinet face the dome screen 11, upward, and downward, respectively, in
In the case where the multi-projection system 1 is viewed from the right, similarly, the camera 16R has an upper surface 16R-1 facing the user and a back surface 16R-2 appearing on the drawing sheet. In the example of
The horizontal direction of an image taken by the camera 16R mounted with the upper surface facing the user similarly to the projector 13R corresponds to the vertical direction of
In such a way, the projector 13L and the projector 13R are vertically mounted by being rotated by 90°.
In general, with regard to the angle of view (resolution) of video display equipment such as a projector and a TV, the horizontal resolution is higher than the vertical resolution. The two projectors are installed on the left and right of the dome screen 11 such that the direction with a higher resolution, namely, the horizontal direction corresponds to the vertical direction of the dome screen 11, with the result that high resolution images can be projected on the entire projection surface 11A.
Further, the camera 16L and the camera 16R are mounted at positions at which the cameras do not disturb the user watching content and the cameras can permanently be installed.
As described above, the camera 16L and the camera 16R are used to photograph the projection surface 11A to measure the states of images projected on the projection surface 11A. From the viewpoint of the measurement accuracy, the camera 16L and the camera 16R are preferably installed as close as possible to the dome screen 11. It can be said that the positions of the camera 16L and the camera 16R are positions that satisfy such conditions.
As described with reference to
Further, as described with reference to
Similarly, with regard to the angles of view of the cameras 16L and 16R, the horizontal resolution is higher than the vertical resolution. With the camera 16L adjusted to the same tilt as the projector 13L and the camera 16R adjusted to the same tilt as the projector 13R, a wide range including images projected from the projectors 13L and 13R can be included in the angles of view to be photographed.
<Example of Angle of View of Camera>
Here, the angle of view in a case where a certain landscape is photographed is described, but practically, a range including the projection surface 11A is photographed by the camera 16L. The range is similarly photographed in
As illustrated in the upper part of
In
Meanwhile, as illustrated in the lower part of
The user adjusts the position and tilt of the camera 16L while watching a preview image, but in a case where a preview image in the orientation illustrated in
As illustrated in the upper part of
In
Meanwhile, as illustrated in the lower part of
The user adjusts the position and tilt of the camera 16R while watching a preview image, but in a case where a preview image in the orientation illustrated in
To match the orientations of preview images with an orientation in the real space, in the image processing device 21, the processing of rotating taken images depending on the tilts of the cameras 16L and 16R is performed.
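A minimal sketch of this rotation processing follows; snapping the estimated tilt to the nearest multiple of 90 degrees is an assumption that fits the roughly 90-degree mounting described above, and the function name is illustrative.

```python
import numpy as np

def rotate_for_preview(taken: np.ndarray, camera_tilt_deg: float) -> np.ndarray:
    """Rotate a taken image so that the preview orientation matches the
    orientation in real space."""
    k = int(round(camera_tilt_deg / 90.0)) % 4  # number of 90-degree turns
    return np.rot90(taken, k=k)

# A camera mounted at about -90 degrees yields k = 3:
preview = rotate_for_preview(np.zeros((1080, 1920, 3), np.uint8), -90.0)
print(preview.shape)  # -> (1920, 1080, 3)
```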
The image illustrated in the left part of
The display range of the preview image in the right part of
The image illustrated in the left part of
The display range of the preview image in the right part of
In such a way, in the adjustment of the camera 16L or 16R, as a preview image, an image with which the movement direction of the camera 16L or 16R matches the switching direction of a display range is displayed.
In a case where an image taken by the horizontally installed camera is displayed as a preview image as it is, the movement direction of the camera 16L or 16R does not match the switching direction of the display range of the preview image, resulting in troublesome adjustment. The present disclosure can prevent such troublesome adjustment.
In the following, adjustment that the administrator of the multi-projection system 1 makes to match the orientation of the camera 16 with the orientation of the projector 13 is referred to as “global adjustment.” Processing that the image processing device 21 performs to rotate an image depending on the tilt of the camera 16L or 16R to display the image as a preview image is referred to as “global adjustment processing.”
The adjustment of the cameras 16L and 16R includes global adjustment and detailed adjustment described later.
<Example of Projection Image>
The circle illustrated in
The projection range of the projector 13L is a range indicated by the diagonal lines of
Of the image projected from the projector 13L, a region projected outside the projection surface 11A is a black region (black light is projected outside the projection surface 11A).
Meanwhile, the projection range of the projector 13R is a range indicated by the diagonal lines of
Of the image projected from the projector 13R, a region projected outside the projection surface 11A is also a black region.
To such regions on the projection surface 11A, the images are projected from the projector 13L and the projector 13R. A range in which the range indicated by the diagonal lines of
The processing of matching the image from the projector 13L with the image from the projector 13R to prevent an image in the overlapping region from being blurred (to prevent a drop in resolution) is geometric correction.
Further, as illustrated in the vicinity of the center of
<Global Adjustment>
As a method of displaying a preview image to be used in global adjustment that is adjustment for matching the orientation of the camera 16 with the orientation of the projector 13, for example, the following methods are given.
(1) A method in which a user manually rotates a preview image by 90°.
(2) A method in which a preview image is rotated depending on a tilt detected by a sensor built in a camera, such as an IMU.
(3) A method in which a preview image is rotated on the basis of a taken image obtained by a camera taking a pattern image projected from a projector.
Global adjustment of the above-mentioned item (3) using pattern images is described.
The image illustrated in the left part of
The pattern image L is an image in which a region other than the lower left corner and the lower right corner is displayed in gray with constant luminance. A mark MY including yellow pixels is put at the lower left corner of the pattern image L, and a mark MB including blue pixels is put at the lower right corner. The mark MY and the mark MB each include a predetermined number of pixels.
An image representing the pattern with the mark MY put at the lower left corner and the mark MB put at the lower right corner is the pattern image L. The pattern image L is subjected to geometric correction (for example, geometric correction using default parameters) to be projected from the projector 13L.
Meanwhile, the image illustrated in the right part of
The pattern image R is an image in which a region other than the lower left corner and the lower right corner is displayed in gray with constant luminance. A mark MG having green pixels is put at the lower left corner of the pattern image R, and a mark MR having red pixels is put at the lower right corner. The mark MG and the mark MR each include a predetermined number of pixels.
An image representing the pattern with the mark MG put at the lower left corner and the mark MR put at the lower right corner is the pattern image R. The pattern image R is subjected to geometric correction to be projected from the projector 13R.
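The pattern images L and R can be pictured with the following sketch; the gray level, mark size, resolution, and BGR color values are assumptions for illustration, not values from the embodiment.

```python
import numpy as np

GRAY, MARK = 128, 16  # assumed background level and mark size in pixels

def make_pattern(lower_left_bgr, lower_right_bgr, h=2160, w=3840):
    """Constant-luminance gray image with colored corner marks (BGR)."""
    img = np.full((h, w, 3), GRAY, dtype=np.uint8)
    img[h - MARK:, :MARK] = lower_left_bgr       # lower-left mark
    img[h - MARK:, w - MARK:] = lower_right_bgr  # lower-right mark
    return img

pattern_l = make_pattern((0, 255, 255), (255, 0, 0))  # MY yellow, MB blue
pattern_r = make_pattern((0, 255, 0), (0, 0, 255))    # MG green, MR red
```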
The image illustrated in the left part of
In the taken image L, the distorted pattern image L and the distorted pattern image R projected on the projection surface 11A appear. A region A1 appearing with a higher luminance than the surroundings is an overlapping region.
A boundary line L11 appearing as a shallow curve in the lower part of the taken image L corresponds to the lower side connecting the point C to the point D of the pattern image L. A boundary line L12 appearing as a shallow curve in the left part of the taken image L corresponds to part of the left side connecting the point A to the point D of the pattern image L. A boundary line L13 appearing as a shallow curve in the right part of the taken image L corresponds to part of the right side connecting the point B to the point C of the pattern image L.
The mark MY appears at the joining point between the boundary line L11 and the boundary line L12. Further, the mark MB appears at the joining point between the boundary line L11 and the boundary line L13.
An arc boundary line L21 appearing in the vicinity of the center of the taken image L corresponds to the lower side connecting the point CC to the point DD of the pattern image R. A boundary line L22 appearing as a shallow curve in the upper left of the taken image L corresponds to part of the right side connecting the point BB to the point CC of the pattern image R. A parabolic boundary line L23 appearing in the upper right of the taken image L corresponds to part of the left side connecting the point AA to the point DD of the pattern image R.
The mark MR appears at the joining point between the boundary line L21 and the boundary line L22. Further, the mark MG appears at the joining point between the boundary line L21 and the boundary line L23.
Meanwhile, the image illustrated in the right part of
In the taken image R, the distorted pattern image L and the distorted pattern image R projected on the projection surface 11A appear. A region A2 appearing with a higher luminance than the surroundings is the overlapping region.
A substantially straight boundary line L31 appearing in the lower part of the taken image R corresponds to the lower side connecting the point CC to the point DD of the pattern image R. A boundary line L32 appearing as a shallow curve in the left part of the taken image R corresponds to part of the left side connecting the point AA to the point DD of the pattern image R. A boundary line L33 appearing as a shallow curve in the right part of the taken image R corresponds to part of the right side connecting the point BB to the point CC of the pattern image R.
The mark MG appears at the joining point between the boundary line L31 and the boundary line L32. Further, the mark MR appears at the joining point between the boundary line L31 and the boundary line L33.
An arc boundary line L41 appearing in the vicinity of the center of the taken image R corresponds to the lower side connecting the point C to the point D of the pattern image L. A parabolic boundary line L42 appearing in the upper left of the taken image R corresponds to part of the right side connecting the point B to the point C of the pattern image L. A boundary line L43 appearing as a shallow curve in the upper right of the taken image R corresponds to part of the left side connecting the point A to the point D of the pattern image L.
The mark MB appears at the joining point between the boundary line L41 and the boundary line L42. Further, the mark MY appears at the joining point between the boundary line L41 and the boundary line L43.
In the image processing device 21, the marks MY, MB, MG, and MR are detected by analyzing the taken image L. The tilt of the camera 16L is estimated on the basis of the respective positions of the marks MY, MB, MG, and MR in the taken image L, and the taken image L is rotated depending on the estimated tilt to be displayed as a preview image.
Further, the marks MG, MR, MY, and MB are detected by analyzing the taken image R. The tilt of the camera 16R is estimated on the basis of the respective positions of the marks MG, MR, MY, and MB in the taken image R, and the taken image R is rotated depending on the estimated tilt to be displayed as a preview image.
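A minimal sketch of such mark detection and tilt estimation, assuming the marks are found by simple color thresholding and the tilt is taken from the angle of the line joining the lower-left and lower-right marks; the reference colors and tolerance are assumptions.

```python
import cv2
import numpy as np

# Rough BGR reference colors for the marks; real thresholds need tuning.
MARK_COLORS = {"MY": (0, 255, 255), "MB": (255, 0, 0),
               "MG": (0, 255, 0), "MR": (0, 0, 255)}

def find_mark(taken_bgr, color, tol=60):
    """Centroid (x, y) of pixels near `color`, or None if the mark is absent
    (e.g., projected outside the screen by an upside-down projector)."""
    lo = np.clip(np.array(color, np.int16) - tol, 0, 255).astype(np.uint8)
    hi = np.clip(np.array(color, np.int16) + tol, 0, 255).astype(np.uint8)
    mask = cv2.inRange(taken_bgr, lo, hi)
    if cv2.countNonZero(mask) == 0:
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def estimate_tilt(taken_bgr, left="MY", right="MB"):
    """Tilt of the camera, taken as the angle of the line joining the
    lower-left and lower-right marks (horizontal in an upright view)."""
    p_l = find_mark(taken_bgr, MARK_COLORS[left])
    p_r = find_mark(taken_bgr, MARK_COLORS[right])
    if p_l is None or p_r is None:
        return None
    return float(np.degrees(np.arctan2(p_r[1] - p_l[1], p_r[0] - p_l[0])))
```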
As illustrated in the left part of
Further, as illustrated in the right part of
Each preview image is displayed at a predetermined position in the adjustment screen displayed on the display connected to the image processing device 21.
The administrator of the multi-projection system 1 matches the orientation of the camera 16 with the orientation of the projector 13 while watching the preview images illustrated in
In a case where the state of the projection surface 11A on which the pattern image L has been projected from the projector 13L and the pattern image R has been projected from the projector 13R is photographed from the front of the projection surface 11A, an image as illustrated in the left part of
For example, an arc boundary line L61 appearing on the left of the center of the projection surface 11A corresponds to the boundary line L11 appearing on the taken image L and the boundary line L41 appearing on the taken image R of
The right illustration of
Note that, the vicinities of the upper left corner and upper right corner of the pattern image L and the vicinities of the upper left corner and upper right corner of the pattern image R are projected outside the projection surface 11A as described with reference to
For example, in a case where the pattern images are projected from the projectors 13L and 13R mounted upside down, as illustrated in
When it is detected that the marks do not appear in the images taken by the camera 16L and the camera 16R, this means that the projectors 13L and 13R are mounted upside down.
In the case where the marks do not appear in an image taken by the camera 16, information indicating that the projector 13 is mounted upside down may be displayed to be notified to the administrator.
Through such global adjustment, the orientation of the camera 16 is adjusted to be matched with the orientation of the projector 13 such that the upper surfaces of both the components face the user.
<Detailed Adjustment>
To prevent an image in an overlapping region from being blurred, geometric correction that matches an image from the projector 13L with an image from the projector 13R is necessary.
To perform geometric correction with high accuracy, it is necessary that the positions and tilts of the cameras 16L and 16R be adjusted to positions and tilts at which the cameras can photograph the entire overlapping region.
Further, to detect the edge portion 11B, thereby reducing light emitted outside the dome screen 11 (light projected outside the projection surface 11A), it is necessary that the positions and tilts of the cameras 16L and 16R be adjusted to positions and tilts at which the cameras can photograph the entire edge portion 11B of the dome screen 11.
As described above, in the image processing device 21, of an image projected from the projector 13, a region projected outside the projection surface 11A is set as a black region. With the accurate detection of the entire edge portion 11B and the correct setting of a region projected outside the projection surface 11A, light emitted outside the dome screen 11 can be reduced.
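Setting the region projected outside the projection surface 11A as a black region can be sketched as follows, assuming the detected edge portion 11B has already been transformed into a polygon in projector pixel coordinates (an assumed representation).

```python
import cv2
import numpy as np

def mask_outside_screen(projection_img, edge_polygon):
    """Blacken the part of a projection image that would land outside the
    projection surface 11A. `edge_polygon` is the detected screen edge as
    an (N, 2) array of projector pixel coordinates."""
    mask = np.zeros(projection_img.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [edge_polygon.astype(np.int32)], 255)
    return cv2.bitwise_and(projection_img, projection_img, mask=mask)
```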
In the example of
Further, when, as illustrated in the right part of
When the edge portion 11B corresponding to the arc connecting the position p1 to the position p2 of
Whether or not an overlapping region appears and whether or not the edge portion 11B of the dome screen 11 entirely appears are criteria of determination on whether or not the cameras 16L and 16R have been adjusted to appropriate positions and tilts.
Adjustment that the administrator of the multi-projection system 1 makes to adjust the positions and tilts of the cameras 16L and 16R while watching preview images in such a way is referred to as “detailed adjustment.” Processing that the image processing device 21 performs to analyze images taken by the cameras 16L and 16R to determine whether or not the cameras have been adjusted to appropriate positions and tilts is referred to as “detailed adjustment processing.”
In a case where the cameras have not been adjusted to appropriate positions and tilts, for example, information for guiding movement directions of the cameras 16L and 16R is displayed by being superimposed on the preview images. The administrator of the multi-projection system 1 can adjust the respective positions and tilts of the cameras 16L and 16R by following the displayed information to move the cameras.
In the preview image of the camera 16L illustrated in the left part of
In such a case, information for guiding a movement direction of the camera 16L as illustrated in the left part of
The administrator of the multi-projection system 1 can adjust the camera 16L to an appropriate position and tilt by following such a guide to move the camera 16L to the right.
Further, in the preview image of the camera 16R illustrated in the right part of
In such a case, information for guiding a movement direction of the camera 16R as illustrated in the right part of
The movement directions of the cameras 16L and 16R are detected on the basis of, for example, the positions of the marks of the pattern images.
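One way such a movement direction could be derived from a mark position is sketched below; the margin and the direction labels are illustrative assumptions, and the actual criteria in the text are the overlapping region and the whole edge portion 11B.

```python
def guide_direction(mark_xy, frame_size, margin=0.05):
    """Illustrative guidance rule: if a detected mark sits too close to a
    border of the taken image, suggest moving the camera so that the mark
    moves toward the center of the frame."""
    x, y = mark_xy
    w, h = frame_size
    if x < margin * w:
        return "move the camera to the left"
    if x > (1.0 - margin) * w:
        return "move the camera to the right"
    if y < margin * h:
        return "move the camera up"
    return "ok" if y <= (1.0 - margin) * h else "move the camera down"
```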
In a case where it is determined that the cameras 16L and 16R have been adjusted to appropriate positions and tilts, as illustrated in
Since guide information is displayed, even in a case where the administrator of the multi-projection system 1 is not used to the adjustment, the administrator can adjust the cameras 16L and 16R to appropriate positions and tilts. Even a person who does not know the above-mentioned conditions serving as the adjustment criteria can adjust the cameras 16L and 16R, and hence, the operability of the multi-projection system 1 can be enhanced.
Since the movement direction of the camera 16 matches the switching direction of the display range of a preview image, the administrator of the multi-projection system 1 can intuitively adjust the positions and tilts of the cameras 16L and 16R.
<Configuration of Image Processing Device>
A CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, and a RAM (Random Access Memory) 103 are connected to each other by a bus 104.
To the bus 104, an input/output expansion bus 105 is also connected. To the input/output expansion bus 105, a GPU (Graphics Processing Unit) 106, a display I/F 108, a UI (User Interface) I/F 109, a communication I/F 112, and a recording I/F 113 are connected.
The GPU 106 renders, using a VRAM 107, projection images that are projected from the projectors 13L and 13R. For example, the GPU 106 generates projection images that are projected from the respective projectors 13L and 13R and outputs the projection images to the display I/F 108.
The display I/F 108 is a projection image output interface. The display I/F 108 is configured as an interface conforming to a predetermined standard, for example, HDMI (registered trademark) (High-Definition Multimedia Interface). The display I/F 108 outputs, to the projector 13L and the projector 13R, projection images supplied from the GPU 106, and controls the projector 13L and the projector 13R to project the projection images.
To the display I/F 108, the display such as an LCD or an organic EL display is also connected. The display I/F 108 controls the display to display the adjustment screen including preview images.
The UI I/F 109 is an operation detecting interface. The UI I/F 109 detects operation made using a keyboard 110 or a mouse 111 and outputs information indicating the operation content to the CPU 101. Operation is made using the keyboard 110 or the mouse 111 by, for example, the administrator of the multi-projection system 1.
The communication I/F 112 is an interface for communication with external devices. The communication I/F 112 is configured by a network interface such as a wireless LAN or a wired LAN. The communication I/F 112 communicates with external devices via a network such as the Internet, to thereby transmit or receive various kinds of data. Content that is reproduced in the multi-projection system 1 may be provided from a server via a network.
The communication I/F 112 appropriately transmits data regarding the sound of content to the surround speaker 14 and the woofer 15 and receives data regarding images taken by the cameras 16L and 16R and then transmitted from the cameras 16L and 16R, for example. In a case where a sensor or the like configured to detect the motion of the user is provided to the chair, the communication I/F 112 also receives sensor data transmitted from the sensor.
The recording I/F 113 is a recording medium interface. On the recording I/F 113, recording media such as an HDD 114 and a removable medium 115 are mounted. The recording I/F 113 reads out data recorded on the mounted recording media and writes data to the recording media. On the HDD 114, in addition to content, various kinds of data such as pattern image data and programs that the CPU 101 executes are recorded.
As illustrated in
A taken image taken by the camera 16 is supplied to the pattern detecting unit 153 and the display controlling unit 155 in global adjustment, and is supplied to the pattern detecting unit 153, the screen edge detecting unit 154, and the display controlling unit 155 in detailed adjustment. In a taken image that is supplied to each unit, a pattern image projected on the projection surface 11A appears.
The pattern image generating unit 151 generates pattern images in global adjustment. The pattern image generating unit 151 generates pattern images also in detailed adjustment. The pattern images generated by the pattern image generating unit 151 are supplied to the projection controlling unit 152.
In global adjustment, the projection controlling unit 152 performs, using default parameters, for example, geometric correction on pattern images generated by the pattern image generating unit 151, to thereby generate a projection image for the projector 13L and a projection image for the projector 13R. Parameters that are used in geometric correction serve as information associating pixels of a pattern image with pixels on the projection surface 11A.
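Geometric correction with such pixel-association parameters can be sketched as a per-pixel remap; the identity maps standing in for the default parameters are an assumption of this sketch.

```python
import cv2
import numpy as np

def apply_geometric_correction(pattern, map_x, map_y):
    """Warp a pattern image into a projection image using per-pixel maps:
    map_x/map_y give, for each projector pixel, the source coordinates in
    the pattern image (the pixel association held by the parameters)."""
    return cv2.remap(pattern, map_x, map_y, interpolation=cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT, borderValue=0)

# Identity maps stand in for the 'default parameters' in this sketch:
h, w = 2160, 3840
map_x, map_y = np.meshgrid(np.arange(w, dtype=np.float32),
                           np.arange(h, dtype=np.float32))
```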
The projection controlling unit 152 controls the display I/F 108 to output each projection image to the projector 13 and controls the projector 13 to project the projection image. Pattern images are projected also in detailed adjustment.
The pattern detecting unit 153 analyzes, in global adjustment, a taken image to detect the marks put at the lower left corner and lower right corner of the pattern image. The pattern detecting unit 153 estimates the tilt of the camera 16 on the basis of the position of each mark on the taken image, and outputs information indicating the estimated tilt to the display controlling unit 155. The pattern detecting unit 153 functions as an estimation unit configured to estimate the tilt of the camera 16 on the basis of the position of each mark on a taken image.
Further, the pattern detecting unit 153 analyzes, in detailed adjustment, a taken image to detect the marks put at the lower left corner and lower right corner of the pattern image. The pattern detecting unit 153 outputs information indicating the position of each mark on the taken image to the screen edge detecting unit 154.
The screen edge detecting unit 154 performs, in detailed adjustment, edge detection on the basis of the luminance of each pixel of a taken image to detect an overlapping region. In edge detection, the positions of the marks detected by the pattern detecting unit 153 serve as starting points, for example.
Further, the screen edge detecting unit 154 performs, in detailed adjustment, edge detection on the basis of the luminance of each pixel to detect the edge portion 11B of the dome screen 11. A region on the outer side of the edge portion 11B appears as a dark region in a taken image as described above. For example, a position at which the luminance suddenly drops is detected as the position of the edge portion 11B. In a case where an overlapping region has been detected, the edge portion 11B is detected with a starting point being the boundary line of the overlapping region.
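A minimal sketch of this luminance-drop detection along one scan line; the threshold value is an assumption.

```python
import numpy as np

def find_screen_edge(gray_row, drop=50):
    """Index of the first sudden luminance drop along one scan line, taken
    as the screen edge; scanning many lines outward from the overlapping
    region yields the edge portion 11B as a set of points."""
    diffs = gray_row[:-1].astype(np.int16) - gray_row[1:].astype(np.int16)
    hits = np.nonzero(diffs > drop)[0]
    return int(hits[0]) + 1 if hits.size else None

row = np.array([120, 118, 121, 119, 30, 8, 5], dtype=np.uint8)
print(find_screen_edge(row))  # -> 4, where luminance falls off the screen
```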
In a case where an overlapping region and the entire edge portion 11B appear in a taken image, the screen edge detecting unit 154 outputs information indicating the fact to the display controlling unit 155. Further, in a case where either the overlapping region or the entire edge portion 11B does not appear in a taken image, the screen edge detecting unit 154 outputs information indicating a movement direction of the camera 16 to the display controlling unit 155.
The display controlling unit 155 controls display on the adjustment screen that is displayed by a display 161. The display controlling unit 155 rotates, in global adjustment, a taken image supplied from the camera 16 depending on the tilt of the camera 16 estimated by the pattern detecting unit 153, and performs control to display the rotated taken image on the adjustment screen as a preview image.
Further, similarly to global adjustment, the display controlling unit 155 rotates, in detailed adjustment, a taken image supplied from the camera 16 and controls the display 161 to display the rotated taken image as a preview image.
In detailed adjustment, in a case where the screen edge detecting unit 154 has detected that an overlapping region and the entire edge portion 11B appear in a taken image, the display controlling unit 155 performs control to display information indicating that the camera 16 is at an appropriate position and tilt on the adjustment screen. Further, in detailed adjustment, in a case where the screen edge detecting unit 154 has detected that either the overlapping region or the entire edge portion 11B does not appear in a taken image, the display controlling unit 155 performs control to display information for guiding a movement direction of the camera 16 on the adjustment screen on the basis of the information supplied from the screen edge detecting unit 154.
<Operation of Image Processing Device>
Here, with reference to the flowchart of
In Step S1, the information processing unit 131 performs the global adjustment processing. The administrator of the multi-projection system 1 matches the tilt of the camera 16 with the tilt of the projector 13 while watching a preview image displayed as a result of the global adjustment processing. The details of the global adjustment processing are described later with reference to the flowchart of
In Step S2, the information processing unit 131 performs the detailed adjustment processing. The administrator of the multi-projection system 1 adjusts the camera 16 to an appropriate position and tilt while watching a preview image displayed as a result of the detailed adjustment processing. The details of the detailed adjustment processing are described later with reference to the flowchart of
Next, with reference to the flowchart of
In Step S11, the pattern image generating unit 151 generates pattern images.
In Step S12, the projection controlling unit 152 generates, on the basis of the pattern images generated by the pattern image generating unit 151, a projection image for the projector 13L and a projection image for the projector 13R and controls the projector 13L and the projector 13R to project the respective projection images.
In Step S13, the camera 16 photographs the projection surface 11A on which the pattern images have been projected from the projector 13L and the projector 13R.
In Step S14, the pattern detecting unit 153 analyzes the taken image to detect the marks put at the lower left corner and lower right corner of the pattern image, to thereby estimate the tilt of the camera 16.
In Step S15, the display controlling unit 155 rotates the taken image depending on the tilt of the camera 16 estimated by the pattern detecting unit 153 and performs control to display the rotated taken image on the adjustment screen as a preview image. The taken image is appropriately rotated as needed.
After the preview image has been displayed, the processing returns to Step S1 of
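The global adjustment processing of Steps S11 to S15 can be summarized as the following glue-code sketch, with each unit modeled as a callable; only the step order comes from the description above, and the 90-degree snapping is an assumption.

```python
from typing import Callable, Tuple
import numpy as np

def global_adjustment_step(
    generate_patterns: Callable[[], Tuple[np.ndarray, np.ndarray]],
    project: Callable[[Tuple[np.ndarray, np.ndarray]], None],
    photograph: Callable[[], np.ndarray],
    estimate_tilt: Callable[[np.ndarray], float],
    show_preview: Callable[[np.ndarray], None],
) -> None:
    patterns = generate_patterns()      # S11: generate pattern images
    project(patterns)                   # S12: project from 13L and 13R
    taken = photograph()                # S13: photograph surface 11A
    tilt = estimate_tilt(taken)         # S14: marks -> camera tilt
    k = int(round(tilt / 90.0)) % 4     # snap to a 90-degree step
    show_preview(np.rot90(taken, k=k))  # S15: rotate and display preview
```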
Next, with reference to the flowchart of
In Step S21, the pattern image generating unit 151 generates pattern images.
In Step S22, the projection controlling unit 152 generates, on the basis of the pattern images generated by the pattern image generating unit 151, a projection image for the projector 13L and a projection image for the projector 13R and controls the projector 13L and the projector 13R to project the respective projection images.
In Step S23, the camera 16 photographs the projection surface 11A on which the pattern images have been projected from the projector 13L and the projector 13R.
In Step S24, the pattern detecting unit 153 analyzes the taken image to detect the marks put at the lower left corner and lower right corner of the pattern image.
In Step S25, the screen edge detecting unit 154 detects the edge of the taken image to detect an overlapping region and the edge of the screen, namely, the edge portion 11B of the dome screen 11.
Here, similarly to global adjustment, the processing of estimating the tilt of the camera 16 on the basis of the positions of the marks and appropriately rotating the taken image depending on the tilt of the camera 16 is performed. On the adjustment screen, the preview image is continuously displayed.
In Step S26, the screen edge detecting unit 154 determines whether or not the overlapping region and the edge portion 11B of the dome screen 11 have been successfully detected.
In a case where it is determined in Step S26 that at least one of the overlapping region and the edge portion 11B of the dome screen 11 has not been successfully detected, in Step S27, the display controlling unit 155 performs, on the basis of the information supplied from the screen edge detecting unit 154, control to display information for guiding a movement direction of the camera 16 on the adjustment screen. After that, the processing returns to Step S23, and the processing described above is repeated.
Meanwhile, in a case where it is determined in Step S26 that both the overlapping region and the edge portion 11B of the dome screen 11 have been successfully detected, the processing proceeds to Step S28.
In Step S28, the display controlling unit 155 performs control to display, on the adjustment screen, information indicating that the camera 16 is at an appropriate position and tilt. After that, the processing returns to Step S2 of
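Similarly, the detailed adjustment processing of Steps S23 to S28 can be sketched as a loop; the callables and messages are hypothetical glue code mirroring the flowchart only.

```python
from typing import Callable
import numpy as np

def detailed_adjustment_loop(
    photograph: Callable[[], np.ndarray],
    detect_marks: Callable[[np.ndarray], dict],
    detect_regions: Callable[[np.ndarray, dict], tuple],
    display: Callable[[str], None],
) -> None:
    # Pattern generation and projection (S21, S22) are assumed done once.
    while True:
        taken = photograph()                          # S23
        marks = detect_marks(taken)                   # S24
        overlap, edge = detect_regions(taken, marks)  # S25
        if overlap is not None and edge is not None:  # S26
            display("camera position and tilt OK")    # S28
            return
        display("guide: move the camera")             # S27, then back to S23
```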
After the adjustment of the position and tilt of the camera 16, a predetermined image is projected from the projector 13, and the state of the projection surface 11A on which the image has been projected from the projector 13 is photographed by the camera 16. On the basis of the image taken by the camera 16, parameters that are used in geometric correction in projecting content images are calculated.
Such parameter calculation is periodically performed with a predetermined period. Further, parameter calculation is performed at a predetermined timing such as when the position of the projector 13 is moved.
As described above, the plurality of cameras can permanently be installed at positions different from the user's viewing position, that is, positions at which the cameras do not disturb the user. Further, parameters for geometric correction can be calculated at any timing on the basis of images taken by the permanently installed cameras without adjusting the cameras each time.
Since taken images rotated depending on the tilts of the cameras are displayed as preview images, the administrator of the multi-projection system 1 can easily adjust the cameras.
Further, since information for guiding movement directions of the cameras is displayed, even a person who is not used to the adjustment can adjust the cameras. Even in a case where the cameras have been moved due to disturbance, it is not necessary for a person who has expertise in adjusting the cameras to come and adjust the cameras each time, and hence, the operability can be enhanced.
<Modified Example>
In the above description, images each having a pattern using marks in predetermined colors are used as pattern images, but images having patterns using predetermined shapes such as a circle, a square, a triangle, and a cross may be used.
The pattern image is projected from the projector 13, but the above-mentioned processing may be performed on the basis of an image obtained by photographing markers set on the projection surface 11A.
In detailed adjustment, the administrator manually adjusts the camera, but the camera 16 may be automatically adjusted to the appropriate position and tilt detected on the basis of a pattern image. In such a case, a drive unit configured to adjust the position and tilt of the camera 16L and a drive unit configured to adjust the position and tilt of the camera 16R are each provided.
The above-mentioned series of processing processes can be executed by hardware or software. In a case where the series of processing processes are executed by software, a program that configures the software is installed from a program storage medium on a computer of
For example, the program that is executed by the CPU 101 is provided by being recorded on the removable medium 115 or provided via a wired or wireless transmission medium, such as a local area network, the Internet, or digital broadcasting, and is installed on the HDD 114.
With regard to the program that the computer executes, the processing processes of the program may be performed in chronological order in the order described herein or in parallel. Alternatively, the processing processes of the program may be performed at an appropriate timing, for example, when the program is called.
Note that, herein, a “system” means an aggregation of a plurality of components (device, module (part), or the like), and it does not matter whether or not all the components are in the same cabinet. Thus, a plurality of devices that is accommodated in separate cabinets and connected to each other via a network, and one device including a plurality of modules accommodated in one cabinet are both “systems.”
The effects described herein are merely exemplary and are not limited, and other effects may be provided.
The embodiment of the present technology is not limited to the embodiment described above, and various modifications can be made without departing from the gist of the present technology.
For example, the present technology can employ the configuration of cloud computing in which a plurality of devices shares one function via a network to process the function in cooperation.
Further, each step described in the above-mentioned flowcharts can be executed by being shared in a plurality of devices as well as being executed by one device.
Moreover, in a case where a plurality of processing processes is included in one step, the plurality of processing processes included in the one step can be executed by being shared in a plurality of devices as well as being executed by one device.
The present technology can also take the following configurations.
(1)
An image processing device including:
an estimation unit configured to estimate, on the basis of a taken image taken by a camera installed at a predetermined tilt, the tilt of the camera, the camera being configured to photograph a projection surface of a screen on which a pattern image that includes an image having a predetermined pattern has been projected from a projector; and
a display controlling unit configured to perform control to display the taken image rotated depending on the tilt of the camera.
(2)
The image processing device according to Item (1), in which the display controlling unit rotates the taken image such that a movement direction in which the camera is moved in real space matches a switching direction of a display range of the taken image.
(3)
The image processing device according to Item (1) or (2), in which the screen includes a dome screen.
(4)
The image processing device according to Item (3), in which the camera is installed on each of left and right of the screen.
(5)
The image processing device according to Item (4), further including:
a generation unit configured to generate the pattern image having predetermined marks put at a lower left corner and a lower right corner of the pattern image; and
a projection controlling unit configured to control each of a plurality of the projectors to project the pattern image.
(6)
The image processing device according to Item (5), in which two of the projectors are installed on the left and right of the screen at predetermined tilts.
(7)
The image processing device according to Item (6), in which the two projectors are each installed such that a horizontal side of the pattern image is projected on the projection surface as an arc.
(8)
The image processing device according to any one of Items (5) to (7), in which light of the pattern image projected from the projector includes light of the marks that is emitted onto positions on the projection surface and light of an upper left corner and an upper right corner of the pattern image that is emitted onto positions outside the projection surface.
(9)
The image processing device according to any one of Items (6) to (8), further including:
a detection unit configured to detect, on the basis of each of the taken images taken by two of the cameras, an overlapping region included in the pattern image projected on the projection surface from each of the two projectors and an edge portion of the screen.
(10)
The image processing device according to Item (9), in which the display controlling unit performs, on the basis of positions of the overlapping region of the taken image and the edge portion, control to display information for guiding a movement direction of the camera together with the taken image.
(11)
The image processing device according to Item (9) or (10), in which the display controlling unit performs, in a case where the overlapping region and the edge portion appear in each of the taken images taken by the two cameras, control to display information indicating that the cameras are at appropriate positions and tilts.
(12)
An image processing method including controlling an image processing device to:
estimate, on the basis of a taken image taken by a camera installed at a predetermined tilt, the tilt of the camera, the camera being configured to photograph a projection surface of a screen on which a pattern image that includes an image having a predetermined pattern has been projected from a projector; and
perform control to display the taken image rotated depending on the tilt of the camera.
(13)
A program for causing a computer to execute the processing of:
estimating, on the basis of a taken image taken by a camera installed at a predetermined tilt, the tilt of the camera, the camera being configured to photograph a projection surface of a screen on which a pattern image that includes an image having a predetermined pattern has been projected from a projector; and
performing control to display the taken image rotated depending on the tilt of the camera.
(14)
A projection system including:
a dome screen;
a projector installed at a predetermined tilt and configured to project a pattern image that includes an image having a predetermined pattern on the screen;
a camera installed at a predetermined tilt together with the projector; and
an image processing device including
1 Multi-projection system, 11 Dome screen, 11A Projection surface, 13L, 13R Projector, 14 Surround speaker, 15 Woofer, 16L, 16R Camera, 21 Image processing device, 131 Information processing unit, 151 Pattern image generating unit, 152 Projection controlling unit, 153 Pattern detecting unit, 154 Screen edge detecting unit, 155 Display controlling unit
Number | Date | Country | Kind |
---|---|---|---|
JP2018-020696 | Feb 2018 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2019/002390 | 1/25/2019 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2019/155904 | 8/15/2019 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
6141034 | McCutchen | Oct 2000 | A |
6222593 | Higurashi | Apr 2001 | B1 |
6333826 | Charles | Dec 2001 | B1 |
20020024640 | Ioka | Feb 2002 | A1 |
20030142883 | Ishii | Jul 2003 | A1 |
20100001997 | Kajikawa | Jan 2010 | A1 |
20100073468 | Kutner | Mar 2010 | A1 |
20100141780 | Tan | Jun 2010 | A1 |
20110211175 | Stehle | Sep 2011 | A1 |
20110254916 | Fan | Oct 2011 | A1 |
20120194652 | Myokan | Aug 2012 | A1 |
20120320042 | Green | Dec 2012 | A1 |
20130070094 | Majumder | Mar 2013 | A1 |
20130314388 | Oda | Nov 2013 | A1 |
20140300687 | Gillard | Oct 2014 | A1 |
20190075269 | Nashida | Mar 2019 | A1 |
20190289285 | Nashida | Sep 2019 | A1 |
Number | Date | Country |
---|---|---|
2005-244835 | Sep 2005 | JP |
2012-249009 | Dec 2012 | JP |
2014-003586 | Jan 2014 | JP |
2014-238601 | Dec 2014 | JP |
2005084017 | Sep 2005 | WO |
Entry |
---|
International Search Report and Written Opinion of PCT Application No. PCT/JP2019/002390, dated Apr. 2, 2019, 08 pages of ISRWO. |
Number | Date | Country | Kind
---|---|---|---
20210067732 | Mar 2021 | US | A1