The present application is based on, and claims priority from JP Application Serial Number 2021-048258, filed Mar. 23, 2021, the disclosure of which is hereby incorporated by reference herein in its entirety.
The present disclosure relates to a projection image adjustment method, an information processing device, and a projection system.
JP-A-2013-78001 describes a projector which projects an image including a plurality of control points, accepts a user's operation of moving a control point, and deforms the image according to the amount of movement of the control point. This projector enables the user to move a control point to a desired position and thus correct the mode of the image, such as its shape or distortion.
Meanwhile, JP-A-2010-224221 describes a multi-projection system which couples together partial images projected respectively from a plurality of projectors, on a projection surface, and thus displays a single large image. In such a system, the individual projectors are arranged in such a way that the ends of partial images next to each other overlap each other so that a plurality of partial images are coupled together seamlessly.
In the case of correcting the mode of the image projected by the multi-projection system described in JP-A-2010-224221 using the technique described in JP-A-2013-78001, the control points may not be arranged in the overlap area where partial images overlap each other, depending on the combination of the number of partial images and the number of control points arranged. When no control points are arranged in the overlap area, it is difficult to seamlessly couple together, within that area, the plurality of partial images sharing it.
A projection image adjustment method is a projection image adjustment method in which a mode of a projection image formed by a plurality of partial images projected on a projection surface from a plurality of projectors and arranged in a first direction in such a way as to partly overlap each other is adjusted using an image for adjustment including a plurality of adjustment points. The method includes: acquiring projection information including first arrangement information corresponding to a number of the partial images arranged in the first direction; deciding a number of the adjustment points in the first direction in such a way that the adjustment points are arranged in an overlap area where the partial images overlap each other, based on the first arrangement information; causing the plurality of projectors to project the image for adjustment where the decided number of adjustment points are arranged in the first direction, as the projection image on the projection surface; accepting an operation of moving the adjustment point included in the projected image for adjustment; changing a mode of the projected image for adjustment, based on the accepted operation; and deciding a correction parameter for adjusting a mode of the projection image, based on the change in the mode of the image for adjustment.
An information processing device includes a control unit adjusting a mode of a projection image formed by a plurality of partial images projected on a projection surface from a plurality of projectors and arranged in a first direction in such a way as to partly overlap each other, using an image for adjustment including a plurality of adjustment points. The control unit executes: acquiring projection information including first arrangement information corresponding to a number of the partial images arranged in the first direction; deciding a number of the adjustment points in the first direction in such a way that the adjustment points are arranged in an overlap area where the partial images overlap each other, based on the first arrangement information; causing the plurality of projectors to project the image for adjustment where the decided number of adjustment points are arranged in the first direction, as the projection image on the projection surface; accepting an operation of moving the adjustment point included in the projected image for adjustment; changing a mode of the projected image for adjustment, based on the accepted operation; and controlling a decision about a correction parameter for adjusting a mode of the projection image, based on the change in the mode of the image for adjustment.
A projection system includes a plurality of projectors, and an information processing device having a control unit adjusting a mode of a projection image formed by a plurality of partial images projected on a projection surface from the plurality of projectors and arranged in a first direction in such a way as to partly overlap each other, using an image for adjustment including a plurality of adjustment points. The control unit executes: acquiring projection information including first arrangement information corresponding to a number of the partial images arranged in the first direction; deciding a number of the adjustment points in the first direction in such a way that the adjustment points are arranged in an overlap area where the partial images overlap each other, based on the first arrangement information; causing the plurality of projectors to project the image for adjustment where the decided number of adjustment points are arranged in the first direction, as the projection image on the projection surface; accepting an operation of moving the adjustment point included in the projected image for adjustment; changing a mode of the projected image for adjustment, based on the accepted operation; and controlling a decision about a correction parameter for adjusting a mode of the projection image, based on the change in the mode of the image for adjustment.
A projection system according to a first embodiment will now be described with reference to the drawings.
As shown in
In the multi-projection according to this embodiment, four projectors 2 are arranged and four partial images Id projected from these projectors 2 are arranged in the form of a matrix having two lines along a first direction D1 and two lines along a second direction D2 intersecting the first direction D1. These four partial images Id form the full image Iw. In this embodiment, the first direction D1 is parallel to the horizontal direction, and the second direction D2 is parallel to the vertical direction. However, the first direction D1 and the second direction D2 are not limited to these directions. The arrangement of the partial images Id is not limited to the above example, either. A matrix having one or more lines along the first direction D1 and one or more lines along the second direction D2 and having a plurality of lines along at least one of these directions may be employed.
Each projector 2 displays the partial image Id in a partial range in a projection area Ap where an image can be projected. Typically, each projector 2 displays the partial image Id in such a range in the projection area Ap that the full image Iw can be viewed as a rectangle in a desired size. Each projector 2 is installed in such a way that a part of the projection area Ap of the projector 2 overlaps a part of the next projection area Ap. That is, each projector 2 is installed in such a way that a part of the partial image Id projected by the projector 2 overlaps a part of the next partial image Id. Therefore, the full image Iw can be displayed in the state where the partial images Id are smoothly coupled together without any gap between the partial images Id. In this way, in this embodiment, the full image Iw is formed by the four partial images Id projected on the projection surface Sp from the four projectors 2 and arranged in the first direction D1 and the second direction D2 in such a way as to partly overlap each other. In this specification, an area Ao where the partial images Id overlap each other is referred to as an "overlap area Ao". Also, in this specification, that the partial images Id and the projection areas Ap are "next to each other" means that the partial images Id and the projection areas Ap are next to each other along the first direction D1 or the second direction D2.
The configuration of the projection system 100 shown in
As shown in
The control unit 10 includes one or a plurality of processors, a RAM (random-access memory), and a ROM (read-only memory) or the like. The control unit 10 operates according to a program stored in the ROM or a program read out to the RAM from the storage unit 11 and thus comprehensively controls the operation of the computer 1.
The storage unit 11 has a storage device such as a hard disk drive or a solid-state drive. The storage unit 11 stores an OS (operating system) and an application program installed therein, and various data or the like. In the storage unit 11 in this embodiment, a projection image adjustment program, not illustrated, is installed. The projection image adjustment program is an application program for adjusting the mode of the full image Iw such as the shape of the full image Iw displayed by multi-projection.
The display unit 12 has a display device such as a liquid crystal display or an organic EL (electroluminescence) display and displays an image under the control of the control unit 10.
The communication unit 13 has various circuits for communicating with an external device via the network NW. The communication unit 13 in this embodiment communicates with the plurality of projectors 2 connected via the network NW, under the control of the control unit 10. The form of communication may be wired communication or wireless communication.
The operation unit 14 includes a keyboard and a pointing device or the like. The operation unit 14 accepts various input operations by a user and outputs information corresponding to the input operations, to the control unit 10. As the pointing device, a mouse, a touch pad or the like can be used.
When the configuration including the control unit 10, of the foregoing components, is regarded as the main body of the computer 1, the components other than the control unit 10 need not be formed integrally with the main body of the computer 1.
As shown in
The control unit 20 includes one or a plurality of processors. The control unit 20 operates according to a control program stored in the storage unit 21 and thus comprehensively controls the operation of the projector 2.
The storage unit 21 has a memory such as a RAM and a ROM. The RAM is used to temporarily store various data or the like. The ROM stores the control program and control data for controlling the operation of the projector 2, and image data or the like.
The operation unit 22 has a plurality of operation keys for the user to give various instructions to the projector 2. When the user operates the various operation keys of the operation unit 22, the operation unit 22 outputs an operation signal corresponding to the content of the user's operation to the control unit 20. Also, a remote controller, not illustrated, that can perform remote control may be used as the operation unit 22. In this case, the remote controller sends an infrared operation signal corresponding to the content of the user's operation, and a remote control signal receiving unit, not illustrated, receives this operation signal and transmits the operation signal to the control unit 20.
The communication unit 23 has various circuits for communicating with an external device via the network NW. The communication unit 23 in this embodiment is connected to the computer 1 and the other projectors 2 via the network NW and transmits and receives information to and from these devices, under the control of the control unit 20.
The image pickup unit 24 is a camera having an image pickup element, not illustrated, such as a CCD (charge-coupled device) sensor or a CMOS (complementary metal-oxide semiconductor) sensor. The image pickup unit 24 picks up an image of the projection surface Sp under the control of the control unit 20 and outputs picked-up image data, which is the result of the image pickup, to the control unit 20. The image pickup unit 24 picks up an image over a range including at least the projection area Ap of this projector 2. Therefore, when the projectors 2 are installed in such a way that the plurality of partial images Id partly overlap each other as shown in
The image input unit 25 is coupled to the external image supply device 4 such as an image playback device. The image input unit 25 receives image data corresponding to a content image supplied from the image supply device 4 and outputs the image data to the image correction unit 26.
The image correction unit 26, under the control of the control unit 20, performs correction processing on the image data inputted from the image input unit 25 and outputs the processed image data to a light valve drive unit 34 shown in
The image input unit 25 and the image correction unit 26 may be formed by one or a plurality of processors or may be formed by a dedicated processing device such as an ASIC (application-specific integrated circuit) or an FPGA (field-programmable gate array).
As shown in
The light source 31 includes a discharge-type light source lamp such as an ultra-high-pressure mercury lamp or a metal halide lamp, or a solid-state light source such as a light-emitting diode or a semiconductor laser. The light emitted from the light source 31 is converted into light having a substantially uniform luminance distribution by an optical integration system, not illustrated, and is separated into color light components of red, green, and blue, which are the three primary colors of light, by a color separation system, not illustrated. Subsequently, the color light components enter the corresponding liquid crystal light valves 32R, 32G, 32B.
Each of the liquid crystal light valves 32R, 32G, 32B is formed by a transmission-type liquid crystal panel or the like having a pair of transparent substrates with a liquid crystal enclosed between the substrates. In each liquid crystal panel, a rectangular pixel area 32i made up of a plurality of pixels arranged in the form of a matrix is formed. A drive voltage can be applied to the liquid crystal at each pixel.
The light valve drive unit 34 forms an image in the pixel area 32i of the liquid crystal light valves 32R, 32G, 32B. Specifically, the light valve drive unit 34 applies a drive voltage corresponding to the image data inputted from the image correction unit 26, to each pixel in the pixel area 32i, and thus sets each pixel to a light transmittance corresponding to the image data. The light emitted from the light source 31 is transmitted through the pixel area 32i of the liquid crystal light valves 32R, 32G, 32B and thus modulated for each pixel, forming image light corresponding to the image data for each color light. The image lights of the individual colors, thus formed, are combined together for each pixel by a light combining system, not illustrated, and thus form image light representing a color image. This image light is projected in an enlarged form on the projection surface Sp by the optical projection system 33. Thus, an image based on the image data inputted from the image correction unit 26 is displayed on the projection surface Sp.
Referring back to
A method for adjusting the full image Iw when the projection system 100 performs multi-projection will now be described.
First, the user installs each projector 2 in such a way that the projection area Ap of each projector 2 is in a proper state. Specifically, the user installs each projector 2 in such a way that the area formed by the projection areas Ap of the individual projectors 2 combined together covers the area where the full image Iw is to be displayed and that the projection areas Ap next to each other partly overlap each other, as shown in
The projection image adjustment processing is processing for adjusting the size, shape, distortion or the like of the full image Iw in a desired mode and is executed as initial processing, for example, when the multi-projection by the projection system 100 starts. The execution of this projection image adjustment processing enables the user to adjust, for example, the outline of the full image Iw to coincide with the outline of the screen when the screen is arranged inside the projection surface Sp. Also, for example, when the projection surface Sp is a curved surface, the user can make an adjustment to reduce the distortion of the image due to the curved surface.
As shown in
In this embodiment, the control unit 10 also acquires information about the density of arrangement of adjustment points Pa shown in
In step S120, the control unit 10 executes projection area coupling processing of coupling the projection areas Ap of the individual projectors 2 together. The projection area coupling processing is processing of executing a geometric correction in which the positional relationship between the individual projection areas Ap is found and in which the coordinate systems of the projection areas Ap next to each other are coupled together, with respect to the coordinate system of each projection area Ap. The projection area coupling processing generates a single common coordinate system having the coordinate systems of the individual projection areas Ap coupled together on the projection surface Sp.
The execution of the projection area coupling processing of step S120 may be omitted, when there is a history of the execution of the projection area coupling processing in the past, the individual projection areas Ap are already coupled together, and a common coordinate system is already known. Also, step S120 may be omitted, for example, when each projector 2 is installed in a predetermined arrangement position and arrangement attitude in relation to the projection surface Sp in such a way that a predetermined common coordinate system is constructed on the projection surface Sp. The control unit 10 may display a message on the projection surface Sp and ask the user whether to execute the projection area coupling processing or not.
The projection area coupling processing will now be described with reference to
In step S210, the control unit 10 sequentially instructs each projector 2 to project the pattern image for measurement Dm in the projection area Ap of each projector 2 and causes the image pickup unit 24 of each projector 2 to pick up an image of the pattern image for measurement Dm projected on the projection surface Sp. The picked-up image SI generated by the image pickup at this point is also referred to as a “first picked-up image SIa”.
The control unit 10 also instructs each projector 2 to pick up an image of the projection area Ap of the projector 2, using the image pickup unit 24 of this projector 2, when the pattern image for measurement Dm is projected in the projection area Ap next to the projection area Ap of this projector 2. The picked-up image SI generated by the image pickup at this point is also referred to as a "second picked-up image SIb".
It is desirable that, in step S210, the control unit 10 sets an order in which the individual projectors 2 project the pattern image for measurement Dm so that the pattern image for measurement Dm is not projected simultaneously in the projection areas Ap next to each other. Also, in step S210, the control unit 10 may perform control in such a way that the pattern image for measurement Dm is projected simultaneously in two or more projection areas Ap that are spaced apart from each other and not next to each other in the first direction D1 or the second direction D2. This enables a reduction in the processing time of step S210.
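As one illustrative way of arranging such an order, the following Python sketch splits an r-by-c projector arrangement into batches so that projectors next to each other in the first direction D1 or the second direction D2 never project the pattern image for measurement Dm simultaneously, while non-adjacent projectors share a batch. The checkerboard grouping rule and the function name are assumptions for illustration and are not features recited in the embodiment.

```python
# Hypothetical sketch: group projectors in an r x c grid so that no two
# projectors next to each other along D1 or D2 project the measurement
# pattern at the same time. A checkerboard parity split needs two rounds.

def measurement_rounds(rows, cols):
    """Return batches of (row, col) projector positions that may project
    the pattern image for measurement simultaneously."""
    even = [(r, c) for r in range(rows) for c in range(cols) if (r + c) % 2 == 0]
    odd = [(r, c) for r in range(rows) for c in range(cols) if (r + c) % 2 == 1]
    return [batch for batch in (even, odd) if batch]

# Example: the 2 x 2 arrangement of the embodiment needs only two rounds
# instead of four sequential projections.
print(measurement_rounds(2, 2))  # [[(0, 0), (1, 1)], [(0, 1), (1, 0)]]
```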
In step S220, the control unit 10 acquires, from each projector 2, the position of the measurement point Pm in the coordinate system of the projection area Ap of the projector 2. Specifically, the control unit 20 of each projector 2 analyzes the first picked-up image SIa and extracts the position of each measurement point Pm shown in the first picked-up image SIa. The control unit 20 then acquires the coordinate system of the projection area Ap, based on the coordinates of each extracted measurement point Pm on the image data, and calculates the coordinates of each measurement point Pm in the coordinate system of the projection area Ap, that is, the coordinates representing the display position of the measurement point Pm on the projection surface Sp. The result of the calculation is transmitted to the computer 1 via the communication unit 23 of each projector 2.
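By way of illustration only, the following sketch shows one possible way for the control unit 20 to extract the positions of the measurement points Pm from a picked-up image, assuming the pattern image for measurement Dm consists of bright dots on a dark background; the embodiment does not limit the extraction method to this, and the function name is hypothetical.

```python
# Illustrative sketch, not the projector's actual routine: estimate the
# centroid of each bright measurement point in a grayscale picked-up image.
import numpy as np
from scipy import ndimage

def extract_measurement_points(gray, threshold=128):
    """Return (x, y) centroids of bright blobs in a picked-up image."""
    mask = gray > threshold                    # binarize the picked-up image
    labels, count = ndimage.label(mask)        # label connected bright regions
    centers = ndimage.center_of_mass(mask, labels, list(range(1, count + 1)))
    # center_of_mass returns (row, col); convert to (x, y) pixel coordinates
    return [(c, r) for r, c in centers]
```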
In step S230, the control unit 10 acquires the position of each measurement point Pm in the overlap area Ao in the second picked-up image SIb, from each projector 2. Specifically, the control unit 20 of each projector 2 analyzes the second picked-up image SIb and extracts the position of each measurement point Pm in the overlap area Ao shown in the second picked-up image SIb. The result of the extraction is transmitted to the computer 1 via the communication unit 23 of each projector 2.
In step S240, the control unit 10 establishes a correspondence between the coordinates of the measurement point Pm in the coordinate system of each projection area Ap acquired from the first picked-up image SIa and the information about the position of the measurement point Pm shown in the overlap area Ao acquired from the second picked-up image SIb. Thus, the control unit 10 specifies the positional relationship between the projection areas Ap next to each other. The control unit 10 transmits information representing the positional relationship between the projection areas Ap next to each other, to each corresponding projector 2. The “information representing the positional relationship between the projection areas Ap next to each other” is information representing the relative positional relationship between the display positions of the pixels in the overlap area Ao between the projection areas Ap next to each other. In this way, in steps S210 to S240, the positional relationship between the projection areas Ap of the plurality of projectors 2 is found, using the measurement point Pm in the overlap area Ao shown in the picked-up image SI picked up by the image pickup unit 24 of each of the plurality of projectors 2, as an indicator.
In step S250, the correction control unit 28 of each projector 2 decides a correction parameter for geometric correction for converting the coordinate system of each projection area Ap, based on the information representing the positional relationship between the projection areas Ap next to each other transmitted from the control unit 10. Specifically, the correction control unit 28 calculates a correction parameter for geometric correction in such a way that the display position of the pixel in the projection area Ap coincides with the display position of the pixel in the next projection area Ap, in the overlap area Ao, and outputs the correction parameter to the image correction unit 26. Thus, the coordinate systems of the projection areas Ap next to each other are coupled together and a single common coordinate system having the coordinate systems of the individual projection areas Ap coupled together on the projection surface Sp is generated. From this point onward, the control unit 10 of the computer 1 can designate a position to each projector 2, using this common coordinate system.
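As an illustration of steps S240 and S250, the following sketch estimates a homography relating the coordinate system of one projection area Ap to that of the next, from four or more corresponding measurement points Pm in the overlap area Ao. The homography model is an assumption that holds for a planar projection surface Sp; the embodiment does not prescribe this particular model, and the function names are illustrative.

```python
# Minimal sketch, assuming a planar projection surface: relate two adjacent
# projection-area coordinate systems with a homography estimated by the
# direct linear transform (DLT) from at least four shared measurement points.
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Return H (3x3) such that dst ~ H @ src, for (x, y) point pairs."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)          # null vector of the DLT system
    return h / h[2, 2]

def to_common(h, point):
    """Map a point from one projection area into the common coordinate system."""
    x, y, w = h @ np.array([point[0], point[1], 1.0])
    return x / w, y / w
```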
Referring back to
In step S140, the control unit 10 controls each projector 2 to display the image for adjustment Da shown in
As shown in
In step S140, the control unit 10 first provisionally decides the size of the image for adjustment Da and decides the coordinates of each adjustment point Pa in the common coordinate system, based on that size. The control unit 10 then outputs the coordinates of the adjustment points Pa that can be included in each projection area Ap, to each projector 2, and causes each correction control unit 28 to generate image data of the partial image Id corresponding to a part of the image for adjustment Da. When the correction control unit 28 of each projector 2 controls the corresponding image correction unit 26 and outputs the generated image data to the projection unit 27, the image for adjustment Da is displayed on the projection surface Sp by multi-projection, as shown in
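One possible way of computing the coordinates of the adjustment points Pa in the common coordinate system is sketched below: the decided numbers of points are laid out on a uniform grid over the provisionally decided size of the image for adjustment Da. The uniform spacing, the example size, and the function name are assumptions for illustration.

```python
# Hedged sketch of step S140: place the decided numbers of adjustment points
# Pa on a uniform grid over the provisional size of the image for adjustment
# Da, expressed in the common coordinate system.
import numpy as np

def adjustment_point_grid(width, height, points_d1, points_d2, origin=(0.0, 0.0)):
    """Return an array of shape (points_d2, points_d1, 2) holding (x, y) grid points."""
    xs = np.linspace(origin[0], origin[0] + width, points_d1)
    ys = np.linspace(origin[1], origin[1] + height, points_d2)
    return np.stack(np.meshgrid(xs, ys), axis=-1)

# Example: density 1 with a 2 x 2 arrangement gives 3 x 3 adjustment points.
grid = adjustment_point_grid(1920.0, 1080.0, points_d1=3, points_d2=3)
print(grid[1, 1])  # the center point, which falls in the overlap area Ao
```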
In this embodiment, as shown in
The way the image for adjustment Da is displayed by multi-projection is not limited to the above example. For example, the control unit 10 may generate image data for adjustment representing the image for adjustment Da as described above, then generate partial image data of a part corresponding to the arrangement of the partial image Id of each projector 2, based on the generated image data for adjustment, and output the partial image data corresponding to each projector 2, to each projector 2. In this case, when each projector 2 projects the partial image Id based on the inputted partial image data, the image for adjustment Da is displayed on the projection surface Sp by multi-projection.
In step S150, the control unit 10 accepts an operation of moving an adjustment point Pa from the user via the operation unit 14. The user selects one adjustment point Pa to be the target of the movement operation and designates a direction of movement and a distance of movement of the selected adjustment point Pa, by the operation unit 14. Alternatively, the user may designate a position to which the selected adjustment point Pa is to be moved, directly on the projection surface Sp by a pointer or the like. The control unit 10 may cause the correction control unit 28 to display a range where each adjustment point Pa is movable, for the sake of convenience of the user.
In step S160, the control unit 10 changes the mode of the image for adjustment Da, based on the user's operation. Specifically, the control unit 10 outputs the coordinates of the adjustment point Pa after the movement, to the projector 2 whose projection area Ap includes the selected adjustment point Pa, and instructs the projector 2 to update the partial image Id. When the correction control unit 28 of the projector 2 receiving the instruction moves the adjustment point Pa to the inputted coordinates and updates the partial image Id, the image for adjustment Da in the changed mode is displayed on the projection surface Sp by multi-projection. For example, when the top left adjustment point Pa is moved to the top left corner of a desired projection range Ad by the user's operation, the image for adjustment Da is deformed as shown in
In step S170, the control unit 10 determines whether or not the adjustment of the full image Iw using the image for adjustment Da is finished, based on the user's operation by the operation unit 14. When an instruction to end is not given by the user, the control unit 10 returns the processing to step S150 and accepts an operation of moving an adjustment point Pa. By repeating steps S150 to S170, the user can move a plurality of adjustment points Pa to desired positions and thus can designate a mode of the image for adjustment Da. Meanwhile, when an instruction to end is given by the user, the positions of all the adjustment points Pa are finalized and the control unit 10 shifts the processing to step S180. For example, when all the adjustment points Pa are moved according to the desired projection range Ad by the user's operation, the image for adjustment Da is displayed in a properly adjusted mode as shown in
In step S180, the control unit 10 causes the correction control unit 28 of each projector 2 to decide a correction parameter for geometric correction, based on the finalized position of the adjustment point Pa, that is, the change in the mode of the image for adjustment Da, and then ends the processing. On receiving an instruction for this, the correction control unit 28 of each projector 2 updates the correction parameter decided in step S250, based on the coordinates of the adjustment point Pa, and outputs the updated correction parameter to the image correction unit 26. From this point onward, the image correction unit 26 of each projector 2 performs a geometric correction based on this correction parameter, on the image data inputted from the image input unit 25, and the projector 2 thus projects the partial image Id in the mode corresponding to the position of the adjustment point Pa. Thus, the full image Iw is adjusted and displayed in the mode designated by the user. As each projector 2 projects the partial image Id based on the decided correction parameter in this way, the mode of the full image Iw is adjusted. Therefore, each correction parameter is equivalent to a parameter for adjusting the mode of the full image Iw.
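Purely for illustration, the sketch below realizes a correction parameter for a single cell of the adjustment grid as a perspective transform mapping the cell's original adjustment points Pa to their finalized positions, using OpenCV; the correction actually performed by the image correction unit 26 of each projector 2 may differ, for example by combining such transforms into a mesh warp over all cells.

```python
# Simplified sketch: one grid cell's correction parameter as a perspective
# transform from the cell's four original adjustment points to the four
# finalized (moved) positions. OpenCV is used purely for illustration.
import cv2
import numpy as np

def cell_correction(src_corners, dst_corners):
    """Return a 3x3 perspective matrix from 4 original to 4 moved points."""
    src = np.asarray(src_corners, dtype=np.float32)
    dst = np.asarray(dst_corners, dtype=np.float32)
    return cv2.getPerspectiveTransform(src, dst)

def apply_correction(image, matrix):
    """Warp the content image data with the decided correction parameter."""
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, matrix, (w, h))
```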
In this embodiment, the case where both the first density information and the second density information included in the projection information are 1 is described. However, for example, when both the first density information and the second density information are 2, the number of adjustment points Pa is 5 in both the first direction D1 and the second direction D2 and the total number is 5×5=25, as shown in
Although not illustrated, when both the first density information and the second density information are 3, the number of adjustment points Pa is 7 in both the first direction D1 and the second direction D2 and the total number is 7×7=49. In this case, too, the adjustment points Pa are arranged in the overlap area Ao. In this case, nine correction parameters are generated for one partial image Id and the different correction parameters are used according to the positions in the partial image Id.
In this way, when c partial images Id are arranged in the first direction D1, setting the number of adjustment points Pa in the first direction D1 to c×m+1, where m is a natural number, leads to an odd number of adjustment points Pa equal to or greater than three being arranged across two partial images Id next to each other in the first direction D1. Also, since these adjustment points Pa are arranged at equal intervals in the first direction D1, the adjustment points Pa are arranged in the overlap area Ao located at the intermediate position between the two partial images Id next to each other in the first direction D1. The same applies to the second direction D2. When b partial images Id are arranged in the second direction D2, setting the number of adjustment points Pa in the second direction D2 to b×n+1, where n is a natural number, leads to arranging the adjustment points Pa in the overlap area Ao located at the intermediate position between the two partial images Id next to each other in the second direction D2. That is, the adjustment points Pa are arranged in the overlap area Ao regardless of the values of the natural numbers expressing the first density information and the second density information. As the values of the natural numbers expressing the first density information and the second density information become greater, a finer correction can be made.
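The c×m+1 rule can be checked numerically. The short sketch below assumes that the c partial images Id divide the width of the full image evenly, so that the center of the overlap area Ao between the k-th and (k+1)-th partial images lies at the position width×k/c; one of the equally spaced adjustment points Pa then always coincides with that position.

```python
# Numeric check of the c*m + 1 rule: with equally spaced adjustment points,
# one point always lands at each boundary position between adjacent partial
# images, i.e. within the overlap area around that position.
def has_point_in_overlap(c, m, width=1.0):
    n_points = c * m + 1
    spacing = width / (n_points - 1)
    boundaries = [width * k / c for k in range(1, c)]   # centers of overlap areas
    points = [spacing * i for i in range(n_points)]
    return all(any(abs(p - b) < 1e-9 for p in points) for b in boundaries)

assert all(has_point_in_overlap(c, m) for c in (2, 3, 4) for m in (1, 2, 3))
```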
As described above, the projection system 100, the computer 1, and the method for adjusting the full image Iw according to this embodiment can achieve the following effects.
(1) According to this embodiment, the control unit 10 decides the numbers of adjustment points Pa in the first direction D1 and the second direction D2 in such a way that the adjustment points Pa are arranged in the overlap area Ao where the partial images Id overlap each other, based on the numbers of partial images Id arranged in the first direction D1 and the second direction D2. Since the adjustment points Pa are thus arranged in the overlap area Ao, a plurality of partial images Id sharing the overlap area Ao can be smoothly coupled together in the overlap area Ao.
(2) According to this embodiment, the projection information acquired from the user includes information about the density of arrangement of the adjustment points Pa. Therefore, the adjustment points Pa can be arranged at a desired density of arrangement according to the three-dimensional shape or the like of the projection surface Sp.
(3) According to this embodiment, the numbers of adjustment points Pa in the first direction D1 and the second direction D2 are set to the number of partial images Id in each direction multiplied by a natural number plus 1. Therefore, the adjustment points Pa can be arranged at the intermediate position between the partial images Id next to each other, that is, in the overlap area Ao.
A projection system according to a second embodiment will now be described.
The projection system 100 according to this embodiment has the same configuration as in the first embodiment but partly differs in the operations in the projection image adjustment processing.
In this embodiment, when acquiring the projection information from the user in step S110, the control unit 10 acquires the first arrangement information representing the number of partial images Id along the first direction D1 and the second arrangement information representing the number of partial images Id along the second direction D2 as in the first embodiment but does not acquire the information about the density of arrangement of the adjustment points Pa, that is, the first density information and the second density information.
Instead, the control unit 10 causes the display unit 12 to display a menu image Mn shown in
As shown in
Specifically, the control unit 10 decides, as the first option S1, an option in which the number of adjustment points Pa in the first direction D1 is c×1+1 and in which the number of adjustment points Pa in the second direction D2 is b×1+1, where c is the number of partial images Id along the first direction D1 acquired from the user and b is the number of partial images Id along the second direction D2. The control unit 10 also decides, as the second option S2, an option in which the number of adjustment points Pa in the first direction D1 is c×2+1 and in which the number of adjustment points Pa in the second direction D2 is b×2+1. The control unit 10 also decides, as the third option S3, an option in which the number of adjustment points Pa in the first direction D1 is c×3+1 and in which the number of adjustment points Pa in the second direction D2 is b×3+1. In this way, in all the options included in the menu image Mn, the number of adjustment points Pa in the first direction D1 satisfies c×x+1, where x is a natural number, and the number of adjustment points Pa in the second direction D2 satisfies b×y+1, where y is a natural number.
In each option, the same natural number is employed as x and y. However, x and y may differ from each other. Also, the number of options to be decided may be any plural number and may be other than three.
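For illustration, the options presented in the menu image Mn could be derived from the acquired arrangement information as follows, with the densities 1, 2, and 3 corresponding to the first option S1 to the third option S3 described above; the function name is an assumption.

```python
# Hedged sketch: derive the selectable numbers of adjustment points from the
# arrangement information (c images along D1, b along D2) so that every
# option keeps adjustment points in the overlap areas.
def menu_options(c, b, densities=(1, 2, 3)):
    """Return (points_in_D1, points_in_D2) for each selectable density."""
    return [(c * x + 1, b * x + 1) for x in densities]

# Example for the 2 x 2 arrangement of the embodiments:
print(menu_options(2, 2))  # [(3, 3), (5, 5), (7, 7)]
```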
The user can select one of the three options by operating the operation unit 14. The control unit 10 causes the display unit 12 to display the menu image Mn and subsequently accepts an operation by the user selecting an option via the operation unit 14. The control unit 10 then decides the numbers of adjustment points Pa in the first direction D1 and the second direction D2, based on the option selected by the user.
As described above, the projection system 100, the computer 1, and the method for adjusting the full image Iw according to this embodiment can achieve effects similar to those in the first embodiment.
The foregoing embodiments may be modified as follows.
In the embodiments, at least one projector 2 may execute a part of the operations to be executed by the computer 1, and the computer 1 may execute a part of the operations to be executed by each projector 2 so as to control each projector 2. Also, one projector 2 may be configured to execute all the operations to be executed by the computer 1 and control the operations of the other projectors 2. This enables the projection system 100 to be configured without including the computer 1.
In the embodiments, the control unit 10 decides a correction parameter in step S180 after all the movements of adjustment points Pa by the user are finished in step S170. However, the control unit 10 may update the correction parameter every time the movement of one adjustment point Pa is finished, and may display an image using this correction parameter. In this configuration, for example, when an image having a meaningful background image with the adjustment points Pa and the auxiliary lines La superimposed thereon is used as the image for adjustment Da, a geometric correction is made to the background image every time one adjustment point Pa is moved. Therefore, the status of correction of the distortion of the image can be checked in real time.
In the embodiments, the number of partial images Id arranged in the first direction D1 is acquired as the first arrangement information and the number of partial images Id arranged in the second direction D2 is acquired as the second arrangement information. However, the number of overlap areas Ao along the first direction D1 may be acquired as the first arrangement information and the number of overlap areas Ao along the second direction D2 may be acquired as the second arrangement information. For example, when four partial images Id are arranged in two lines in both the first direction D1 and the second direction D2 as in the embodiments, both the number of overlap areas Ao along the first direction D1 and the number of overlap areas Ao along the second direction D2 are 1. In this case, the number of partial images Id in each direction can be calculated by adding 1 to the acquired number of overlap areas Ao.
In the embodiments, the transmission-type liquid crystal light valves 32R, 32G, 32B are used as light modulation devices. However, a reflection-type light modulation device such as a reflection-type liquid crystal light valve can be employed. Also, a digital micromirror device or the like that controls the exit direction of incident light for each micromirror as a pixel and thus modulates the light emitted from the light source 31 can be used. The configuration having a plurality of light modulation devices corresponding to individual color lights is not limiting. A configuration having a single light modulation device modulating a plurality of color lights in time division may be employed.