One of the aspects of the disclosure relates to an image projection apparatus (projector) for multi-projection (sometimes simply referred to as MP hereinafter).
As illustrated in
Japanese Patent No. (“JP”) 4472435 discloses an MP system in which a plurality of projectors share, in their image memories, original image data from a video input device having a larger number of pixels than the projection image of each projector. In this system, each projector cuts out image data including a secondary image, received through a daisy chain, to generate and project an MP image.
However, as the number of pixels in the projection image increases and the amount of image data that can be input to an image input unit in each projector becomes only as large as that of the projection image, the MP image can no longer be generated by cutting out image data through the daisy chain as in JP 4472435. Moreover, as the number of pixels in the projection image increases, the video source device and the MP image generator require high-performance, and therefore expensive, devices.
One of the aspects of the embodiment provides an image projection apparatus that can generate projection image data for multi-projection.
An image projection apparatus according to one aspect of the disclosure as a first image projection apparatus is configured to project an image. The image projection apparatus includes a first input unit into which first image data is input, a second input unit into which second image data is input, and a generating unit configured to generate projection image data that includes a first image area corresponding to the first image data and a second image area corresponding to the second image data. The second image data is at least part of image data input to a second image projection apparatus different from the first image projection apparatus. A multi-projection system including the above image projection apparatus, an image processing method corresponding to the above image projection apparatus, and a storage medium storing a program for causing a computer to execute the above image processing method also constitute another aspect of the disclosure.
Further features of the disclosure will become apparent from the following description of embodiments with reference to the attached drawings. In the following, the term “unit” may refer to a software context, a hardware context, or a combination of software and hardware contexts. In the software context, the term “unit” refers to a functionality, an application, a software module, a function, a routine, a set of instructions, or a program that can be executed by a programmable processor such as a microprocessor, a central processing unit (CPU), or a specially designed programmable device or controller. A memory contains instructions or a program that, when executed by the CPU, cause the CPU to perform operations corresponding to units or functions. In the hardware context, the term “unit” refers to a hardware element, a circuit, an assembly, a physical structure, a system, a module, or a subsystem. It may include mechanical, optical, or electrical components, or any combination of them. It may include active (e.g., transistors) or passive (e.g., capacitors) components. It may include semiconductor devices having a substrate and other layers of materials having various concentrations of conductivity. It may include a CPU or a programmable processor that can execute a program stored in a memory to perform specified functions. It may include logic elements (e.g., AND, OR) implemented by transistor circuits or any other switching circuits. In the combination of software and hardware contexts, the term “unit” or “circuit” refers to any combination of the software and hardware contexts as described above. In addition, the term “element,” “assembly,” “component,” or “device” may also refer to “circuit” with or without integration with packaging materials.
Referring now to the accompanying drawings, a description will be given of embodiments according to the disclosure.
The image processing unit (generating unit) 200 includes a processor dedicated to image processing, detects the format of the image data input from the I/O unit 100, and generates projection image data by performing various image processing on the image data. More specifically, the image processing unit 200 performs resolution conversion (scaling) processing, trimming processing, on-screen display (OSD) processing for menu display, warp (keystone correction) processing, and the like. In a case where multi-projection is performed by a plurality of projectors, the image processing unit 200 performs the image processing after generating MP image data using first image data and second image data input from the I/O unit 100. The first image data, the second image data, and the MP image data will be described below. The projection image data that has undergone the image processing is sent to a display data generating unit 300.
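The processing order described above can be outlined as follows. This is an illustrative Python sketch, not the apparatus firmware, and every function and stage name in it is hypothetical:

```python
# Illustrative sketch of the processing order in the image processing unit:
# in multi-projection, MP composition is performed first, and scaling,
# trimming, OSD, and warp processing are applied afterward.
# All names here are hypothetical, not part of the disclosed apparatus.

def process_frame(first_image, second_image=None, multi_projection=False):
    """Return (projection image data, list of stages applied)."""
    stages = []
    data = first_image
    if multi_projection and second_image is not None:
        data = data + second_image          # stand-in for MP composition
        stages.append("mp_composition")
    # stand-ins for the image processing listed above
    stages += ["scaling", "trimming", "osd", "warp"]
    return data, stages

_, applied = process_frame([1], [2], multi_projection=True)
```

The point is only the ordering: the MP image data is generated before the other image processing is applied.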
The display data generating unit 300 generates display image data (simply referred to as a display image hereinafter) to be displayed on a light modulation panel unit 500 to be described below from the projection image data or the MP image data generated by the image processing unit 200. At this time, the display data generating unit 300 performs processing (gamma (γ) correction, luminance unevenness correction, and edge blending in the MP) according to the characteristic of the light modulation panel unit 500. The display data generating unit 300 outputs the generated display image to a panel driving unit 400.
A light source unit 602 is driven and turned on by a light source driving unit 601 and emits illumination light. A laser, a discharge lamp, an LED, or the like is used for the light source unit 602. The light source driving unit 601 is turned on and off and adjusts a light amount in accordance with a driving control signal from the control unit 30.
The panel driving unit 400 generates a driving signal for the light modulation panel unit 500 according to the display image generated by the display data generating unit 300 and outputs the driving signal to the light modulation panel unit 500. The light modulation panel unit 500 includes a liquid crystal panel, a digital micromirror device, or the like, and modulates the illumination light from the light source unit 602 according to the input driving signal. The light modulation panel unit 500 according to this embodiment has 4096×2400 pixels. The light modulated by the light modulation panel unit 500 is projected onto a projection surface such as a screen (not illustrated) through a projection lens 702. Thereby, the projection image is displayed. The number of pixels in the light modulation panel unit 500 and the number of pixels in the image data described in this embodiment are merely illustrative, and other numbers of pixels may be used.
The projection lens 702 performs a zoom operation that changes the size of the projection image, a shift operation that shifts the position of the projection image, and a focus operation that adjusts the focus of the projection image through a lens driving unit 701 that receives a control signal from the control unit 30.
The control unit 30 includes a computer including a central processing unit (CPU) and the like, and controls the entire projector 1000 by transmitting and receiving signals to and from the operation unit 10 and other blocks, which will be described below.
The operation unit 10 includes operation members such as a button operable by the user and a receiving unit for receiving an infrared signal from a remote controller. The operation unit 10 accepts power-on and power-off operations, a focusing operation, and setting operations of a variety of types of projection modes, and outputs an operation signal indicative of the accepted operation to the control unit 30. A ROM 20 stores data such as a variety of setting values corresponding to operation signals from the operation unit 10, a zoom position, a shift position, and a focus position of the projection lens 702.
An I/O selector 201 in the image processing unit 200 selects the inputs A to C to be input to a memory controller 202 in accordance with the setting by the control unit 30. The I/O selector 201 also selects the outputs A, B, and C, which are the same as the inputs A, B, and C, to be output to the output ICA 102, ICB 104, and ICC 106, respectively, in accordance with the setting by the control unit 30. The output ICA 102, ICB 104, and ICC 106 that have received the outputs A, B, and C from the I/O selector 201 output the image data as OUTA, OUTB, and OUTC, respectively.
The memory controller 202 performs a control operation for generating MP image data using the inputs A and B output from the I/O selector 201 in a case where horizontal (lateral) multi-projection is performed by connecting projection images from a plurality of projectors in a horizontal direction (lateral direction). The first image data as the input A is image data corresponding to a primary image mainly projected by the projector 1000, and the second image data as the input B is image data corresponding to the primary image mainly projected by another projector (second image projection apparatus). The MP image data includes a primary image area (first image area simply referred to as primary image hereinafter) corresponding to the first image data, and a secondary image area (second image area simply referred to as secondary image hereinafter) corresponding to part of the second image data that contacts an edge in a horizontal direction of the primary image. Part of the secondary image in the projection image becomes an edge portion that overlaps an adjacent projection image in the edge blending.
In a case where vertical (longitudinal) multi-projection is performed by connecting projection images from a plurality of projectors in a vertical direction (longitudinal direction), the memory controller 202 performs a control operation for generating MP image data using the inputs A and C output from the I/O selector 201. The second image data as the input C is image data corresponding to an image mainly projected by still another projector. The MP image data in this case includes a primary image corresponding to the first image data, and a secondary image corresponding to part of the second image data that contacts an edge in the vertical direction of the primary image.
The image data (image signal) output from the video source device 2000 to each projector is, for example, 4K (3840×2160 pixels) image data in which the original image data of 15360×2160 pixels is divided into four in the horizontal direction. Image data 1 to 4 are input as INA to the I/O units 100 in the projectors 1000-1 to 1000-4, respectively. Image data 2, 3, and 4 output as OUTA from the I/O units 100 in the projectors 1000-2 to 1000-4 are input as INB to the I/O units 100 in the projectors 1000-1 to 1000-3, respectively.
The image processing unit 200 in the projector 1000-1 generates MP image data (4096×2160 pixels) in which a secondary image 1-2 corresponding to part (256×2160 pixels) on the left end side of the image data 2 is added to the right end of the primary image (3840×2160 pixels) 1 corresponding to the image data 1. Similarly, the image processing unit 200 in the projector 1000-2 generates MP image data in which a secondary image 2-3 corresponding to part on the left end side of the image data 3 is added to the right end of the primary image 2 corresponding to the image data 2. The image processing unit 200 in the projector 1000-3 generates MP image data in which a secondary image 3-4 corresponding to part on the left end side of the image data 4 is added to the right end of the primary image 3 corresponding to the image data 3. The image processing unit 200 in the projector 1000-4 generates MP image data (4096×2160 pixels) corresponding to the image data 4.
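The composition performed by the projector 1000-1 can be sketched as follows, modeling one frame as a list of rows with one character per pixel. This is an illustrative sketch under those assumptions, and `make_horizontal_mp` is a hypothetical helper, not a function of the disclosed apparatus:

```python
def make_horizontal_mp(primary, neighbor, strip=256):
    # Each output row is the primary row followed by the left `strip`
    # pixels of the right neighbor's row (the secondary image).
    return [p + n[:strip] for p, n in zip(primary, neighbor)]

H, W = 2160, 3840
img1 = ["1" * W] * H                   # image data 1 (primary of 1000-1)
img2 = ["2" * W] * H                   # image data 2 (input as INB)
mp1 = make_horizontal_mp(img1, img2)   # 2160 rows of 4096 pixels
```

The resulting 4096×2160 data matches the pixel counts of the MP image data described above.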
The MP image data generated by the projectors 1000-1 to 1000-3 corresponds to first projection image data. The MP image data generated by the projectors 1000-2 to 1000-4 corresponds to second projection image data for the MP image data generated by the projectors 1000-1 to 1000-3.
In the projector 1000-1, the I/O selector 201 inputs to the memory controller 202 the input A from the input ICA 101 that has received INA and the input B from the input ICB 103 that has received INB. As illustrated in
The pre-projection image data written in the DDR 205 is enlarged or reduced by a scaler formatter 203 and converted into image data in a predetermined format (4096 horizontal pixels×2400 vertical pixels). At this time, since the pre-projection image data has 2160 vertical pixels relative to 2400 vertical pixels of the predetermined format, the upper and lower 120 lines are blank image data such as black. A warp circuit 204 performs correction including keystone correction that deforms the image shape for the image data converted into the predetermined format, and outputs the corrected image data. In this embodiment, since the number of pixels of the light modulation panel unit 500 is 4096 horizontal pixels and 2400 vertical pixels, image data of 4096 horizontal pixels and 2400 vertical pixels is output from the image processing unit 200 as optimal image data for this number of pixels.
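The conversion into the 4096×2400 predetermined format, with the upper and lower 120 lines filled with blank (black) data, can be sketched as follows (illustrative only; `to_panel_format` is a hypothetical helper, and rows are modeled as strings):

```python
def to_panel_format(image, panel_h=2400, blank_char="0"):
    """Center the image vertically in the panel format, padding the top
    and bottom with blank (e.g. black) lines."""
    width = len(image[0])
    pad_top = (panel_h - len(image)) // 2          # 120 lines for 2160
    pad_bottom = panel_h - len(image) - pad_top    # 120 lines for 2160
    blank = blank_char * width
    return [blank] * pad_top + list(image) + [blank] * pad_bottom

frame = ["7" * 4096] * 2160      # pre-projection image data
panel = to_panel_format(frame)   # 4096 x 2400 image data
```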
While this example uses the line memory for the input to the DDR, each of the primary image data and the secondary horizontal image data may first be written in the frame memory serving as the DDR, and the written data may then be written in the memories 1 and 2 as in
Primary image data (PJ1/PJ2 primary image input data) are input to PJ1 and PJ2 at a timing in synchronization with primary image input Hsync (horizontal synchronization signal). PJ2 primary image input data input to PJ2 via the input ICA illustrated in
At this time, in PJ2, the PJ2 secondary image OUT data is output at a timing delayed from the PJ1/PJ2 primary image input data as illustrated in
PJ2 and the projector 1000-3 (PJ3) also generate MP image data through input/output of image data and writing in the DDR 205 as in PJ1. The projector 1000-4 (PJ4) generates MP image data (but not including secondary image) corresponding to PJ4 primary image data input as INA.
Thus, MP image data of 2400 vertical pixels is generated for the 2400 vertical pixels of the light modulation panel unit 500. At this time, the scaler formatter 203 sets to black the areas of 120 pixels above and below the center of the light modulation panel unit 500, and outputs the image data dot by dot to the warp circuit 204 without scaling the projection image data. The warp circuit 204 performs correction including keystone correction for deforming the projection image, and outputs the projection image data to the display data generating unit 300. The image data may instead be disposed at the top or bottom of the light modulation panel unit 500.
The number of pixels of the primary image data (3840 pixels), the number of pixels of the secondary horizontal image data (256 pixels), and the number of pixels of the projection image data (4096 pixels) are merely illustrative and may be different. In a case where the number of pixels of the primary image data is larger than the maximum number of pixels of the projection image data, the scaler formatter 203 may scale down the primary image data and the secondary image data so that the number of pixels in the combination of the primary image data and the secondary image data becomes the maximum number of pixels of the projection image data. In a case where the number of pixels of the primary image data is smaller than the maximum number of pixels of the projection image data, the scaler formatter 203 may scale up the primary image data and the secondary image data so that the number of pixels of the combination of the primary image data and the secondary image data becomes the maximum number of pixels of the projection image data. In a case where the number of pixels of the combination of the primary image data and the secondary image data is equal to the maximum number of pixels of the projection image data, the scaler formatter 203 may output them dot by dot without scaling up or down.
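The three cases above amount to a simple decision on the combined pixel count; a minimal sketch (the helper name is hypothetical):

```python
def choose_scaling(combined_pixels, panel_max=4096):
    """Return how the scaler formatter treats the combined width of the
    primary and secondary image data relative to the panel maximum."""
    if combined_pixels > panel_max:
        return "scale_down"
    if combined_pixels < panel_max:
        return "scale_up"
    return "dot_by_dot"
```

For the illustrative numbers above, 3840 + 256 = 4096 pixels, so the data is output dot by dot.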
The projectors 1000-1 to 1000-4 project and display the projection images corresponding to the MP image data generated by them, respectively, as illustrated in
The edge blending in the projector 1000-2 is processing that sets the gain at the left end of the primary image 2 to 0% and gradually increases the gain up to 100% toward the right end of the area of the primary image 2 which the secondary image 1-2 overlaps. Other edge blending in the projector 1000-2 is processing that sets the gain at the left end of the secondary image 2-3 to 100% and gradually decreases the gain to 0% toward the right end. The edge blending in the projector 1000-3 is similar to that of the projector 1000-2.
The edge blending in the projector 1000-4 is processing that sets the gain at the left end of the primary image 4 to 0% and gradually increases the gain to 100% toward the right end of the area of the primary image 4 which the secondary image 3-4 overlaps.
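The gain ramps described for the overlap areas can be sketched as follows. A linear ramp is assumed here purely for illustration; the actual edge blending may use a different curve:

```python
def blend_gain(x, overlap, rising=True):
    """Gain (0.0 to 1.0) at horizontal position x (0-based) within an
    overlap area `overlap` pixels wide. rising=True ramps 0% -> 100%
    (left end of a primary image); rising=False ramps 100% -> 0%
    (right end of a secondary image). Linear ramp assumed."""
    t = x / (overlap - 1)
    return t if rising else 1.0 - t

# Complementary ramps keep the summed gain constant across the seam.
overlap = 256
total = blend_gain(100, overlap, True) + blend_gain(100, overlap, False)
```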
In this embodiment, the projectors 1000-1 to 1000-3 receive the secondary image data from the adjacent projectors and generate MP image data including the primary and secondary images. Alternatively, as illustrated in
On the other hand, the HVMP system according to this embodiment illustrated in
The image data output from the video source device 2000 to each projector is, for example, 4K (3840×2160 pixels) image data in which the original image data of 11520×6480 pixels is divided into 3 horizontal images×3 vertical images. Image data 1 to 9 are input as INA to the I/O units 100 in the projectors 1000-1 to 1000-9, respectively. Image data 2, 3, 5, 6, 8, and 9 output as OUTA from the I/O units 100 in the projectors 1000-2, 1000-3, 1000-5, 1000-6, 1000-8, and 1000-9 are input as INB to the I/O units 100 in the projectors 1000-1, 1000-2, 1000-4, 1000-5, 1000-7, and 1000-8, respectively. Image data 4 to 9 output as OUTA from the I/O units 100 in the projectors 1000-4 to 1000-9 are input as INC to the I/O units 100 in the projectors 1000-1 to 1000-6, respectively.
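The wiring above follows a regular pattern: each projector receives its own image data as INA, its right neighbor's data as INB, and its lower neighbor's data as INC. A sketch computing that routing for a grid (hypothetical helper; `None` marks an unused port):

```python
def grid_routing(cols=3, rows=3):
    """Map each projector index (1-based, row-major) to the image data
    numbers fed to its INA/INB/INC ports."""
    routing = {}
    for idx in range(1, cols * rows + 1):
        r, c = divmod(idx - 1, cols)
        inb = idx + 1 if c < cols - 1 else None     # right neighbor's data
        inc = idx + cols if r < rows - 1 else None  # lower neighbor's data
        routing[idx] = {"INA": idx, "INB": inb, "INC": inc}
    return routing

routes = grid_routing()
```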
The image processing unit 200 in the projector 1000-1 adds a secondary image 1-2 corresponding to part (256×2160 pixels) on the left end side of the image data 2 to the right end of the primary image (3840×2160 pixels) 1 corresponding to the image data 1. The image processing unit 200 adds a secondary image 1-4 corresponding to part (3840×240 pixels) on the upper end side of the image data 4 to the lower end of the primary image 1. The MP image data is thus generated. The image processing units 200 in the projectors 1000-2, 1000-4, and 1000-5 similarly generate MP image data including secondary images at the right and lower ends. The image processing units 200 in the projectors 1000-3 and 1000-6 generate MP image data including a secondary image at the lower end. The image processing units 200 in the projectors 1000-7 and 1000-8 generate MP image data including a secondary image at the right end. The image processing unit 200 in the projector 1000-9 generates MP image data of only the primary image.
In the projector 1000-1, the I/O selector 201 inputs the input A from the input ICA 101 that has received INA, the input B from the input ICB 103 that has received INB, and the input C from the input ICC 105 that has received INC to the memory controller 202. The memory controller 202 writes the primary image data (1) as the input A in the line memory 1 and the secondary horizontal image data (2) as the input B in the line memory 2, as illustrated in
The memory controller 202 reads the image data (1-4) for a predetermined number of lines (240 lines) of the secondary vertical image data from the DDR 205. The read secondary vertical image data (1-4) is added to the primary image data. Thereby, the pre-projection image data (MP image data) having the primary image (1), the secondary horizontal image (1-2), and the secondary vertical image (1-4) is generated and written in DDR 205. The pre-projection image data is image data of 4096 horizontal pixels×2400 vertical pixels.
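Combining the horizontal and vertical secondary images can be sketched as follows (rows as strings, one character per pixel). This is an illustrative simplification: the 256×240 bottom-right corner, which in the described system would relate to the diagonal neighbor's data, is simply filled with blank pixels here, and `make_hv_mp` is a hypothetical helper:

```python
def make_hv_mp(primary, right_nb=None, below_nb=None, sh=256, sv=240):
    """Append the left `sh` columns of the right neighbor and the top
    `sv` lines of the lower neighbor to the primary image (sketch)."""
    rows = list(primary)
    if right_nb is not None:
        # secondary horizontal image on the right end of every row
        rows = [p + n[:sh] for p, n in zip(rows, right_nb)]
    if below_nb is not None:
        width = len(rows[0])
        # secondary vertical image at the bottom, corner padded with blank
        rows += [(n + "0" * sh)[:width] for n in below_nb[:sv]]
    return rows

primary = ["1" * 3840] * 2160   # image data 1 (INA)
right = ["2" * 3840] * 2160     # image data 2 (INB)
below = ["4" * 3840] * 2160     # image data 4 (INC)
mp = make_hv_mp(primary, right, below)   # 4096 x 2400 pre-projection data
```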
The primary image data (PJ1/PJ4 primary image input data) are input to PJ1 and PJ4 at timings in synchronization with the primary image input Vsync (vertical synchronization signal). PJ4 primary image input data input to PJ4 via the input ICA illustrated in
At this time, in PJ4, the PJ4 secondary image OUT data is input to PJ1 as PJ1 secondary image input data at a timing delayed from the PJ1/PJ4 primary image input data. The memory controller 202 in PJ1 writes the same frame (or a predetermined area) of the PJ1 primary image input data and the PJ1 secondary image input data in the line memory (DDR 205) at the delayed timing. At the timing of the next and subsequent frames, data corresponding to 2160 lines of PJ1 primary image data out of the PJ1 primary image input data and data corresponding to 240 lines of PJ1 secondary image data out of the PJ1 secondary image input data are output to the scaler formatter 203. The MP image data is thus output via the scaler formatter 203 and the warp circuit 204.
The projectors 1000-2, 1000-3, 1000-4, 1000-5, and 1000-6 also generate the MP image data through input/output of image data and writing in the DDR 205 similar to PJ1. However, the projectors 1000-3 and 1000-6 do not handle secondary horizontal image data. The projectors 1000-7 and 1000-8 generate MP image data through input/output of image data and writing in the DDR 205 similar to PJ1 described in
The projectors 1000-1 to 1000-9 project and display projection images corresponding to the MP image data generated by them, respectively, as illustrated in
The edge blending in the projector 1000-2 is processing that sets the gain at the left end of the primary image 2 to 0% and gradually increases the gain to 100% toward the right end of the area of the primary image 2 which the secondary image 1-2 overlaps. The second edge blending in the projector 1000-2 is processing that sets the gain at the left end of the secondary image 2-3 to 100% and gradually decreases the gain to 0% toward the right end. The third edge blending in the projector 1000-2 is processing that sets the gain at the upper end of the secondary image 2-5 to 100% and gradually decreases the gain to 0% toward the lower end. The edge blending in the projector 1000-5 is similar to that of the projector 1000-2.
The edge blending in the projector 1000-3 is processing that sets the gain at the left end of the primary image 3 to 0% and gradually increases the gain to 100% toward the right end of the area of the primary image 3 which the secondary image 2-3 overlaps. Other edge blending in the projector 1000-3 is processing that sets the gain at the upper end of the secondary image 3-6 to 100% and gradually decreases the gain to 0% toward the lower end. The edge blending in the projector 1000-6 is similar to that of the projector 1000-3.
The edge blending in the projector 1000-7 is processing that sets the gain at the left end of the secondary horizontal image 7-8 to 100% and gradually decreases the gain to 0% toward the right end. The edge blending in the projector 1000-8 is processing that includes the edge blending of the projector 1000-7, further sets the gain at the left end of the primary image 8 to 0%, and gradually increases the gain to 100% toward the right end of the area of the primary image 8 which the secondary image 7-8 overlaps.
The edge blending in the projector 1000-9 is processing that sets the gain at the left end of the primary image 9 to 0% and gradually increases the gain to 100% toward the right end of the area of the primary image 9 which the secondary image 8-9 overlaps.
A flowchart of
In step S103, the control unit 30 makes a variety of types of settings for image projection to the I/O unit 100, the image processing unit 200, and the display data generating unit 300 based on the various setting data stored in the ROM 20. In a case where the variety of types of settings are completed, the flow proceeds to step S105 to start image projection.
In step S107, the control unit 30 determines whether or not an operation signal from the operation unit 10 has been detected. In a case where the operation signal has been detected, the flow proceeds to step S109 to display a menu using the OSD.
In step S111, the control unit 30 determines whether or not the user has selected the MP setting from the menu. In a case where the MP setting has not been selected, the flow proceeds to step S123 to display a menu for another setting using the OSD, and the flow proceeds to step S117. On the other hand, in a case where the MP setting has been selected, the flow proceeds to step S113 to display an MP setting menu through the OSD, and enters MP setting processing in step S115. The MP setting processing will be described below.
In a case where the flow proceeds from step S115 or step S123 to step S117, the control unit 30 determines whether or not to continue menu setting, and in a case where it is determined that the setting is to be continued, the flow returns to step S107. In a case where it is determined that the setting is not to be continued, the flow proceeds to step S119 to determine whether the power switch has been turned off. When the power switch is turned off, the flow proceeds to step S121 to end the image projection and this processing.
A flowchart of
In a case where HMP is selected as the MP mode in step S203, the flow proceeds to step S215 to display the secondary horizontal (SH) pixel setting as illustrated in
In a case where VMP is selected as the MP mode, the flow proceeds to step S221 to display secondary vertical (SV) pixel setting as illustrated in
In a case where HVMP is selected as the MP mode, the control unit 30 proceeds to step S205, and performs the secondary HV pixel setting display as illustrated in
In step S227, the control unit 30 sets the number of pixels set in steps S217 and S223, or S207 and S211 to the image processing unit 200, and writes the number of pixels in the ROM 20 in the next step S229.
This embodiment can provide a projector that can generate MP image data. Therefore, multi-projection with edge blending can be performed without using a multi-projection image generator.
The secondary image data input from the adjacent projector or video source device in each projector may be part of the primary image data in the adjacent projector (part added as the secondary image to the primary image).
The image projection processing executed by the projector 1000-1 (control unit 30) as the main projector is the same as that illustrated in
In step S301, the control unit 30 causes the DDR 205 to take in the primary image data input to INA and the SH image data input to INB at the same timing.
In step S303, the control unit 30 compares the primary image data and the SH image data taken in the DDR 205, and identifies the same image area, that is, an overlap area. The control unit 30 detects the number of horizontal pixels (overlap amount) in the overlap area. In the projector 1000-1, the overlap area occurs only if the imaging ranges of the camera 5000-1 and the camera 5000-2 partially overlap each other.
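The comparison in step S303 amounts to finding how many right-end columns of the primary image data match the left-end columns of the SH image data. A sketch using exact matching (the actual comparison may tolerate differences between the two camera images; rows are modeled as strings, and all names and data are hypothetical):

```python
def detect_overlap(primary, secondary, max_check=512):
    """Return the number of horizontal pixels for which the right end of
    the primary image equals the left end of the secondary image."""
    width = len(primary[0])
    best = 0
    for w in range(1, min(max_check, width) + 1):
        if all(p[width - w:] == s[:w] for p, s in zip(primary, secondary)):
            best = w
    return best

# Two "camera" images of a 180-pixel-wide scene, shifted by 80 pixels,
# share their middle 20 pixels (hypothetical data).
scene = "".join(chr(33 + i % 90) for i in range(180))
primary = [scene[:100]] * 4
secondary = [scene[80:]] * 4
overlap = detect_overlap(primary, secondary)
```

With the predetermined amount of 10% of the primary width (10 pixels here), this 20-pixel overlap meets the condition of step S305, so the flow would proceed to step S307.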
In step S305, the control unit 30 determines whether or not the overlap amount is equal to or larger than a predetermined amount (for example, 10% of the primary image). In a case where the overlap amount is smaller than the predetermined amount, the flow proceeds to S315, and in a case where the overlap amount is equal to or larger than the predetermined amount, the flow proceeds to S307.
In steps S315 and S317, for a small overlap amount, the control unit 30 causes the user to select the number of horizontal pixels of the SH image, as in steps S215 and S217 of
On the other hand, in step S307, the control unit 30 causes the OSD to display a menu for allowing the user to select one of the automatic setting and the manual setting of the number of SH pixels of the secondary horizontal image as illustrated in
Here, as illustrated in
On the other hand, if there is a sufficient overlap amount between the projection image (1) of the projector 1000-1 and the projection image (2) of the projector 1000-2 but the manual setting has been selected in step S307, the flow proceeds to step S315 to cause the user to select the number of horizontal pixels of the secondary horizontal image.
In step S313, the control unit 30 writes the setting made in step S311 or the number of horizontal pixels set in step S317 in the ROM 20, and ends this processing.
The above embodiment can provide an image projection apparatus that can generate projection image data for multi-projection.
Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer-executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer-executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer-executable instructions. The computer-executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the disclosure has been described with reference to embodiments, it is to be understood that the disclosure is not limited to the disclosed embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2022-018316, filed on Feb. 8, 2022, which is hereby incorporated by reference herein in its entirety.