1. Field of the Invention
The present invention relates to a projection device, an image correction method, and a computer-readable recording medium.
2. Description of the Related Art
A projection device such as a projector device is known which drives display elements based on an input image signal and projects an image relating to the image signal onto a projection face of a projection medium such as a screen or a wall face. In such a projection device, in a case where the projection image is projected not in a state in which the optical axis of the projection lens is perpendicular to the projection face but in a state in which the optical axis is inclined with respect to the projection face, a so-called trapezoidal distortion occurs in which a projection image that would originally be projected in an approximately rectangular shape is displayed distorted into a trapezoidal shape on the projection face.
Accordingly, conventionally, a trapezoidal distortion correction (keystone correction) is performed in which an image that is a projection target is converted into a trapezoidal shape formed in the direction opposite to that of the trapezoidal distortion appearing in the projection image on the projection face, whereby a projection image having an approximately rectangular shape without any distortion is displayed on the projection face.
For example, Japanese Patent Application Laid-open No. 2004-77545 discloses a technology for projecting an excellent video, for which a trapezoidal distortion correction has been appropriately performed, onto a projection face of a projector in a case where the projection face is either a wall face or a ceiling.
In such a conventional technology, when the trapezoidal distortion correction (keystone correction) is performed, an image is converted into a trapezoidal shape formed in the direction opposite to that of the trapezoidal distortion generated in the projection image according to the projection direction, and the converted image is input to a display device, whereby the keystone correction is performed. Accordingly, an image having a number of pixels smaller than the number of pixels that the display device can originally display is input to the display device in the trapezoidal shape formed in the opposite direction, and the projection image is displayed in an approximately rectangular shape on the projection face.
In the conventional technology described above, in order not to display the area at the periphery of the projection image, in other words, the differential area on the projection face between the area of the projection image of a case where no correction is made and the area of the projection image after the correction, image data corresponding to black is input to the display device, or the display device is controlled not to be driven. Accordingly, there are problems in that the pixel area of the display device is not effectively used and the brightness of the actual projection area decreases.
Meanwhile, recently, in accordance with the wide use of high-resolution digital cameras, the resolution of video content has improved, and thus there are cases where the resolution of the video content is higher than the resolution of a display device. For example, in a projection device such as a projector that supports inputs up to full HD of 1920 pixels×1080 pixels for a display device having a resolution of 1280 pixels×720 pixels, either the input image is scaled in a stage prior to the display device so as to match the resolution such that the whole input image can be displayed on the display device, or a partial area of the input image that corresponds to the resolution of the display device is cut out and displayed on the display device without such scaling.
Even in such a case, when projection is performed in a state in which the optical axis of the projection lens is inclined with respect to the projection face, a trapezoidal distortion occurs, and similar problems arise when the trapezoidal distortion correction is performed.
It is an object of the present invention to at least partially solve the problems in the conventional technology.
There is provided a projection device that includes a projection unit that converts input image data into light and projects a converted image as a projection image onto a projection face with a predetermined view angle; a correction control unit that calculates a correction amount used for eliminating a geometric distortion occurring in the projection image according to a projection direction and determines a cut out range including also an area other than an area of the image data after the geometric distortion correction estimated according to the correction amount based on the correction amount; and a correction unit that generates cut out image data acquired by cutting out an area of the cut out range from the input image data and performs a geometric distortion correction for the cut out image data based on the correction amount.
There is also provided a projection device that includes a projection unit that converts input image data into light and projects a converted image as a projection image onto a projection face with a predetermined view angle; a projection control unit that performs control changing a projection direction of the projection image using the projection unit; a projection angle deriving unit that derives a projection angle of the projection direction; a correction control unit that calculates a correction amount used for correcting a geometric distortion occurring in the projection image according to the projection direction based on the projection angle and the view angle and determines a cut out range including also an area other than an area of the image data after the geometric distortion correction estimated according to the correction amount based on the correction amount; and a correction unit that generates cut out image data acquired by cutting out an area of the cut out range from the input image data and performs a geometric distortion correction for the cut out image data based on the correction amount.
There is further provided an image correction method executed by a projection device, the image correction method including converting input image data into light and projecting a converted image as a projection image onto a projection face with a predetermined view angle using a projection unit; calculating a correction amount used for eliminating a geometric distortion occurring in the projection image according to a projection direction and determining a cut out range including also an area other than an area of the image data after the geometric distortion correction estimated according to the correction amount based on the correction amount; and generating cut out image data acquired by cutting out an area of the cut out range from the input image data and performing a geometric distortion correction for the cut out image data based on the correction amount.
There is also provided an image correction method executed by a projection device, the image correction method including converting input image data into light and projecting a converted image as a projection image onto a projection face with a predetermined view angle using a projection unit; performing control changing a projection direction of the projection image using the projection unit; deriving a projection angle of the projection direction; calculating a correction amount used for correcting a geometric distortion occurring in the projection image according to the projection direction based on the projection angle and the view angle and determining a cut out range including also an area other than an area of the image data after the geometric distortion correction estimated according to the correction amount based on the correction amount; and generating cut out image data acquired by cutting out an area of the cut out range from the input image data and performing a geometric distortion correction for the cut out image data based on the correction amount.
There is further provided a computer readable recording medium that stores therein a computer program causing a computer to execute an image correction method, the method including converting input image data into light and projecting a converted image as a projection image onto a projection face with a predetermined view angle using a projection unit; calculating a correction amount used for eliminating a geometric distortion occurring in the projection image according to a projection direction and determining a cut out range including also an area other than an area of the image data after the geometric distortion correction estimated according to the correction amount based on the correction amount; and generating cut out image data acquired by cutting out an area of the cut out range from the input image data and performing a geometric distortion correction for the cut out image data based on the correction amount.

The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
Hereinafter, a projection device, an image correction method and a computer-readable recording medium according to embodiments will be described in detail with reference to the accompanying drawings. Specific numerical values, external configurations, and the like represented in the embodiments are merely examples for easy understanding of the present invention but are not for the purpose of limiting the present invention unless otherwise mentioned. In addition, elements not directly relating to the present invention are not described in detail and are not presented in the drawings.
The drum unit 10 is supported to be rotatable by a rotation shaft, which is not illustrated in the figure, that is disposed on the inner side of side plate portions 21a and 21b that are parts of the base 20 and is configured by a bearing and the like. Inside the drum unit 10, a light source, a display element that modulates light emitted from the light source based on image data, a drive circuit that drives the display element, an optical engine unit that includes an optical system projecting the light modulated by the display element to the outside, and a cooling means configured by a fan and the like used for cooling the light source and the like are disposed.
In the drum unit 10, window portions 11 and 13 are disposed. The window portion 11 is disposed such that light projected from a projection lens 12 of the optical system described above is emitted to the outside. In the window portion 13, a distance sensor deriving a distance up to a projection medium, for example, using an infrared ray, an ultrasonic wave, or the like is disposed. In addition, the drum unit 10 includes an intake/exhaust hole 22a that performs air in-taking/exhausting for heat rejection using a fan.
Inside the base 20, various substrates of the circuit unit, a power supply unit, a drive unit used for driving the drum unit 10 to be rotated, and the like are disposed. The rotary drive of the drum unit 10 that is performed by this drive unit will be described later. On the first face of the base 20, an operation unit 14 used for a user inputting various operations for controlling the projector device 1 and a reception unit 15 that receives a signal transmitted by a user from a remote control commander not illustrated in the figure when the projector device 1 is remotely controlled are disposed. The operation unit 14 includes various operators receiving user's operation inputs, a display unit used for displaying the state of the projector device 1, and the like.
On the first face side and the second face side of the base 20, intake/exhaust holes 16a and 16b are respectively disposed. Thus, even in a case where the intake/exhaust hole 22a of the drum unit 10 that is driven to be rotated takes a posture facing the base 20 side, air in-taking or air exhaust can be performed so as not to decrease the heat rejection efficiency of the inside of the drum unit 10. In addition, an intake/exhaust hole 17 disposed on the side face of the casing performs air in-taking and air exhaust for heat rejection of the circuit unit.
Rotary Drive of Drum Unit
On one face of the drum 30, a gear 35 is disposed on the circumference. The drum 30 is driven to be rotated through the gear 35 by the drive unit 32 disposed in the support portion 31b. Here, protrusions 46a and 46b are disposed in the inner circumference portion of the gear 35 so as to detect a start point and an end point of the rotation operation of the drum 30.
In addition, photo interrupters 51a and 51b are disposed on the support portion 31b. The photo interrupters 51a and 51b respectively detect the protrusions 46b and 46a disposed in the inner circumference portion of the gear 35. Output signals of the photo interrupters 51a and 51b are supplied to a rotation control unit 104 to be described later. In the embodiment, by detecting the protrusion 46b using the photo interrupter 51a, the rotation control unit 104 determines that the posture of the drum 30 is a posture arriving at an end point of the rotation operation. In addition, by detecting the protrusion 46a using the photo interrupter 51b, the rotation control unit 104 determines that the posture of the drum 30 is a posture arriving at a start point of the rotation operation.
Hereinafter, a direction in which the drum 30 rotates from a position at which the protrusion 46a is detected by the photo interrupter 51b to a position at which the protrusion 46b is detected by the photo interrupter 51a through a longer arc in the circumference of the drum 30 will be represented as a forward direction. In other words, the rotation angle of the drum 30 increases toward the forward direction.
In addition, the photo interrupters 51a and 51b and the protrusions 46a and 46b are arranged such that an angle formed with the rotation shaft 36 is 270° between the detection position at which the photo interrupter 51b detects the protrusion 46a and the detection position at which the photo interrupter 51a detects the protrusion 46b.
For example, in a case where a stepping motor is used as the motor 40, by specifying the posture of the drum 30 based on timing at which the protrusion 46a is detected by the photo interrupter 51b and the number of drive pulses used for driving the motor 40, a projection angle according to the projection lens 12 can be acquired.
Here, the motor 40 is not limited to the stepping motor; for example, a DC motor may be used. In such a case, for example, a configuration using a code wheel 44 and photo reflectors 50a and 50b, described below, may be used.
In the code wheel 44, for example, a transmission portion 45a and a reflection unit 45b having phases changing in the radial direction are disposed. By receiving reflected light having each phase from the code wheel 44 using the photo reflectors 50a and 50b, the rotation speed and the rotation direction of the gear 43 can be detected. Then, based on the rotation speed and the rotation direction of the gear 43 that have been detected, the rotation speed and the rotation direction of the drum 30 are derived. Based on the rotation speed and the rotation direction of the drum 30 that have been derived and a result of the detection of the protrusion 46b that is performed by the photo interrupter 51a, the posture of the drum 30 is specified, whereby the projection angle according to the projection lens 12 can be acquired.
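As a reference, the following is a minimal Python sketch of how a rotation step and its direction can be decoded from two such signals; it assumes the photo reflectors 50a and 50b output binary signals in quadrature (a 90-degree phase offset), and the function name and sampling scheme are illustrative assumptions, not part of the device configuration.

    def quadrature_step(prev_ab, cur_ab):
        # Gray-code order of the (A, B) states for the forward direction,
        # as produced by the pattern of the code wheel 44 (assumed).
        order = [(0, 0), (1, 0), (1, 1), (0, 1)]
        d = (order.index(cur_ab) - order.index(prev_ab)) % 4
        if d == 1:
            return +1  # one step in the forward direction
        if d == 3:
            return -1  # one step in the reverse direction
        return 0       # no movement, or an invalid (missed) transition

    # Steps accumulated per unit time give the rotation speed of the gear 43,
    # and the sign of the accumulated steps gives the rotation direction.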
In the configuration as described above, a state in which the projection direction according to the projection lens 12 is in the vertical direction, and the projection lens 12 is completely hidden by the base 20 will be referred to as a housed state (or housing posture).
Hereinafter, unless otherwise mentioned, the “direction of the drum unit 10” and the “angle of the drum unit 10” have the same meanings as the “projection direction according to the projection lens 12” and the “projection angle according to the projection lens 12”.
For example, when the projector device 1 is started up, the drive unit 32 starts to rotate the drum unit 10 such that the projection direction according to the projection lens 12 faces the above-described first face. Thereafter, the drum unit 10, for example, is assumed to rotate up to a position at which the direction of the drum unit 10, in other words, the projection direction according to the projection lens 12, is horizontal on the first face side, and to temporarily stop there. The projection angle of the projection lens 12 in a case where the projection direction according to the projection lens 12 is horizontal on the first face side is defined as a projection angle of 0°.
For example, at the 0° posture, it is assumed that image data is input, and the light source is turned on. In the drum unit 10, light emitted from the light source is modulated based on the image data by the display element driven by the drive circuit and is incident to the optical system. Then, the light modulated based on the image data is projected from the projection lens 12 in a horizontal direction and is emitted to the projection face of the projection medium such as a screen or a wall face.
By operating the operation unit 14 and the like, the user can rotate the drum unit 10 around the rotation shaft 36 as its center while projection is performed from the projection lens 12 based on the image data. For example, by rotating the drum unit 10 from the 0° posture in the forward direction to a rotation angle of 90° (90° posture), light emitted from the projection lens 12 can be projected vertically upward with respect to the bottom face of the base 20.
The drum unit 10 can be rotated further in the forward direction from the 90° posture. In such a case, the projection direction of the projection lens 12 changes from the vertically upward direction with respect to the bottom face of the base 20 to the direction of the second face side.
The projector device 1 according to this embodiment rotates the drum unit 10, for example, as in States 501 to 503, while projection of an image is being performed, thereby changing (moving) the projection area of the image data in accordance with the projection angle according to the projection lens 12. The change in the projection posture will be described in detail later. In this way, changes in the content and the projection position of the projected image on the projection medium can be associated with changes in the content and the position of the image area cut out as an image to be projected from the whole image area relating to the input image data. Accordingly, a user can intuitively perceive which area of the whole image area relating to the input image data is being projected based on the position of the projected image on the projection medium and can intuitively perform an operation of changing the content of the projected image.
In addition, the optical system includes an optical zoom mechanism, and the size of the projection image projected onto the projection medium can be enlarged or reduced by operating the operation unit 14. Hereinafter, this enlarging or reducing of the size of the projection image by the optical system may be simply referred to as "zooming". For example, in a case where the optical system performs zooming, the projection image is enlarged or reduced with the optical axis of the optical system at the time point of performing zooming as its center.
When the user ends the projection of the projection image and stops the projector device 1 by performing an operation for instructing the stop of the projector device 1 on the operation unit 14, first, rotation control is performed such that the drum unit 10 is returned to the housed state. When the drum unit 10 is positioned facing the vertical direction and the return of the drum unit 10 into the housed state is detected, the light source is turned off, and, after a predetermined time required for cooling the light source, the power is turned off. By turning the power off after the drum unit 10 is positioned facing the vertical direction, the projection lens 12 can be prevented from getting dirty while not in use.
Functional Configuration of Projector Device 1
Next, a configuration for realizing each function or operation of the projector device 1 according to this embodiment, as described above, will be described.
The optical engine unit 110 includes a light source 111, a display element 114, and a projection lens 12. The light source 111, for example, includes three light emitting diodes (LEDs) respectively emitting red (R) light, green (G) light, and blue (B) light. Luminous fluxes of colors RGB that are emitted from the light source 111 irradiate the display element 114 through an optical system not illustrated in the figure.
In description presented below, the display element 114 is assumed to be a transmission-type liquid crystal display device and, for example, to have a size of horizontal 1280 pixels×vertical 720 pixels. However, the size of the display element 114 is not limited to this example. The display element 114 is driven by a drive circuit not illustrated in the figure and modulates luminous fluxes of the colors RGB based on image data and emits the modulated luminous fluxes. The luminous fluxes of the colors RGB that are emitted from the display element 114 and are modulated based on the image data are incident to the projection lens 12 through the optical system not illustrated in the figure and are projected to the outside of the projector device 1.
In addition, the display element 114, for example, may be configured by a reflection-type liquid crystal display device using liquid crystal on silicon (LCOS) or a digital micromirror device (DMD). In such a case, the projector device is configured by an optical system and a drive circuit that correspond to the used display element.
The projection lens 12 includes a plurality of lenses that are combined together and a lens driving unit that drives the lenses according to a control signal. For example, the lens driving unit drives a lens included in the projection lens 12 based on a result of distance measurement that is acquired based on an output signal output from a distance sensor disposed in the window portion 13, thereby performing focus control. In addition, the lens driving unit changes the view angle by driving the lens in accordance with a zoom instruction supplied from the view angle control unit 106 to be described later, thereby controlling the optical zoom.
As described above, the optical engine unit 110 is disposed inside the drum unit 10 that can be rotated by 360° by the rotation mechanism unit 105. The rotation mechanism unit 105 includes the drive unit 32 and the gear 35, which is a component on the drum unit 10 side, described above.
The input control unit 119 receives a user operation input from the operation unit 14 as an event. The control unit 120 performs overall control of the projector device 1.
The rotation control unit 104, for example, receives an instruction according to a user operation on the operation unit 14 through the input control unit 119 and instructs the rotation mechanism unit 105 based on the instruction according to the user operation. The rotation mechanism unit 105 includes the drive unit 32 and the photo interrupters 51a and 51b described above. The rotation mechanism unit 105 controls the drive unit 32 according to an instruction supplied from the rotation control unit 104, thereby controlling the rotation operation of the drum unit 10 (the drum 30). For example, the rotation mechanism unit 105 generates a drive pulse according to an instruction supplied from the rotation control unit 104 and drives the motor 40 that is, for example, a stepping motor.
Meanwhile, the outputs of the photo interrupters 51a and 51b described above and the drive pulse 122 used for driving the motor 40 are supplied from the rotation mechanism unit 105 to the rotation control unit 104. The rotation control unit 104, for example, includes a counter and counts the pulse number of the drive pulses 122. The rotation control unit 104 acquires the timing of the detection of the protrusion 46a based on the output of the photo interrupter 51b and resets the pulse number counted by the counter at that timing. Based on the pulse number counted by the counter, the rotation control unit 104 can sequentially acquire the angle of the drum unit 10 (drum 30), thereby acquiring the posture of the drum unit 10 (in other words, the projection angle of the projection lens 12). The projection angle of the projection lens 12 is supplied to the geometric distortion correction unit 100. In this way, in a case where the projection direction of the projection lens 12 is changed, the rotation control unit 104 can derive the angle between the projection direction before the change and the projection direction after the change.
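The following is a minimal Python sketch of the pulse counting described above; the pulses-per-degree value and the angle assigned to the start point are assumptions made for illustration, as the text does not specify them.

    class RotationControl:
        PULSES_PER_DEGREE = 100.0  # assumed drive ratio (not specified)
        START_POINT_ANGLE = -45.0  # assumed drum angle when 46a is detected

        def __init__(self):
            self.pulse_count = 0

        def on_start_point_detected(self):
            # The photo interrupter 51b detects the protrusion 46a:
            # reset the counter at this timing.
            self.pulse_count = 0

        def on_drive_pulse(self, direction):
            # direction is +1 for the forward direction, -1 otherwise.
            self.pulse_count += direction

        def projection_angle(self):
            # Posture of the drum unit 10, in other words, the projection
            # angle of the projection lens 12.
            return self.START_POINT_ANGLE + self.pulse_count / self.PULSES_PER_DEGREE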
The view angle control unit 106, for example, receives an instruction according to a user operation for the operation unit 14 through the input control unit 119 and gives a zoom instruction, in other words, an instruction for changing the view angle to the projection lens 12 based on an instruction according to the user operation. The lens driving unit of the projection lens 12 drives the lens based on the zoom instruction, thereby performing zoom control. The view angle control unit 106 supplies the zoom instruction and a view angle derived based on a zoom magnification relating to the zoom instruction and the like to the geometric distortion correction unit 100.
The image control unit 103 receives the input image data 121 as input and stores the input image data in the image memory 101 with a designated output resolution. The image control unit 103 includes an output resolution control unit 1031 and a memory controller 1032.
The output resolution control unit 1031 receives resolution from the geometric distortion correction unit 100 through the extended function control unit 109 and outputs the received resolution to the memory controller 1032 as output resolution.
The memory controller 1032 receives the input image data 121 of 1920 pixels×1080 pixels, which is a still image or a moving image, as input and stores it in the image memory 101 with the output resolution input from the output resolution control unit 1031.
The image memory 101 stores the input image data 121 in units of images. In other words, in a case where the input image data 121 is still image data, data is stored for each still image, and in a case where the input image data 121 is moving image data, data is stored for each frame image configuring the moving image data. The image memory 101, for example, in compliance with the standards of digital high vision broadcasting, can store one or a plurality of frame images of 1920 pixels×1080 pixels.
In addition, it is preferable that the input image data 121 is shaped in advance into a size corresponding to the storage unit of the image data in the image memory 101 and then input to the projector device 1. In this example, the input image data 121 is shaped into 1920 pixels×1080 pixels and then input to the projector device 1. However, the configuration is not limited thereto; an image shaping unit that shapes the input image data 121 input with an arbitrary size into image data of a size of 1920 pixels×1080 pixels may be disposed in a stage previous to the memory controller 1032 in the projector device 1.
The geometric distortion correction unit 100 calculates a first correction coefficient relating to a horizontal correction of the geometric distortion and a second correction coefficient relating to a vertical correction, acquires a cut out range, cuts out an image of an area of the cut out range from the input image data 121 stored in the image memory 101, performs a geometric distortion correction and image processing for the image, and outputs a resultant image to the display element 114.
The geometric distortion correction unit 100 includes a correction control unit 108 and a memory controller 107.
The correction control unit 108 receives a projection angle 123 from the rotation control unit 104 as input and receives a view angle 125 from the view angle control unit 106 as input. Then, the correction control unit 108 calculates the first correction coefficient and the second correction coefficient used for eliminating a geometric distortion occurring in the projection image according to the projection direction based on the projection angle 123 and the view angle 125 that have been input and outputs the first correction coefficient and the second correction coefficient to the memory controller 107.
In addition, the correction control unit 108 determines a cut out range from the input image data such that the size of the image data after the geometric distortion correction includes a displayable size of the display device based on the projection angle 123, the view angle 125, the first correction coefficient, and the second correction coefficient and outputs the determined cut out range to the memory controller 107 and the extended function control unit 109. At this time, the correction control unit 108 designates a cut out area of the image data based on the angle of the projection direction of the projection lens 12.
The memory controller 107 cuts out (extracts) an image area of the cut out range determined by the correction control unit 108 from the whole area of a frame image relating to the image data stored in the image memory 101 and outputs the cut out image area as image data.
In addition, the memory controller 107 performs a geometric distortion correction for the image data cut out from the image memory 101 by using the first correction coefficient and the second correction coefficient and outputs the image data after the geometric distortion correction to the image processing unit 102. Here, the first correction coefficient, the second correction coefficient, and the geometric distortion correction will be described in detail later.
The image data output from the memory controller 107 is supplied to the image processing unit 102. The image processing unit 102, for example, by using a memory not illustrated in the figure, performs image processing for the supplied image data and outputs the image data for which the image processing has been performed to the display element 114 as image data of 1280 pixels×720 pixels. The image processing unit 102 outputs the image data for which the image processing has been performed based on timing represented in a vertical synchronization signal 124 supplied from a timing generator not illustrated in the figure. The image processing unit 102, for example, performs a size converting process for the image data supplied from the memory controller 107 such that the size matches the size of the display element 114. In addition, other than the process, the image processing unit 102 may perform various kinds of image processing. For example, the image processing unit 102 may perform a size converting process for the image data using a general linear transformation process. In addition, in a case where the size of the image data supplied from the memory controller 107 matches the size of the display element 114, the image data may be directly output.
In addition, a part or the whole of the image may be enlarged through an interpolation filter having a predetermined characteristic by performing interpolation (over-sampling) with the aspect ratio of the image maintained constant; a part or the whole of the image may be reduced by thinning (sub-sampling) the image through a low pass filter according to the reduction rate in order to suppress an aliasing distortion; or the image may maintain its size without passing through a filter.
Furthermore, when an image is projected in an inclined direction, in order to prevent the image from being blurred due to defocusing in a peripheral portion, an edge enhancement process using an operator such as a Laplacian or an edge enhancement process applying one-dimensional filters in the horizontal and vertical directions may be performed. Through this edge enhancement process, the edge of a blurred portion of the projected image can be enhanced.
In addition, in a case where a texture in a peripheral portion of a projected image includes a diagonal line, the image processing unit 102 may shade off the edge jag by mixing in a local halftone or applying a local low pass filter so that the jag is not visually noticeable, whereby the diagonal line can be prevented from being observed as a jagged line.
The image data output from the image processing unit 102 is supplied to the display element 114. Actually, this image data is supplied to the drive circuit that drives the display element 114. The drive circuit drives the display element 114 based on the supplied image data.
The extended function control unit 109 receives a cut out range from the correction control unit 108 as input and outputs resolution including the cut out range to the output resolution control unit 1031 as output resolution.
Cutting Out Process of Image Data
Next, a cutting out process of image data stored in the image memory 101 that is performed by the memory controller 107 according to this embodiment will be described.
In the image memory 101, for example, addresses are set in the vertical direction in units of lines and are set in the horizontal direction in units of pixels. In addition, it is assumed that the address of a line increases from the lower end of an image (screen) toward the upper end thereof, and the address of a pixel increases from the left end of the image toward the right end thereof.
The correction control unit 108, for the memory controller 107, designates addresses of lines q0 and q1 in the vertical direction and designates addresses of pixels p0 and p1 in the horizontal direction as a cut out area of image data 140 of Q lines×P pixels stored in the image memory 101. The memory controller 107 reads lines within the range of the lines q0 and q1 over the pixels p0 and p1 from the image memory 101 in accordance with the designation of the addresses. At this time, as the sequence of reading, for example, it is assumed that the lines are read from the upper end toward the lower end of the image, and the pixels are read from the left end toward the right end of the image. The access control for the image memory 101 will be described in detail later.
The memory controller 107 supplies the image data 141 of the range of the lines q0 and q1 and the pixels p0 and p1, which has been read from the image memory 101, to the image processing unit 102. The image processing unit 102 performs a size conversion process in which the size of an image according to the supplied image data 141 is adjusted to the size of the display element 114. As an example, in a case where the size of the display element 114 is V lines×H pixels, a maximum magnification m satisfying both Equations (1) and (2) represented below is acquired. Then, the image processing unit 102 enlarges the image data 141 with this magnification m and supplies resultant image data to the display element 114.
m×(p1−p0)≦H (1)
m×(q1−q0)≦V (2)
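As an illustration, the maximum magnification m of Equations (1) and (2) can be computed as in the following sketch; the default display size of 1280 pixels×720 lines is taken from the description of the display element 114, and the function name is an assumption.

    def max_magnification(p0, p1, q0, q1, H=1280, V=720):
        # Largest m satisfying m*(p1 - p0) <= H and m*(q1 - q0) <= V.
        return min(H / (p1 - p0), V / (q1 - q0))

    # Example: a cut out range of 640 pixels x 360 lines on a display
    # element of 1280 pixels x 720 lines gives m = 2.0.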
Next, the designation (update) of a cut out area according to the projection angle according to this embodiment will be described.
In the projector device (PJ) 1, the projection position of a case where an image 1310 is projected with a projection angle of 0° onto a projection face 130, which is a projection medium such as a screen, by using the projection lens 12 having a view angle α is assumed to be a position Pos0 corresponding to the luminous flux center of light projected from the projection lens 12. In addition, at the projection angle of 0°, it is assumed that an image according to image data from the S-th line, which is the lower end of an area designated in advance, to the L-th line of the image data stored in the image memory 101 is projected. The area formed by the S-th line to the L-th line includes lines corresponding to the line number ln. In addition, a value representing a line position such as the S-th line or the L-th line, for example, is a value increasing from the lower end toward the upper end of the display element 114 with the line positioned at the lower end of the display element 114 set as the 0-th line.
Here, the line number ln is the number of lines of the maximum effective area of the display element 114. In addition, the view angle α is the angle at which a projection image is viewed in the vertical direction from the projection lens 12 in a case where the image is projected using the maximum effective area of the display element 114 in the vertical direction, in other words, in a case where an image of the line number ln is projected.
The view angle α and the effective area of the display element 114 will be described using a more specific example. The display element 114 is assumed to have a vertical size of 720 lines. For example, in a case where the vertical size of the projection image data is 720 lines, and projection image data is projected using all the lines of the display element 114, the effective area of the display element 114 in the vertical direction has a maximum value of 720 lines (=line number ln). In this case, the view angle α is an angle for viewing 1st to 720th lines of the projection image from the projection lens 12.
In addition, a case may also be considered in which the vertical size of projection image data is 600 lines, and the projection image data is projected using only 600 lines out of the 720 lines (=line number ln) of the display element 114. In such a case, the effective area of the display element 114 in the vertical direction is 600 lines. In this case, of the view angle α corresponding to the maximum effective area, only a portion corresponding to the effective area of the projection image data is used for projection.
The correction control unit 108 instructs the memory controller 107 to cut out and read the S-th line to the L-th line of the image data 140 stored in the image memory 101. Here, in the horizontal direction, all of the image data 140 from the left end to the right end is read. The memory controller 107 sets an area of the S-th line to the L-th line of the image data 140 as a cut out area in accordance with the instruction from the correction control unit 108, reads the image data 141 of the set cut out area, and supplies the read image data to the image processing unit 102.
Next, a case will be described in which the drum unit 10 is rotated, for example, according to a user operation for the operation unit 14, and the projection angle of the projection lens 12 becomes an angle θ. In this embodiment, in a case where the drum unit 10 is rotated, and the projection angle according to the projection lens 12 is changed, the cut out area from the image memory 101 of the image data 140 is changed in accordance with the projection angle θ.
The setting of a cut out area for the projection angle θ will be described more specifically. The cut out area for the image data 140 at the projection angle θ is designated based on the following Equations (3) and (4).
RS=θ×(ln/α)+S (3)
RL=θ×(ln/α)+S+ln (4)
In Equations (3) and (4), a value ln represents the number of lines (for example, the number of lines of the display element 114) included within the projection area. In addition, a value α represents a view angle of the projection lens 12, and a value S represents the position of the line located at the lower end of the cut out area at the 0° posture described above.
In Equations (3) and (4), (ln/α) represents the number of lines per unit angle (including the concept of an approximately averaged number of lines changing in accordance with the shape of the projection face) of a case where an image of the line number ln is projected with the view angle α. Accordingly, θ×(ln/α) represents the number of lines corresponding to the projection angle θ according to the projection lens 12 in the projector device 1. This means that, when the projection angle changes by an angle Δθ, the position of the projection image moves by a distance corresponding to the number of lines {Δθ×(ln/α)} in the projection image. Accordingly, Equations (3) and (4) respectively represent the positions RS and RL of the lines located at the lower end and the upper end of the cut out area of the image data 140 of a case where the projection angle is the angle θ. These correspond to read addresses for the image data 140 on the image memory 101 at the projection angle θ.
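As an illustration, Equations (3) and (4) can be written as the following sketch; the parameter names follow the text, and the example values are arbitrary.

    def cut_out_range(theta, ln_, alpha, S):
        # Lines per unit angle: ln_/alpha (Equations (3) and (4)).
        rs = theta * (ln_ / alpha) + S        # lower-end read address, RS
        rl = theta * (ln_ / alpha) + S + ln_  # upper-end read address, RL
        return rs, rl

    # Example: with ln_ = 720 lines, alpha = 30 degrees, and S = 0, a
    # projection angle of 10 degrees moves the cut out area up by 240 lines.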
In this way, in this embodiment, an address at the time of reading the image data 140 from the image memory 101 is designated in accordance with the projection angle θ. Accordingly, image data 1411 of the image data 140 that is located at a position corresponding to the projection angle θ is read from the image memory 101, and an image 1311 relating to the read image data 1411 is projected to the projection position Pos1 corresponding to the projection angle θ of the projection face 130.
Thus, according to this embodiment, in a case where the image data 140 having a size larger than the size of the display element 114 is projected, a correspondence relation between the position within the projected image and the position within the image data is maintained. In addition, since the projection angle θ is acquired based on a drive pulse of the motor 40 used for driving the drum 30 to be rotated, the projection angle θ can be acquired in a state in which there is substantially no delay with respect to the rotation of the drum unit 10, and the projection angle θ can be acquired without being influenced by the projection image or the surrounding environment.
Next, the setting of a cut out area of a case where optical zooming according to the projection lens 12 is performed will be described. As described above, in the case of the projector device 1, the view angle α of the projection lens 12 is increased or decreased by driving the lens driving unit, whereby optical zooming is performed. An increase in the view angle according to the optical zooming is assumed to be an angle Δ, and the view angle of the projection lens 12 after the optical zooming is assumed to be a view angle (α+Δ).
In such a case, even when the view angle is increased according to the optical zooming, the cut out area for the image memory 101 does not change. In other words, the number of lines included in a projection image according to the view angle α before the optical zooming and the number of lines included in a projection image according to the view angle (α+Δ) after the optical zooming are the same. Accordingly, after the optical zooming, the number of lines included per unit angle is changed from that before the optical zooming.
The setting of a cut out area of a case where optical zooming is performed will be described more specifically.
In a case where optical zooming corresponding to the view angle Δ is performed, when the number of lines designated as a cut out area for the image data 140 is ln, the number of lines included per unit angle is represented by {ln/(α+Δ)}. Accordingly, the cut out area for the image data 140 is designated based on the following Equations (5) and (6). The meaning of each variable in Equations (5) and (6) is common to that in Equations (3) and (4) described above.
RS=θ×{ln/(α+Δ)}+S (5)
RL=θ×{ln/(α+Δ)}+S+ln (6)
Image data 1412 of an area represented in Equations (5) and (6) is read from the image data 140, and an image 1312 relating to the read image data 1412 is projected to a projection position Pos2 of the projection face 130 by the projection lens 12.
In this way, in a case where optical zooming is performed, the number of lines included per unit angle is changed with respect to a case where the optical zooming is not performed, and the amount of change in the number of lines with respect to a change in the projection angle θ is different from that of a case where the optical zooming is not performed. This is a state in which a gain corresponding to the view angle Δ increased according to the optical zooming is changed in the designation of a read address according to the projection angle θ for the image memory 101.
In this embodiment, an address at the time of reading the image data 140 from the image memory 101 is designated in accordance with the projection angle θ and the view angle α of the projection lens 12. In this way, even in a case where optical zooming is performed, the address of the image data 1412 to be projected can be appropriately designated for the image memory 101. Accordingly, even in a case where the optical zooming is performed, in a case where the image data 140 of a size larger than the size of the display element 114 is projected, a correspondence relation between the position within the projected image and the position within the image data is maintained.
Next, a case will be described in which an offset angle θofst is set for the projection angle.
In such a case, for example, a case may be considered in which the offset angle θofst is regarded as the projection angle 0°, and a cut out area for the image memory 101 is designated. By applying Equations (3) and (4) described above, the following Equations (7) and (8) are formed. The meaning of each variable in Equations (7) and (8) is common to that in Equations (3) and (4) described above.
RS=(θ−θofst)×(ln/α)+S (7)
RL=(θ−θofst)×(ln/α)+S+ln (8)
The image data 1413 of the area represented in Equations (7) and (8) is read from the image data 140, and the image 1313 relating to the read image data 1413 is projected to the projection position Pos3 of the projection face 130 by the projection lens 12.
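The three cases given by Equations (3) through (8) can be combined into a single sketch in which the view angle increase Δ due to optical zooming and the offset angle θofst default to 0; this generalization is an illustration and is not stated as such in the text.

    def cut_out_range_general(theta, ln_, alpha, S, delta=0.0, theta_ofst=0.0):
        # Equations (5)-(8): the view angle is enlarged to (alpha + delta)
        # by optical zooming, and theta_ofst is regarded as the projection
        # angle of 0 degrees.
        lines_per_degree = ln_ / (alpha + delta)
        rs = (theta - theta_ofst) * lines_per_degree + S
        return rs, rs + ln_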
Memory Control
Next, access control of the image memory 101 will be described.
In the image data, for each vertical synchronization signal VD, pixels are sequentially transmitted from the left end toward the right end of an image for each line in the horizontal direction on the screen, and lines are sequentially transmitted from the upper end toward the lower end of the image. Hereinafter, a case will be described as an example in which the image data has a size of horizontal 1920 pixels×vertical 1080 pixels (lines) corresponding to the digital high vision standard.
Hereinafter, an example of the access control of a case where the image memory 101 includes four memory areas for which the access control can be independently performed will be described. In other words, the image memory 101 includes a memory Y1 and a memory Y2, each used for writing and reading image data of horizontal 1920 pixels×vertical 1080 pixels (lines), and a memory T1 and a memory T2, each used for writing and reading image data of horizontal 1080 pixels×vertical 1920 pixels (lines).
For every vertical synchronization signal VD, image data D1, D2, D3, D4, D5, D6, . . . each having an image size of 1920 pixels×1080 lines are input to the memory controller 107. Each of the image data D1, D2, . . . is synchronized with the vertical synchronization signal VD and is input after the vertical synchronization signal VD. In addition, the projection angles of the projection lens 12 corresponding to the vertical synchronization signals VD are denoted as projection angles θ1, θ2, θ3, θ4, θ5, θ6, . . . . The projection angle θ is acquired for every vertical synchronization signal VD as above.
First, the image data D1 is input to the memory controller 107. As described above, the projector device 1 according to this embodiment changes the projection angle θ according to the projection lens 12 by rotating the drum unit 10 so as to move the projection position of the projection image and designates a read position for the image data in accordance with the projection angle θ. Accordingly, it is preferable that the image data is longer in the vertical direction. Generally, image data frequently has a horizontal size longer than a vertical size. Thus, for example, it may be considered for a user to rotate the camera by 90° in an imaging process and input image data acquired by the imaging process to the projector device 1.
In other words, an image according to the image data D1, D2, . . . input to the memory controller 107 is assumed to be, similarly to an image 160, an image captured with the camera rotated by 90°.
The memory controller 107 writes the input image data D1 into the memory Y1 at timing WD1 corresponding to the input timing of the image data D1 (timing WD1 illustrated in Chart 213). The memory controller 107 writes the image data D1 into the memory Y1 in the order of input, line by line in the horizontal direction from the upper end toward the lower end of the image.
Next, the memory controller 107 reads the image data D1 written into the memory Y1.
At this time, the memory controller 107 sequentially reads the image data D1 in the vertical direction, pixel by pixel across the lines, with the pixel positioned on the lower left corner of the image set as the reading start pixel. When the pixel positioned at the upper end of the image has been read, pixels are next read in the vertical direction with the pixel on the right side neighboring the previous reading start pixel set as the new reading start pixel. This operation is repeated until the reading of the pixel positioned on the upper right corner of the image is completed.
In other words, the memory controller 107 sequentially reads the image data D1 from the memory Y1, with the vertical direction from the lower end toward the upper end of the image set as the line direction, line by line from the left end toward the right end of the image.
The memory controller 107 sequentially writes the pixels of the image data D1 read from the memory Y1 in this way into the memory T1 toward the line direction for each pixel. As a result, an image acquired by rotating the image according to the image data D1 by 90° is stored in the memory T1.
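The following minimal sketch illustrates this read/write order, with plain Python lists standing in for the memory areas Y1 and T1; the function name and the list representation are assumptions made for illustration.

    def rotate_into_t(y_mem, width, height):
        # y_mem[y][x]: pixel at line y (top to bottom) and position x of Y1.
        t_mem = []
        for x in range(width):  # from the left end toward the right end
            # Read one column from the lower end toward the upper end ...
            line = [y_mem[y][x] for y in range(height - 1, -1, -1)]
            # ... and write it as one line of T1.
            t_mem.append(line)
        return t_mem  # e.g., a 1920 x 1080 input becomes 1080 x 1920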
The memory controller 107 designates an address of the cut out area that is designated by the correction control unit 108 to the memory T1 and reads image data of the area designated as the cut out area from the memory T1. The timing of this reading process, as represented by timing RD1 in Chart 214, is delayed from the timing at which the image data D1 is input to the memory controller 107 by two vertical synchronization signals VD.
The projector device 1 according to this embodiment, as described above, moves the projection position of the projection image by rotating the drum unit 10 so as to change the projection angle θ according to the projection lens 12 and designates a reading position for image data in accordance with the projection angle θ. For example, the image data D1 is input to the memory controller 107 at the timing of the projection angle θ1. The projection angle θ at the timing when an image according to the image data D1 is actually projected may be changed from the projection angle θ1 to a projection angle θ3 different from the projection angle θ1.
Accordingly, at the time of reading the image data D1 from the memory T1, a cut out area 170 that is larger than the area of the image data corresponding to the projected image is read in consideration of a change in the projection angle θ.
The memory controller 107 reads the image data from this cut out area 170 at the timing of a next vertical synchronization signal VD after the vertical synchronization signal VD for writing the image data D1 into the memory T1. In this way, at the timing of the projection angle θ3, the image data to be projected is read from the memory T1, is supplied to the display element 114 through the image processing unit 102 of a later stage, and is projected from the projection lens 12.
At the timing of the next vertical synchronization signal VD after the vertical synchronization signal VD for which the image data D1 is input, the image data D2 is input to the memory controller 107. At this timing, the image data D1 has already been written into the memory Y1. Accordingly, the memory controller 107 writes the image data D2 into the memory Y2 (timing WD2 illustrated in Chart 215). The sequence of writing the image data D2 into the memory Y2 at this time is similar to the sequence of writing the image data D1 described above into the memory Y1, and the sequence for the image is similar to that described above.
In other words, the memory controller 107 sequentially reads the image data D2 in the vertical direction, pixel by pixel across the lines up to the pixel positioned at the upper end of the image, with the pixel positioned on the lower left corner of the image set as the reading start pixel, and next, pixels are read in the vertical direction with the pixel on the right side neighboring the previous reading start pixel set as the new reading start pixel (timing RD2 illustrated in Chart 215). This operation is repeated until the reading of the pixel positioned on the upper right corner of the image is completed. The memory controller 107 sequentially writes (timing WD2 represented in Chart 216) the pixels of the image data D2 read from the memory Y2 in this way into the memory T2 toward the line direction for each pixel.
The memory controller 107 designates an address of the cut out area that is designated by the correction control unit 108 to the memory T2 and reads image data of the area designated as the cut out area from the memory T2 at timing RD2 represented in Chart 216. At this time, as described above, the correction control unit 108 designates, for the memory T2, an area larger than the area of the image data corresponding to the projected image as the cut out area 170 in consideration of a change in the projection angle θ.
The memory controller 107 reads the image data from this cut out area 170 at the timing of the next vertical synchronization signal VD after the vertical synchronization signal VD for writing the image data D2 into the memory T2. In this way, the image data of the cut out area 170 of the image data D2 input to the memory controller 107 at the timing of the projection angle θ2 is read from the memory T2 at the timing of the projection angle θ4, is supplied to the display element 114 through the image processing unit 102 of a later stage, and is projected from the projection lens 12.
Thereafter, similarly, for the image data D3, D4, D5, . . . , the process is sequentially performed using a set of the memories Y1 and T1 and a set of the memories Y2 and T2 in an alternate manner.
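The alternating use of the two memory sets can be summarized by the following sketch, which reuses the rotate_into_t function from the sketch above; the scheduling shown (write D(n), rotate D(n-1), and read the cut out area of D(n-2) at each VD) follows the two-VD delay described in the text, while the function names are assumptions.

    def on_vertical_sync(n, frame, y_mems, t_mems, read_cut_out):
        a, b = n % 2, (n - 1) % 2
        y_mems[a] = frame                    # write D(n) into Y1 or Y2
        if n >= 1:                           # rotate D(n-1) from Y into T
            t_mems[b] = rotate_into_t(y_mems[b], 1920, 1080)
        if n >= 2:                           # read the cut out area of D(n-2),
            return read_cut_out(t_mems[a])   # projected two VDs after input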
As described above, according to this embodiment, in the image memory 101, an area of the memories Y1 and Y2 used for writing and reading image data with the size of horizontal 1920 pixels×vertical 1080 pixels (lines) and an area of the memories T1 and T2 used for writing and reading image data with the size of horizontal 1080 pixels×vertical 1920 pixels (lines) are arranged. The reason for this is that, generally, a dynamic random access memory (DRAM) used in an image memory has an access speed for the vertical direction that is lower than an access speed for the horizontal direction. In a case where another memory, which is easily randomly accessible, having access speeds of the same level for the horizontal direction and the vertical direction is used, a configuration may be employed in which a memory having a capacity corresponding to the image data is used in both the cases.
Geometric Distortion Correction
Next, the geometric distortion correction for the image data that is performed by the projector device 1 according to this embodiment will be described.
However, in a case where projection is performed in a state in which the optical axis of the projection lens 12 is inclined with respect to the projection face, a trapezoidal distortion occurs in which the projection image is displayed distorted into a trapezoidal shape on the projection face.
For this reason, conventionally, by performing a geometric distortion correction such as a trapezoidal distortion correction (keystone correction) transforming image data to be projected into a trapezoidal shape in the direction opposite to that of the trapezoidal shape generated in the projection image on the projection face such as a screen, a projection image having an approximately rectangular shape is displayed on the projection face.
However, in the conventional trapezoidal distortion correction (keystone correction), an image having a number of pixels smaller than the number of pixels that can originally be displayed by the display device is input to the display device in the trapezoidal shape formed in the opposite direction, and accordingly, the pixel area of the display device is not effectively used, and the brightness of the actual projection area decreases.
Recently, in accordance with the wide use of high-resolution digital cameras and the like, the resolution of video content has improved, and there are cases where the resolution of the video content is higher than the resolution of the display device. For example, in a projector device supporting inputs up to the full HD of 1920 pixels×1080 pixels for a display device having a resolution of 1280 pixels×720 pixels, the input image is scaled in a stage prior to the display device so that the resolution is matched and the whole input image can be displayed on the display device.
On the other hand, instead of performing such a scaling process, as illustrated in the figure, a partial area of the input image that corresponds to the resolution of the display device may be cut out and displayed on the display device; in this case, an unused area that is not displayed remains in the input image.
For this reason, according to the projector device 1 of this embodiment, as illustrated in the figure, the image of the unused area remaining after the cutting out of the input image data is used for the area of the periphery of the image data after the geometric distortion correction, whereby the amount of information of the projection image is supplemented.
The correction control unit 108 of the geometric distortion correction unit 100, as described above, calculates a first correction coefficient and a second correction coefficient based on the projection angle and the view angle. Here, the first correction coefficient is a correction coefficient for performing a correction of the image data in the horizontal direction, and the second correction coefficient is a correction coefficient for performing a correction of the image data in the vertical direction. The correction control unit 108 may be configured to calculate the second correction coefficient for each line configuring the image data (cut out image data) of the cut out range.
In addition, the correction control unit 108 calculates a linear reduction rate, based on the first correction coefficient, for each line from the upper side to the lower side of the image data of the cut out range.
The relation between the projection angle and the correction coefficients, and the correction amount for a trapezoidal distortion calculated based on the projection angle, will now be described in detail.
Here, the projection angle θ is the inclination angle of the optical axis of the projection light emitted from the projection lens 12 with respect to the horizontal direction. Hereinafter, the inclination angle of a case where the optical axis of the projection light is in the horizontal direction is set as 0°; a case where the drum unit 10 including the projection lens 12 is rotated to the upper side, in other words, to the elevation angle side, is defined as positive; and a case where the drum unit 10 is rotated to the lower side, in other words, to the depression angle side, is defined as negative. In such a case, the housed state in which the optical axis of the projection lens 12 faces a floor face 222 disposed right below corresponds to a projection angle of −90°, the horizontal state in which the projection direction faces the front side of a wall face 220 corresponds to a projection angle of 0°, and the state in which the projection direction faces a ceiling 221 disposed right above corresponds to a projection angle of +90°.
A projection direction 231 is the direction of a boundary between the wall face 220 and the ceiling 221, which are two projection faces adjacent to each other. A projection direction 232 is the projection direction of the projection lens 12 in a case where the upper side, which corresponds to a first side, of one pair of sides disposed in the direction perpendicular to the vertical direction, that is, the moving direction of the projection image, approximately coincides with the boundary in the projection image on the wall face 220.
A projection direction 233 is the projection direction of the projection lens 12 in a case where the lower side, which corresponds to a second side, of the above-described pair of sides approximately coincides with the boundary in the projection image on the ceiling 221. A projection direction 234 is the direction of the ceiling 221 positioned right above the projector device 1 and corresponds to a state in which the optical axis of the projection lens 12 and the ceiling 221 cross each other at right angles. The projection angle θ at this time is 90°.
In the example illustrated in the figure, the projection angle θ at the time of the projection direction 232 will be referred to as a first boundary start angle, and the first boundary start angle is 35°; the projection angle θ at the time of the projection direction 231 will be referred to as a first boundary angle, and the first boundary angle is 42°; and the projection angle θ at the time of the projection direction 233 will be referred to as a first boundary end angle, and the first boundary end angle is 49°.
A projection direction 235 is the direction in which projection is started by the projector device 1, acquired by rotating the projection lens 12 from the housed state in which the projection lens faces right below (−90°); the projection angle θ at this time is −45°. A projection direction 236 is the projection direction of the projection lens 12 in a case where the upper side, which corresponds to the first side, of the pair of sides disposed in the direction perpendicular to the moving direction of the projection image approximately coincides with a boundary between the floor face 222 and the wall face 220 in the projection image on the floor face 222. The projection angle θ at this time will be referred to as a second boundary start angle, and the second boundary start angle is −19°.
A projection direction 237 is a direction of a boundary between the floor face 222 and the wall face 220 that are two projection faces adjacent to each other. The projection angle θ at this time will be referred to as a second boundary angle, and the second boundary angle is −12°.
A projection direction 238 is the projection direction of the projection lens 12 in a case where the lower side, which corresponds to the second side, of the above-described pair of sides of the projection image on the wall face 220 approximately coincides with the boundary between the floor face 222 and the wall face 220. The projection angle θ at this time will be referred to as a second boundary end angle, and the second boundary end angle is −4°.
Hereinafter, an example of the geometric distortion correction (the trapezoidal distortion correction will be used as an example) will be described.
The correction control unit 108 calculates a trapezoidal distortion correction amount based on a correction coefficient according to each projection angle θ, denoted by a solid line in the figure.
Here, a technique for calculating the correction coefficient will be described.
Here, as illustrated in the figure, the light beams forming the upper side and the lower side of the projection image are emitted from the projection lens 12 at angles (θ+β) and (θ−β) with respect to the horizontal direction, where the angle β is a half of the view angle α. Then, when the distances from the projection lens 12 to the upper side and the lower side of the projection image on the projection face 270 are denoted by b and a, respectively, the distance n from the projection lens 12 to the projection face 270 satisfies the following Equation (10).
n=b cos(θ+β)=a cos(θ−β) (10)
By transforming Equation (10), the following Equation (11) is acquired. Accordingly, based on Equation (11), the correction coefficient is determined from the angle β, which is a half of the view angle α, and the projection angle θ.
k(θ,β)=a/b=cos(θ+β)/cos(θ−β) (11)
Based on this Equation (11), in a case where the projection angle θ is 0°, in other words, in a case where the projection image is projected in a direction horizontal to the projection face 270, the first correction coefficient is “1”, and, in such a case, the trapezoidal distortion correction amount is zero.
In addition, based on Equation (11), the first correction coefficient decreases as the projection angle θ increases, and the trapezoidal distortion correction amount increases according to the value of the first correction coefficient. Accordingly, the trapezoidal distortion of the projection image that becomes remarkable according to an increase in the projection angle θ can be appropriately corrected.
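As a numerical check, the following sketch assumes the form of Equation (11) reconstructed above; the function name is illustrative only.

```python
import math

# First correction coefficient, assuming k(theta, beta) = cos(theta + beta) /
# cos(theta - beta) as reconstructed from Equation (10).
def first_correction_coefficient(theta_deg, beta_deg):
    t, b = math.radians(theta_deg), math.radians(beta_deg)
    return math.cos(t + b) / math.cos(t - b)

print(first_correction_coefficient(0.0, 5.0))   # -> 1.0 (no correction at 0 deg)
print(first_correction_coefficient(30.0, 5.0))  # -> about 0.90 (< 1, so the
                                                #    upper side is reduced)
```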
Furthermore, in a case where the projection image is projected onto the ceiling that is disposed right above and is perpendicular to the projection face 270, the correction direction of the trapezoidal distortion correction changes, and accordingly, the correction coefficient is b/a. In addition, as described above, the sign of the correction coefficient is negative.
In this embodiment, the correction control unit 108 calculates the correction coefficient based on Equation (11) when the projection angle θ is between the projection angle (−45°) at the time of the projection direction 235 and the second boundary start angle (−19°) that is the projection angle θ at the time of the projection direction 236, between the projection angle (0°) at the time of the projection direction 230 and the first boundary start angle (35°) that is the projection angle at the time of the projection direction 232, between the second boundary end angle (−4°) that is the projection angle θ at the time of the projection direction 238 and the projection angle (0°) at the time of the projection direction 230, or between the first boundary end angle (49°) that is the projection angle θ of the projection direction 233 and the projection angle (90°) at the time of the projection direction 234, described above.
On the other hand, the correction control unit 108 calculates the correction coefficient in a direction for lowering the degree of the correction without using Equation (11) when the projection angle θ is between the second boundary start angle (−19°) that is the projection angle θ at the time of the projection direction 236 and the second boundary angle (−12°) that is the projection angle θ at the time of the projection direction 237 or between the first boundary start angle (35°) that is the projection angle θ at the time of the projection direction 232 and the first boundary angle (42°) that is the projection angle θ at the time of the projection direction 231.
In addition, the correction control unit 108 calculates the correction coefficient in a direction for raising the degree of the correction without using Equation (11) when the projection angle θ is between the second boundary angle (−12°) that is the projection angle θ at the time of the projection direction 237 and the second boundary end angle (−4°) that is the projection angle θ at the time of the projection direction 238 or between the first boundary angle (42°) that is the projection angle θ at the time of the projection direction 231 and the first boundary end angle (49°) that is the projection angle θ at the time of the projection direction 233.
The calculation of the first correction coefficient is not limited to that described above, and the correction control unit 108 may be configured to calculate the first correction coefficient using Equation (11) for all the projection angles θ.
The correction control unit 108 multiplies the length Hact of the line of the upper side of the image data by the correction coefficient k(θ,β) represented in Equation (11) and calculates the corrected length Hact(θ) of the line of the upper side using the following Equation (12).
Hact(θ)=k(θ,β)×Hact (12)
The correction control unit 108, in addition to the length of the upper side of the image data, calculates a reduction rate of the length of each line in a range from the line of the upper side to the line of the lower side.
As illustrated in the figure, the reduction rate changes linearly from the first correction coefficient k(θ,β) at the line of the upper side to 1 at the line of the lower side, and the length of each line is calculated using the equation represented inside the braces { } of Equation (13).
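The following sketch illustrates Equations (12) and (13) as read here, assuming the linear change of the reduction rate described in the text; the function name and example values are illustrative.

```python
# Corrected length of each of v_lines lines: the upper side (y = 0) is
# multiplied by the first correction coefficient k, and the rate returns
# linearly to 1 at the lower side, per the linear form assumed above.
def corrected_line_lengths(h_act, v_lines, k):
    return [h_act * (k + (1.0 - k) * y / (v_lines - 1))
            for y in range(v_lines)]

lengths = corrected_line_lengths(1280, 720, 1 / 1.28)
print(round(lengths[0]), round(lengths[-1]))  # -> 1000 1280
```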
Another method of calculating the first correction coefficient will now be described. The first correction coefficient may be calculated from a ratio between the length of the side of the projection image at the projection angle 0° and the length of the side of the projection image at the projection angle θ. In such a case, the length Hact(y) of each line from the upper side to the lower side of the image data can be represented as in Equation (14).
In the trapezoidal distortion correction using the first correction coefficient according to this calculation method, an image having the same size as the projection image of the projection angle 0° can be projected regardless of the projection angle θ.
As illustrated in the figure, the correction in the vertical direction will now be considered using a cylindrical model in which the projection lens 12 is disposed at a position 201.
In the cylindrical model described above, a projection image is projected with an arc 202, which has the position 201 as its center and a radius r, serving as the projection face. Each point on the arc 202 is at the same distance from the position 201, and the center of the luminous flux of light projected from the projection lens 12 coincides with a radius of the circle including the arc 202. Accordingly, even when the projection angle θ is increased from an angle θ0 of 0° to an angle θ1, an angle θ2, . . . , the projection image is projected onto the projection face with the same size all the time.
On the other hand, in a case where an image is projected from the projection lens 12 onto the projection face 204 that is a perpendicular face, when the projection angle θ is increased from an angle θ0 to an angle θ1, an angle θ2, . . . , a position on the projection face 204 to which the center of luminous fluxes of light emitted from the projection lens 12 is projected changes according to the characteristics of a tangent function as a function of the angle θ.
Accordingly, the projection image grows upward in accordance with the ratio M represented in the following Equation (15) as the projection angle θ increases (here, the angle θ is expressed in radians).
M=tan θ/θ (15)
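The closed form used in the following sketch is an inference from the surrounding text rather than a reproduction of the original equation; it is checked against the ratios quoted later (about 1.27 times at 45° and about 1.65 times at 60°).

```python
import math

# Assumed form of Equation (15): M = tan(theta) / theta, the ratio of the
# tangent-plane distance r*tan(theta) to the arc length r*theta of the
# cylindrical model (theta in radians).
def growth_ratio(theta_deg):
    t = math.radians(theta_deg)
    return math.tan(t) / t

print(round(growth_ratio(45.0), 2))  # -> 1.27
print(round(growth_ratio(60.0), 2))  # -> 1.65
```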
Here, when the angle θ is the projection angle, the angle β is a half of the view angle α, and the total number of lines of the display element 114 is a value L, the projection angle θ′ of a luminous ray projecting a line disposed at a perpendicular position dy on the display element 114 is calculated using the following Equation (16).
θ′=θ+β−2β×dy/L (16)
The height Lh(dy) of the line at the time of projecting the line disposed at the perpendicular position dy on the display element 114 onto the projection face 204 is calculated using Equation (17).
Lh(dy)=r(tan(θ+β−2β×(dy−1)/L)−tan(θ+β−2β×dy/L)) (17)
Accordingly, the enlargement rate ML(dy) of the height Lh(dy) of the line at the time of projecting the line disposed at the perpendicular position dy on the display element 114 onto the projection face 204, with respect to the height of the line of the lower side (dy=L), is calculated using the following Equation (18).
ML(dy)=Lh(dy)/Lh(L) (18)
The second correction coefficient is the reciprocal of the enlargement rate ML(dy) and is calculated for each line disposed at the perpendicular position dy on the display element 114.
In addition, in a case where the view angle α or the projection angle θ is small, the second correction coefficient need not be calculated for each line disposed at the perpendicular position dy on the display element 114 using Equation (18). Instead, the second correction coefficient may be calculated by acquiring the enlargement rate ML(1) of the height of the line of the upper side (dy=1) with respect to the height of the line of the lower side (dy=L) using Equation (19) and approximating the intermediate values through linear interpolation.
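The following sketch combines Equations (16) to (18) as reconstructed above to obtain the second correction coefficient of each line; the function names and the radian-based interface are illustrative only.

```python
import math

# Height, on the perpendicular projection face, of the line at position dy
# (1..total_lines) of the display element, and the second correction
# coefficient as the reciprocal of the enlargement rate ML(dy).
def line_height(dy, total_lines, theta, beta, r=1.0):
    upper = theta + beta - 2.0 * beta * (dy - 1) / total_lines
    lower = theta + beta - 2.0 * beta * dy / total_lines
    return r * (math.tan(upper) - math.tan(lower))  # Equation (17)

def second_correction_coefficient(dy, total_lines, theta, beta):
    ml = (line_height(dy, total_lines, theta, beta)
          / line_height(total_lines, total_lines, theta, beta))  # Equation (18)
    return 1.0 / ml  # the reciprocal of ML(dy)

k2 = second_correction_coefficient(1, 720, math.radians(30), math.radians(5))
print(round(k2, 3))  # -> about 0.817: the top line is reduced the most
```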
Another method of calculating the second correction coefficient will be described. The second correction coefficient may be calculated from a ratio between the height of the projection image of the projection angle 0° and the height of the projection image of the projection angle θ.
When the angle θ is the projection angle, and the angle β is a half of the view angle α, a value M0 that is the ratio of the height of the projection image of the projection angle θ to the height of the projection image of the projection angle 0° can be calculated using the following Equation (20).
As the second correction coefficient, the reciprocal of the value M0 may be used.
Here, when the angle θ is the projection angle, and the angle β is a half of the view angle α, the height W′ of the projection image of the projection angle θ is represented as in Equation (21).
W′=r×{tan(θ+β)−tan(θ−β)} (21)
The height of the projection image at a projection angle of 0° and the view angle α is approximated by a height L acquired by delimiting the tangential line of the arc 202 at the projection angle θ illustrated in the figure, as represented in Equation (22).
Based on Equations (21) and (22), a value M0 that is the ratio of the height of the projection image of the projection angle θ to the height of the projection image of the projection angle 0° is represented as in Equation (23).
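Because the bodies of Equations (20), (22), and (23) are not reproduced in this text, the following sketch encodes one plausible reconstruction: the height at the projection angle 0° is approximated by the arc length 2rβ, giving M0={tan(θ+β)−tan(θ−β)}/(2β). This choice matches the ratio of about 1.27 quoted below for θ=22.5° and α=45°, but it remains an assumption.

```python
import math

# Assumed form of Equation (23): M0 = (tan(theta+beta) - tan(theta-beta)) /
# (2*beta), with beta in radians (height at 0 deg approximated by the arc
# length 2*r*beta).
def height_ratio_m0(theta_deg, beta_deg):
    t, b = math.radians(theta_deg), math.radians(beta_deg)
    return (math.tan(t + b) - math.tan(t - b)) / (2.0 * b)

print(round(height_ratio_m0(22.5, 22.5), 2))  # -> 1.27
```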
According to Equation (15) described above, for example, in the case of the projection angle θ=45°, the projection image grows at a ratio of about 1.27 times. In addition, in a case where the projection face 204 is sufficiently high relative to the length of the radius r and projection at the projection angle θ=60° can be performed, the projection image grows at a ratio of about 1.65 times.
In addition, as illustrated in the figure, the enlargement rate differs for each line, and the upper portion of the projection image is enlarged more than the lower portion, so that the projection image is distorted in the vertical direction.
Thus, the correction control unit 108 calculates, as the second correction coefficient, the reciprocal of the ratio ML(dy) represented in Equation (18) described above in accordance with the projection angle θ of the projection lens 12. The memory controller 107 then performs a geometric distortion correction by performing a reduction process that multiplies the height of each line of the image data to be projected by the second correction coefficient, thereby eliminating the vertical-direction distortion of the image data.
In the vertical-direction reduction process (geometric distortion correction process), the image data to be used is preferably larger than the image data cut out based on the cylindrical model. In other words, although the required size depends on the height of the projection face 204, which is a perpendicular face, in the case of the projection angle θ=22.5° and the view angle α=45°, the projection image grows at a ratio of about 1.27 times, and accordingly, the image data is reduced at the reciprocal ratio of about 1/1.27 times.
In addition, the correction control unit 108 acquires a cut out range of the image data based on the first correction coefficient, the second correction coefficient, and the reduction rate calculated as described above and outputs the acquired cut out range to the extended function control unit 109 and the memory controller 107.
For example, in a case where the view angle α is 10° and the projection angle θ is 30°, the projection image is distorted into a trapezoidal shape, and the length of the upper side of the trapezoid is about 1.28 times the length of the lower side. Accordingly, in order to correct the horizontal-direction distortion, the correction control unit 108 calculates the first correction coefficient as 1/1.28, reduces the first line (the upper side) of the image data to 1/1.28 times its width, and sets the reduction rates of the subsequent lines to change linearly such that the final line retains its original size. In other words, the number of pixels of the first line of the output image data is reduced from 1280 pixels to 1000 pixels (1280/1.28=1000), whereby the trapezoidal distortion is corrected.
However, in this state, as described above, image data of 280 pixels (1280−1000=280) of the first line is not projected, and the number of effective projection pixels decreases. Thus, in order to supplement the amount of information, as illustrated in the figure, the image data of the periphery of the cut out range is used for the area that is not projected in the conventional case.
The extended function control unit 109 plays the role of associating the image control unit 103 with the geometric distortion correction unit 100. In other words, in the area for which the output of the image data is entirely painted black by the geometric distortion correction in the conventional case, information of the image data is represented. For this reason, the extended function control unit 109, in accordance with the cut out range input from the correction control unit 108, sets the output resolution of the output resolution control unit 1031 to be higher than the resolution of 1280 pixels×720 pixels at the time of outputting the image data. In the example described above, since the enlargement/reduction rate is one, the extended function control unit 109 sets the output resolution to 1920 pixels×1080 pixels.
In this way, the memory controller 1032 of the image control unit 103 stores the input image data in the image memory 101 with the resolution of 1920 pixels×1080 pixels. Accordingly, as illustrated in the figure, the image data of the cut out range can be cut out in a state in which the image data of the periphery of the area corresponding to the display resolution is included.
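As a rough model of this supplementation (an assumption, since the exact pixel mapping is not spelled out in this text), each display line can be thought of as sampling a proportionally wider span of the larger stored image instead of leaving its periphery black:

```python
# Hypothetical source-width model: line y of the 1280-pixel display samples
# display_w / rate(y) pixels of the 1920-pixel stored image, so every output
# pixel carries image information.
def source_widths(display_w, stored_w, v_lines, k):
    return [min(stored_w, round(display_w / (k + (1.0 - k) * y / (v_lines - 1))))
            for y in range(v_lines)]

widths = source_widths(1280, 1920, 720, 1 / 1.28)
print(widths[0], widths[-1])  # -> 1638 1280 (wider span at the top)
```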
In addition, the memory controller 107 performs the geometric distortion correction as below by using the first correction coefficient, the reduction rate, and the second correction coefficient calculated as described above. In other words, the memory controller 107 multiplies the upper side of the image data of the cut out range by the first correction coefficient and multiplies each line from the upper side to the lower side of the image data of the cut out range by the corresponding reduction rate. In addition, the memory controller 107 generates lines corresponding to the number of display pixels from the lines constituting the image data of the cut out range based on the second correction coefficient.
Next, an example of the cutting out of image data and the geometric distortion correction performed by the geometric distortion correction unit 100 according to this embodiment will be described in comparison with a conventional case.
Then, the memory controller 107 performs the geometric distortion correction for the image data of the cut out range. More specifically, as illustrated in the figure, the memory controller 107 corrects the image data of the cut out range by using the first correction coefficient, the reduction rate, and the second correction coefficient described above.
As illustrated in the examples represented in the figures, according to this embodiment, the amount of information lacking in the area of the periphery is supplemented compared to the conventional case.
Process of Projecting Image Data
Next, the flow of the process performed when an image according to the image data is projected by the projector device 1 will be described.
In step S100, in accordance with the input of image data, various setting values relating to the projection of an image according to the image data are input to the projector device 1. The input setting values are acquired, for example, by the input control unit 119 and the like. The setting values acquired here include, for example, a value representing whether or not the image according to the image data is to be rotated, in other words, whether or not the horizontal direction and the vertical direction of the image are to be interchanged, an enlargement rate of the image, and an offset angle θofst at the time of projection. The setting values may be input to the projector device 1 as data accompanying the input of the image data or may be input by operating the operation unit 14.
In next step S101, image data corresponding to one frame is input to the projector device 1, and the input image data is acquired by the memory controller 1032. The acquired image data is written into the image memory 101.
In next step S102, the image control unit 103 acquires the offset angle θofst. In next step S103, the correction control unit 108 acquires the view angle α from the view angle control unit 106. In addition, in next step S104, the correction control unit 108 acquires the projection angle θ of the projection lens 12 from the rotation control unit 104.
In next step S105, the image data cut-out and geometric distortion correction process is performed. Here, this process will be described in detail.
First, in step S301, the correction control unit 108 calculates the first correction coefficient using Equation (11). In next step S302, the correction control unit 108 calculates the reduction rate of each line from the upper side (first side) to the lower side (second side) of the image data using the equation represented inside the braces { } illustrated in Equation (13). In addition, in step S303, the correction control unit 108 acquires the second correction coefficient for each line as the reciprocal of the enlargement rate ML(dy) calculated using Equation (18).
Then, next, in step S304, the correction control unit 108 acquires the cut out range based on the first correction coefficient and the second correction coefficient as described above.
Next, in step S305, the memory controller 107 cuts out image data of the cut out range from the image data stored in the image memory 101. Then, in step S306, the memory controller 107 performs the geometric distortion correction described above for the image data of the cut out range using the first correction coefficient, the second correction coefficient, and the reduction rate and ends the process.
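Steps S301 to S306 can be condensed into the following self-contained sketch, simplified to the horizontal correction only; the helper behavior and names are illustrative rather than the device's actual implementation.

```python
import math

# Condensed sketch of the cut-out and correction flow. The second correction
# coefficient and the cut-out range (S303-S305) are omitted for brevity.
def cut_out_and_correct(frame, theta_deg, beta_deg):
    """frame: list of pixel rows already read from the image memory."""
    t, b = math.radians(theta_deg), math.radians(beta_deg)
    k = math.cos(t + b) / math.cos(t - b)    # S301: Equation (11)
    v = len(frame)
    corrected = []
    for y, row in enumerate(frame):          # S302: per-line reduction rate
        rate = k + (1.0 - k) * y / (v - 1)   # linear form of Equation (13)
        new_w = max(1, round(len(row) * rate))
        # S306: nearest-neighbour resampling of the line to its corrected width
        corrected.append([row[i * len(row) // new_w] for i in range(new_w)])
    return corrected

out = cut_out_and_correct([[0] * 1280 for _ in range(720)], 30.0, 5.0)
print(len(out[0]), len(out[-1]))  # -> 1157 1280
```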
Returning to the flowchart, in step S106, the control unit 120 determines whether or not image data of a next frame has been input to the projector device 1.
In a case where the input of the image data of the next frame is determined to be present, the control unit 120 returns the process to step S101 and performs the process of steps S101 to S105 described above for the image data of the next frame. In other words, the process of steps S101 to S105 is repeated in units of frames of the image data in accordance with a vertical synchronization signal VD of the image data. Accordingly, the projector device 1 can cause each process to follow a change in the projection angle θ in units of frames.
On the other hand, in step S106, in a case where the image data of the next frame is determined not to have been input, the control unit 120 stops the image projection operation in the projector device 1. For example, the control unit 120 controls the light source 111 so as to be turned off and issues a command for returning the posture of the drum unit 10 to be in the housed state to the rotation mechanism unit 105. Then, after the posture of the drum unit 10 is returned to be in the housed state, the control unit 120 stops the fan cooling the light source 111 and the like.
As above, according to this embodiment, in a case where the geometric distortion correction is performed for the image data, an image of the unused area remaining after the cutting out of the input image data is used for the area of the periphery of the image data after the geometric distortion correction, and the amount of information lacking in the periphery in the horizontal direction and the vertical direction is thereby supplemented. For this reason, according to this embodiment, compared to the conventional technology, the geometric distortion correction is performed for the content of the projection image while the image of the unused area is effectively used, and a high-quality projection image that effectively uses the displayable area can be acquired.
Particularly, in a case where, for example, an environment video such as the sky or the night sky is projected using the projector device 1 according to this embodiment, even in a case where the projection image is displayed in a trapezoidal shape, when the amount of information to be displayed is large, a realistic sensation can be more effectively acquired. In addition, in a case where a map image or the like is projected using the projector device 1 according to this embodiment, compared to a conventional technique, a relatively broad range of peripheral information can be projected.
According to the projector device 1 of the first embodiment, a horizontal distortion and a vertical distortion of the projection image that occur in accordance with the projection angle θ are eliminated by the geometric distortion correction, and the amount of information is supplemented for both areas of the horizontal-direction area and the vertical-direction area. However, according to a second embodiment, a horizontal distortion is eliminated by a geometric distortion correction, and the amount of information is supplemented for the horizontal-direction area, but a distortion correction is not performed for the vertical direction.
The external view, the structure, and the functional configuration of a projector device 1 according to this embodiment are similar to those of the first embodiment.
In this embodiment, the correction control unit 108 calculates the first correction coefficient used for the horizontal distortion correction using Equation (11) described above, based on the projection angle θ (projection angle 123) input from the rotation control unit 104 and the view angle α (view angle 125) input from the view angle control unit 106, and calculates the reduction rate of each line using the equation represented inside the braces { } of Equation (13); however, the correction control unit 108 does not calculate the second correction coefficient used for the vertical distortion correction.
In addition, based on the projection angle θ, the view angle α, and the first correction coefficient, the correction control unit 108 determines a cut out range of the input image data such that the image data after the geometric distortion correction covers the displayable size of the display device, and outputs the determined cut out range to the memory controller 107 and the extended function control unit 109.
The memory controller 107 cuts out (extracts) an image area of the cut out range determined by the correction control unit 108 from the whole area of a frame image relating to the image data stored in the image memory 101 and outputs the image area that has been cut out as image data.
In addition, the memory controller 107 performs a geometric distortion correction for the image data cut out from the image memory 101 by using the first correction coefficient and outputs the image data after the geometric distortion correction to the image processing unit 102.
The flow of the process of projecting the image data according to the second embodiment is similar to that of the first embodiment described with reference to the flowchart described above, except for the cut-out and geometric distortion correction process of step S105, which is performed as follows.
First, in step S401, the correction control unit 108 calculates the first correction coefficient using Equation (11). In next step S402, the correction control unit 108 calculates the reduction rate of each line from the upper side (first side) to the lower side (second side) of the image data using the equation represented inside the braces { } illustrated in Equation (13).
Then, next, in step S403, the correction control unit 108 acquires a cut out range based on the first correction coefficient as described above.
Next, in step S404, the memory controller 107 cuts out image data of the cut out range from the image data stored in the image memory 101. Then, in step S405, the memory controller 107 performs the geometric distortion correction described above for the image data of the cut out range using the first correction coefficient and the reduction rate and ends the process.
Next, an example of the cutting out of image data and the geometric distortion correction performed by the geometric distortion correction unit 100 according to this embodiment will be described.
In a case where the projection angle θ is greater than 0°, as illustrated in the figure, the memory controller 107 cuts out image data 3401 of the cut out range determined by the correction control unit 108 from the image data stored in the image memory 101.
Then, the memory controller 107 performs the geometric distortion correction for the image data 3401 of the cut out range. More specifically, the memory controller 107 corrects the image data into a trapezoidal shape in the horizontal direction according to the projection angle θ, as represented by the image data 3402 illustrated in the figure.
As above, according to this embodiment, the horizontal distortion is eliminated by the geometric distortion correction, and the amount of information is supplemented for the horizontal-direction area, but the geometric distortion correction is not performed for the vertical direction. Accordingly, not only are the same advantages as those of the first embodiment acquired, but also the processing load of the correction control unit 108 can be reduced.
In the first embodiment and the second embodiment, the method has been described in which the projection angle θ is derived while the projection direction of the projection unit is changed, in other words, while the projection unit is moved while projecting the projection image onto the projection face, and the correction amount used for eliminating the geometric distortion is calculated according to the projection angle θ. However, a change in the projection direction does not need to be dynamic. In other words, as illustrated in the figure, the projection direction may be fixed, and the correction amount may be calculated according to the projection angle θ of that fixed projection direction.
In addition, the calculation of the correction amount and the detection method are not limited to those described in this embodiment, and a cut out range including also an area other than the above-described image data area after the correction may be determined according to the correction amount.
Each of the projector devices 1 according to the first embodiment and the second embodiment has a configuration that includes hardware such as a control device such as a central processing unit (CPU); storage devices such as a read only memory (ROM), a random access memory (RAM), and a hard disk drive (HDD); and the operation unit 14.
In addition, the rotation control unit 104, the view angle control unit 106, the image control unit 103 (and each unit thereof), the extended function control unit 109, the geometric distortion correction unit 100 (and each unit thereof), the input control unit 119, and the control unit 120 mounted as circuit units of the projector devices 1 of the first and second embodiments may be configured to be realized by software instead of being configured by hardware.
In a case where the projector device is realized by the software, an image projection program (including an image correction program) executed by the projector devices 1 according to the first and second embodiments is built in a ROM or the like in advance and is provided as a computer program product.
The image projection program executed by the projector devices 1 according to the first and second embodiments may be configured to be recorded on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD so as to be provided as a file having an installable form or an executable form.
In addition, the image projection program executed by the projector devices 1 according to the first and second embodiments may be configured to be stored in a computer connected to a network such as the Internet and be provided by being downloaded through the network. In addition, the image projection program executed by the projector devices 1 according to the first and second embodiments may be configured to be provided or distributed through a network such as the Internet.
The image projection program executed by the projector devices 1 according to the first and second embodiments has a module configuration including the above-described units (the rotation control unit 104, the view angle control unit 106, the image control unit 103 (and each unit thereof), the extended function control unit 109, the geometric distortion correction unit 100 (and each unit thereof), the input control unit 119, and the control unit 120). As actual hardware, when the CPU reads the image projection program from the ROM and executes it, the above-described units are loaded onto the main storage device, and the rotation control unit 104, the view angle control unit 106, the image control unit 103 (and each unit thereof), the extended function control unit 109, the geometric distortion correction unit 100 (and each unit thereof), the input control unit 119, and the control unit 120 are generated on the main storage device.
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.
Foreign Application Priority Data: Japanese Patent Application No. 2012-117016, filed May 2012 (JP).
This application is a continuation of International Application No. PCT/JP2013/063463, filed on May 14, 2013 which claims the benefit of priority of the prior Japanese Patent Application No. 2012-117016, filed on May 22, 2012, the entire contents of which are incorporated herein by reference.
Related U.S. Application Data: parent application PCT/JP2013/063463, filed May 2013; child application Ser. No. 14/549,343 (US).