The following disclosure relates to a video display device, a method for controlling the video display device, and a control program for the video display device, and relates to, for example, a video display device that performs rendering processing on a video input to the video display device.
Video display devices are devices that display an output video on a display. Some video display devices perform rendering processing on an original video before it is displayed. For example, a television receiving device (video display device) described in PTL 1 performs rendering processing on an original video by using a geometry engine, thereby tilting or rotating an output video on a display.
PTL 1: Japanese Unexamined Patent Application Publication No. 2006-41979 (published on Feb. 9, 2006)
Some video display devices change the resolution of an original video before it is displayed. For example, a video display device converts an original video generated in accordance with the Hi-Vision (HD) standard into an output video that has the resolution of the Super Hi-Vision (SHD) standard. In this case, the user may view the output video displayed on a display from a position closer to the display than the recommended viewing distance (3.0 H, where H is the height of the display) of the HD-resolution original video.
In a case where the viewpoint position of the user, that is, the position on the display surface at which the user gazes, is close to the center of the display surface of the display, the angle formed between the user's line of sight to a corner of the display and the display surface is small. Thus, due to the so-called perspective principle, the output video at the corner of the display appears distorted to the user.
An aspect of the disclosure is made in view of the aforementioned problems and an object thereof is to display a less-distorted output video regardless of a viewpoint position of a user.
In order to solve the aforementioned problems, a video display device according to an aspect of the disclosure includes: a video enlargement unit that increases the number of pixels of an input video input to the video display device and generates an output video obtained by enlarging the input video; and a display unit that displays the output video generated by the video enlargement unit, in which the video enlargement unit reduces, at a position close to a reference position on the display unit, an enlargement ratio of the output video with respect to the input video, and increases, at a position away from the reference position, the enlargement ratio of the output video with respect to the input video to thereby continuously change the enlargement ratio of the output video with respect to the input video on the display unit.
In order to solve the aforementioned problems, a method for controlling a video display device according to an aspect of the disclosure includes: a video enlargement step of increasing the number of pixels of an input video input to the video display device and generating an output video obtained by enlarging the input video; and a display step of displaying the output video generated at the video enlargement step on a display unit, in which in the video enlargement step, at a position close to a reference position on the display unit, an enlargement ratio of the output video with respect to the input video is reduced and at a position away from the reference position, the enlargement ratio of the output video with respect to the input video is increased to thereby continuously change the enlargement ratio of the output video with respect to the input video on the display unit.
According to an aspect of the disclosure, it is possible to display a less-distorted output video regardless of a viewpoint position of a user.
Embodiment 1 of the disclosure will be described in detail below.
A configuration of the video display device 1 will be described. The video display device 1 includes a video conversion unit 10, a rendering unit 20, and a display unit 30.
The video conversion unit 10 acquires original video data from an HDD (Hard Disk Drive) recorder, a media reproducing device, the Internet, or the like. The HDD recorder and the media reproducing device may be included in the video display device 1 or connected to the video display device 1. The video conversion unit 10 converts the resolution of the acquired original video data into a format that allows processing by the rendering unit 20, thereby generating input video data. The video conversion unit 10 outputs an input video signal that includes the generated input video data to the rendering unit 20.
The rendering unit 20 executes rendering processing (described below) on the input video data output from the video conversion unit 10 and generates output video data. The rendering unit 20 then outputs the generated output video data to the display unit 30. The rendering unit 20 includes a temporary storage unit 21, a pixel information acquisition unit 22, a pixel reference position control unit 23 (pixel data extraction unit), and an interpolation calculation unit 24 (pixel data interpolation unit). The operation of each of these units is described below as part of the rendering processing.
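Purely for illustration, the following Python sketch shows one way the units listed above could be organized; the class and method names are assumptions made here and do not appear in the disclosure.

```python
class RenderingUnit:
    """Hypothetical organization of the rendering unit 20 and its sub-units."""

    def __init__(self, decide_corresponding_position, interpolate):
        self.temporary_storage = {}                # temporary storage unit 21
        self.decide_corresponding_position = decide_corresponding_position  # unit 23
        self.interpolate = interpolate             # unit 24

    def acquire_input(self, input_video_data):
        # pixel information acquisition unit 22: buffer the input video data
        self.temporary_storage["frame"] = input_video_data

    def render_pixel(self, X, Y):
        # map the output pixel to the input video and interpolate its value
        x, y = self.decide_corresponding_position(X, Y)
        return self.interpolate(self.temporary_storage["frame"], x, y)
```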
The rendering processing executed by the rendering unit 20 will be described below. First, the pixel information acquisition unit 22 acquires the input video data output from the video conversion unit 10, and the acquired data is stored in the temporary storage unit 21 (S1 and S2).
The pixel reference position control unit 23 decides the corresponding position (x, y) in the input video that corresponds to a reference pixel position (X, Y) in the output video (S3). For example, the pixel reference position control unit 23 may calculate the corresponding position (x, y) that corresponds to the reference pixel position (X, Y) in accordance with the following formula.
x = a(X − Px) + Px
y = a(Y − Py) + Py   [Mathematical formula 1]
In the formula, (Px, Py) is a reference position on the display unit 30, for example, a position at which the eye position of the user is projected onto the display unit 30 (hereinafter also referred to as the viewpoint position of the user), and a is a parameter that determines the enlargement ratio of the output video with respect to the input video.
The video display device 1 may require the user to input, as the reference position (Px, Py), the viewpoint position when viewing the output video, or may automatically detect the viewpoint position of the user by using an infrared sensor (not illustrated) included in the display unit 30. Alternatively, through a setting menu of the video display device 1, the user may be allowed to input how far the position of the user deviates in the vertical or horizontal direction from the center of the output video.
The parameter a is a function of the reference pixel position (X, Y) in the output video. The parameter a is preferably reduced as the reference pixel position (X, Y) becomes farther from the center (Px, Py) of the output video. In this case, as (X, Y) moves away from (Px, Py), the parameter a decreases (that is, the enlargement ratio increases), so that the amount of change of (X, Y) with respect to a change of (x, y) increases. In other words, as (X, Y) becomes closer to a corner of the display unit 30, the enlargement ratio of the output video with respect to the input video increases. Conversely, as (X, Y) becomes closer to the center (Px, Py) of the output video, the parameter a increases, so that the enlargement ratio of the output video with respect to the input video decreases.
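As a minimal sketch of Mathematical formula 1, the following Python function maps an output pixel to its corresponding input position; the numeric values of a in the example are arbitrary and are shown only to illustrate that a smaller a means a larger enlargement ratio.

```python
def corresponding_position(X, Y, Px, Py, a):
    """Mathematical formula 1: map an output pixel (X, Y) to the
    corresponding position (x, y) in the input video."""
    x = a * (X - Px) + Px
    y = a * (Y - Py) + Py
    return x, y

# Illustrative values only: a smaller parameter a near a corner means that a
# large output displacement corresponds to a small input displacement, that is,
# a larger enlargement ratio (1 / a).
print(corresponding_position(1900, 1060, 960, 540, 0.8))   # near a corner
print(corresponding_position(1000, 560, 960, 540, 0.95))   # near the center
```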
The interpolation calculation unit 24 acquires, from the temporary storage unit 21, an input video signal I(x, y) corresponding to the pixel at (x, y) and pixels proximate thereto. Then, in accordance with a formula described below, the interpolation calculation unit 24 calculates an output video signal J(X, Y) corresponding to the pixel at the reference pixel position (X, Y) from the input video signals corresponding to the pixels proximate to the corresponding position (x, y) (S4). Note that an example of an algorithm for the calculation at S4 will be described below. The interpolation calculation unit 24 outputs the output video signal J(X, Y) to the display unit 30 (S5). Note that S1 to S5 described above correspond to a video enlargement step of the disclosure.
The display unit 30 displays, at the reference pixel position (X, Y), an output video according to the output video signal J(X, Y) (display step).
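Putting S3 to S5 together, the per-pixel flow could be sketched roughly as follows, assuming the input video data has already been stored as in S1 and S2. The callables param_a and interpolate stand in for the enlargement-parameter calculation and the interpolation of S4 described below; they are placeholders, not the disclosed implementation.

```python
import numpy as np

def render_output(input_video, out_h, out_w, Px, Py, param_a, interpolate):
    """Sketch of the per-pixel rendering flow.

    param_a(X, Y)        -> position-dependent parameter a at output pixel (X, Y)
    interpolate(I, x, y) -> value of the input video I at the non-integer position (x, y)
    """
    output = np.zeros((out_h, out_w), dtype=np.float64)
    for Y in range(out_h):
        for X in range(out_w):
            a = param_a(X, Y)
            x = a * (X - Px) + Px          # S3: corresponding position in the input video
            y = a * (Y - Py) + Py
            output[Y, X] = interpolate(input_video, x, y)   # S4: interpolation
    return output                          # S5: output video signal passed to the display unit
```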
(Correspondence between (X, Y) and (x, y))
A method by which the corresponding position (x, y) is associated with the reference pixel position (X, Y) will now be described. A distance d between the eye position of the user and the reference pixel position (X, Y), and a distance L between the eye position of the user and the corner (0, 0) of the display unit 30, are calculated in accordance with the following formula.
d = √((X − Px)² + (Y − Py)² + D²)
L = √(Px² + Py² + D²)   [Mathematical formula 2]
In the formula, D is a variable indicating the distance between the eye position of the user and the display unit 30, that is, the viewing distance of the user, and may be set in accordance with the image quality of the output video. For example, in a case where the image quality of the output video is equivalent to that of a video of the SHD standard, D may be set to the recommended viewing distance of SHD-standard video, 0.75 H (where H is the height of the display unit 30). Alternatively, the user may be allowed to input D from the setting menu of the video display device 1, or the viewing distance of the user may be detected by using the infrared sensor or the like of the display unit 30.
The parameter a described above may be calculated in accordance with the following formula.
As found from the formula, as the reference pixel position (X, Y) becomes farther from the center (Px, Py) of the output video, that is, as the reference pixel position (X, Y) becomes closer to the corner of the display unit 30, the enlargement ratio of the output video with respect to the input video increases. That is, at the corner of the display unit 30, the original video is greatly stretched. Moreover, the enlargement ratio (that is, the inverse of the parameter a) depends on the distances d and D between the eye position of the user and the display unit 30.
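The distances d and L of Mathematical formula 2 are straightforward to evaluate; a minimal sketch follows. How the parameter a is then derived from these distances is given by the formula referenced above and is not reproduced here.

```python
import math

def viewing_distances(X, Y, Px, Py, D):
    """Mathematical formula 2: distances used to derive the parameter a.

    d is the distance from the eye position (assumed to lie at height D above the
    viewpoint position (Px, Py)) to the reference pixel position (X, Y);
    L is the distance from the eye position to the corner (0, 0) of the display unit.
    """
    d = math.sqrt((X - Px) ** 2 + (Y - Py) ** 2 + D ** 2)
    L = math.sqrt(Px ** 2 + Py ** 2 + D ** 2)
    return d, L
```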
An example of the algorithm for the calculation at S4 will now be described. The corresponding position (x, y) generally does not coincide with a pixel position of the input video, and is surrounded by four proximate pixels of the input video at positions (xL, yT), (xR, yT), (xL, yB), and (xR, yB).
The output video signal J(X, Y) may be calculated from the input video signals I(xL, yT), I(xR, yT), I(xL, yB), and I(xR, yB), for example, in accordance with the following formula.
xL = ⌊x⌋, yT = ⌊y⌋
xR = ⌊x⌋ + 1, yB = ⌊y⌋ + 1
wxL = xR − x, wyT = yB − y
wxR = x − xL, wyB = y − yT
J(X, Y) = wxL·wyT·I(xL, yT) + wxR·wyT·I(xR, yT) + wxL·wyB·I(xL, yB) + wxR·wyB·I(xR, yB)   [Mathematical formula 4]
Here, wxL, wxR, wyT, and wyB are weights; the input video signals I(xL, yT), I(xR, yT), I(xL, yB), and I(xR, yB) are weighted by the products wxL·wyT, wxR·wyT, wxL·wyB, and wxR·wyB, respectively. In the formula, a greater weight is assigned to an input video signal corresponding to a pixel closer to the corresponding position (x, y).
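The interpolation of Mathematical formula 4 can be written directly as code. The sketch below assumes a single-channel input video stored as a two-dimensional array indexed as I[y][x] and omits bounds handling at the right and bottom edges of the input video.

```python
import math

def interpolate_bilinear(I, x, y):
    """Mathematical formula 4: output video signal J(X, Y) computed from the
    four input pixels proximate to the corresponding position (x, y)."""
    xL, yT = math.floor(x), math.floor(y)
    xR, yB = xL + 1, yT + 1
    w_xL, w_yT = xR - x, yB - y
    w_xR, w_yB = x - xL, y - yT
    return (w_xL * w_yT * I[yT][xL] + w_xR * w_yT * I[yT][xR]
            + w_xL * w_yB * I[yB][xL] + w_xR * w_yB * I[yB][xR])
```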
Embodiment 2 of the disclosure will be described as follows. Note that, for convenience of description, a member having the same function as that of the member described in the foregoing embodiment will be given the same reference sign and description thereof will be omitted.
In the present embodiment, a method for calculating the parameter a described in Embodiment 1 above by an algorithm different from that of Embodiment 1 will be described.
In the formula, R is a distance between the viewpoint position (Px, Py) of the user and the corner (0, 0) of the display unit 30, and r indicates a distance between the viewpoint position (Px, Py) of the user and the reference pixel position (X, Y) in the output video. Moreover, θ is an angle formed by the sight line of the user directed to the center (Px, Py) of the output video and the sight line directed to the reference pixel position (X, Y), and θmax is a maximum value of θ. Moreover, φ is an angle formed by the vector (X − Px, Y − Py) and the x-axis. Note that atan2 is a function that calculates atan (the inverse function of tan) in a programming language such as the C language. When atan is expressed in the atan2 form, the aforementioned formula is obtained.
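The quantities named above can be computed with atan2 as in the sketch below. The expression for θ assumes that the eye lies at the viewing distance D of Embodiment 1 perpendicularly above the viewpoint position, which is an assumption made here for illustration; the formula for the parameter a itself is not reproduced.

```python
import math

def polar_parameters(X, Y, Px, Py, D):
    """Geometric quantities of Embodiment 2, expressed with atan2."""
    r = math.hypot(X - Px, Y - Py)     # distance from the viewpoint position to (X, Y)
    R = math.hypot(Px, Py)             # distance from the viewpoint position to the corner (0, 0)
    phi = math.atan2(Y - Py, X - Px)   # angle between the vector (X - Px, Y - Py) and the x-axis
    theta = math.atan2(r, D)           # assumed: angle between the sight lines to the
    theta_max = math.atan2(R, D)       # center and to (X, Y), and its maximum value
    return r, R, phi, theta, theta_max
```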
In the present embodiment, the parameter a is calculated by the following formula.
The parameter a calculated by the algorithm described in the present embodiment is substantially equal to the parameter a described in Embodiment 1 above. However, in the present embodiment, the parameter a is expressed using only addition and subtraction, multiplication, a square root of a sum of squares, cos, and atan. Addition, subtraction, and multiplication are calculations with a low load. The calculation of atan and the calculation of the square root of a sum of squares can be executed relatively easily by using existing algorithms. Accordingly, the algorithm for calculating the parameter a described in the present embodiment can be realized by a relatively small electronic circuit.
Embodiment 3 of the disclosure will be described as follows. Note that, for convenience of description, a member having the same function as that of the member described in the foregoing embodiment will be given the same reference sign and description thereof will be omitted.
The relationship between the reference pixel position (X, Y) in the output video and the corresponding position (x, y) in the input video is not limited to the parameter a described in Embodiments 1 and 2 above, as long as the condition that the enlargement ratio changes continuously with a change of (X, Y) is satisfied.
In the present embodiment, (X, Y) and (x, y) are associated with each other in accordance with the following formula.
The parameters φ and θ are the same as those described in Embodiment 2 above.
A change rate of θ is represented by the following formula.
According to the formula, the change rate of θ is largest when r is equal to 0 and smallest when r is equal to R. This indicates that the degree of stretching of the output video is smallest at the center (Px, Py) of the output video and increases as (X, Y) approaches the corner of the display unit 30.
The parameters φ and θ in the present embodiment are also described only by trigonometric functions, inverse trigonometric functions, and a square root of a sum of squares, similarly to the parameter a of Embodiment 2 above. Thus, by using existing algorithms, the enlargement ratio can be calculated through calculation processing with a relatively small load. The algorithm for calculating the enlargement ratio described in the present embodiment can be realized by a relatively small electronic circuit.
A control block (in particular, the video conversion unit 10 and the rendering unit 20) of the video display device 1 may be realized by a logic circuit (hardware) provided in an integrated circuit (IC chip) or the like or may be realized by software with use of a CPU (Central Processing Unit).
In the latter case, the video display device 1 includes a CPU that executes a command of a program that is software enabling each of functions, a ROM (Read Only Memory) or a storage device (each referred to as a “recording medium”) in which the program and various kinds of data are stored so as to be readable by a computer (or a CPU), a RAM (Random Access Memory) that develops the program, and the like. An object of the disclosure is achieved by a computer (or a CPU) reading and executing the program from the recording medium. As the recording medium, for example, a “non-transitory tangible medium” such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit may be used. The program may be supplied to the computer via any transmission medium (such as a communication network or a broadcast wave) which enables the program to be transmitted. Note that, the disclosure can also be achieved in a form of a data signal in which the program is embodied through electronic transmission and which is embedded in a carrier wave.
A video display device (1) according to an aspect 1 of the disclosure includes: a video enlargement unit (rendering unit 20) that increases the number of pixels of an input video input to the video display device and generates an output video obtained by enlarging the input video; and a display unit (30) that displays the output video generated by the video enlargement unit, in which the video enlargement unit reduces, at a position close to a reference position on the display unit, an enlargement ratio of the output video with respect to the input video, and increases, at a position away from the reference position, the enlargement ratio of the output video with respect to the input video to thereby continuously change the enlargement ratio of the output video with respect to the input video on the display unit.
According to the aforementioned configuration, the enlargement ratio of the output video with respect to the input video continuously changes on the display unit. The change cancels out a perspective effect caused when the display unit is seen from the reference position. Thus, in a case where a user sees the output video from a vicinity of the reference position or a case where the reference position is set so as to correspond to a viewpoint position of the user, a less-distorted output video is able to be displayed.
In the video display device according to an aspect 2 of the disclosure, in the aspect 1, the video enlargement unit may include: (a) a temporary storage unit (21) that stores data of the input video; (b) a pixel data extraction unit (pixel reference position control unit 23) that extracts, out of the data of the input video stored in the temporary storage unit, data of a pixel of the input video corresponding to a pixel interpolated to the output video; and (c) a pixel data interpolation unit (interpolation calculation unit 24) that generates data of the pixel, which is interpolated to the output video, on a basis of the data of the pixel of the input video extracted by the pixel data extraction unit, in which the pixel data extraction unit may select, on a basis of the enlargement ratio, one or more pixels of the input video corresponding to the pixel interpolated to the output video.
According to the aforementioned configuration, the data of the pixel interpolated to the output video is able to be generated on the basis of the data of the pixel of the input video.
In the video display device according to an aspect 3 of the disclosure, in the aspect 1 or 2, the reference position may be a position at which an eye position of a user is projected onto the display unit.
In the video display device according to an aspect 4 of the disclosure, in any of the aspects 1 to 3, the enlargement ratio may be calculated on a basis of a distance between the eye position of the user and the display unit.
According to the aforementioned configuration, the enlargement ratio is able to be increased as a position on the display unit is farther from the eye position of the user.
A method for controlling a video display device according to an aspect 5 of the disclosure includes: a video enlargement step of increasing the number of pixels of an input video input to the video display device and generating an output video obtained by enlarging the input video; and a display step of displaying the output video generated at the video enlargement step on a display unit, in which in the video enlargement step, at a position close to a reference position on the display unit, an enlargement ratio of the output video with respect to the input video is reduced and at a position away from the reference position, the enlargement ratio of the output video with respect to the input video is increased to thereby continuously change the enlargement ratio of the output video with respect to the input video on the display unit.
According to the aforementioned configuration, an effect similar to that of the video display device according to the aspect 1 is able to be exerted.
The video display device according to each aspect of the disclosure may be realized by a computer. In this case, a control program for the video display device that realizes the video display device by the computer by causing the computer to operate as each unit (software element) of the video display device, and a computer-readable recording medium in which the control program is recorded, are also included in the scope of the disclosure.
The disclosure is not limited to each of the embodiments described above, and may be modified in various manners within the scope indicated in the claims and an embodiment achieved by appropriately combining technical means disclosed in different embodiments is also encompassed in the technical scope of the disclosure. Further, by combining the technical means disclosed in each of the embodiments, a new technical feature may be formed.
This application claims the benefit of priority to Japanese Patent Application No. 2016-114833 filed on Jun. 8, 2016, the content of which is incorporated herein by reference in its entirety.
Number | Date | Country | Kind
--- | --- | --- | ---
2016-114833 | Jun. 8, 2016 | JP | national

Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/JP2017/016066 | Apr. 21, 2017 | WO | 00