The present application claims priority to Chinese patent application No. 201510058988.1 filed on Feb. 4, 2015, the disclosure of which is incorporated herein by reference.
The present application relates to the field of display technology, and more particularly to a display device and a display method for three-dimensional (3D) display.
When an image is displayed on a conventional display device, a viewer observes the same displayed image from any position in front of the screen of the display device, i.e., the image observed by the viewer is independent of the viewer's visual angle and remains unchanged, which fails to present the viewer with a visual experience resembling the real world.
In view of the above, it is an object of the present application to provide a display method and a display device capable of displaying different images corresponding to different visual angles based on position information with respect to the viewer's eyes.
To solve the above technical problem, the present disclosure provides a display method, including the steps of:
Alternatively, the step of obtaining the screen display image corresponding to the visual angle of the eyes based on the position information with respect to the eyes and parameters for the stereo image to be displayed may include:
Alternatively, the coordinates (X1,Y1) of the display position M′ of each pixel M within the stereo image to be displayed on the display screen may be calculated using the following equations:
X1=(L/tan ∠AXZ)−[L(L/tan ∠AXZ−b1)/(L+a)]
Y1=(L/tan ∠AYZ)−[L(L/tan ∠AYZ−b2)/(L+a)],
Alternatively, ∠AXZ and ∠AYZ may be calculated using the following equations:
tan ∠AXZ=L/x
tan ∠AYZ=L/y,
Alternatively, the distances between the pixel M and the detection point in the X-axis, Y-axis and Z-axis directions may be obtained by:
Alternatively, the step of detecting the position information with respect to the viewer's eyes may include:
Alternatively, the step of detecting the position information with respect to the viewer's eyes may include:
The present disclosure further provides a display device, including:
Alternatively, the processing unit may be further configured to:
Alternatively, the coordinates (X1,Y1) of the display position M′ of each pixel M within the stereo image to be displayed on the display screen may be calculated by the processing unit using the following equations:
X1=(L/tan ∠AXZ)−[L(L/tan ∠AXZ−b1)/(L+a)]
Y1=(L/tan ∠AYZ)−[L(L/tan ∠AYZ−b2)/(L+a)],
Alternatively, ∠AXZ and ∠AYZ may be calculated by the processing unit using the following equations:
tan ∠AXZ=L/x
tan ∠AYZ=L/y,
Alternatively, the processing unit may be further configured to:
Alternatively, the detection unit may include:
Alternatively, the detection unit may include:
In order to illustrate the technical solutions of the present disclosure or the related art more clearly, the drawings required for the embodiments will be described briefly hereinafter. Obviously, the following drawings merely relate to some embodiments of the present disclosure, and based on these drawings, a person skilled in the art may obtain other drawings without any creative effort.
The present disclosure will be described hereinafter in conjunction with the drawings and embodiments. The following embodiments are for illustrative purposes only, but shall not be used to limit the scope of the present disclosure.
In order to make the objects, the technical solutions and the advantages of the present disclosure more apparent, the present disclosure will be described hereinafter in a clear and complete manner in conjunction with the drawings and embodiments. Obviously, the following embodiments are merely a part of, rather than all of, the embodiments of the present disclosure, and based on these embodiments, a person skilled in the art may obtain the other embodiments, which also fall within the scope of the present disclosure.
Unless otherwise defined, any technical or scientific term used herein shall have the common meaning understood by a person of ordinary skill in the art. Such words as “first” and “second” used in the specification and claims are merely used to differentiate different components rather than to represent any order, number or importance. Similarly, such words as “one” or “one of” are merely used to represent the existence of at least one member, rather than to limit the number thereof. Such words as “connect” or “connected to” may include electrical connection, direct or indirect, rather than being limited to physical or mechanical connection. Such words as “on”, “under”, “left” and “right” are merely used to represent a relative position relationship, and when an absolute position of a described object is changed, the relative position relationship changes accordingly.
Referring to
Step S11: detecting position information with respect to the viewer's eyes.
Herein, the position information with respect to the viewer's eyes may include the position coordinates of the viewer's eyes.
Step S12: obtaining a screen display image corresponding to a visual angle of the eyes based on the position information with respect to the eyes and parameters for a stereo image to be displayed.
Herein, the parameters for the stereo image to be displayed may include a position relation between each pixel M in the stereo image to be displayed and the display screen, and such a position relation may be obtained based on a 3D model of the stereo image to be displayed.
Step S13: displaying the screen display image.
Referring to
In this embodiment, the screen display image corresponding to the visual angle of the viewer's eyes is displayed based on the position information with respect to the eyes, instead of the same screen display image being displayed irrespective of the visual angle, which may present the viewer with a visual experience resembling the real world.
In step S12, the step of obtaining a screen display image corresponding to a visual angle of the eyes based on the position information with respect to the eyes and parameters for a stereo image to be displayed may include:
Step S121: calculating coordinates of a display position M′ of each pixel M within the stereo image to be displayed on the display screen based on the position information with respect to the eyes and the parameters for the stereo image to be displayed; and
Step S122: obtaining the screen display image corresponding to the visual angle of the eyes based on the coordinates of the display position M′ of each pixel M on the display screen.
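Steps S121 and S122 can be sketched as follows. This is an illustrative sketch only: it assumes the stereo image is supplied as a list of colored points with coordinates (b1, b2, a) relative to the detection point (a being the depth behind the screen plane), and it uses the perspective projection relation derived in the detailed description; none of the function or parameter names appear in the disclosure.

```python
def build_screen_image(eye, pixels, bounds):
    """Sketch of steps S121-S122.

    eye:    (x, y, L) position of the viewer's eyes from step S11.
    pixels: iterable of (color, (b1, b2, a)) for each pixel M of the
            stereo image, relative to the detection point.
    bounds: (x_min, x_max, y_min, y_max) signed screen-edge coordinates
            relative to the detection point (an assumption; the disclosure
            states the edge distances e1, f1, e2, f2 without signs).
    """
    x, y, L = eye
    x_min, x_max, y_min, y_max = bounds
    image = []
    for color, (b1, b2, a) in pixels:
        # Step S121: project pixel M toward the eyes onto the screen plane Z = 0.
        X1 = x - L * (x - b1) / (L + a)
        Y1 = y - L * (y - b2) / (L + a)
        # Step S122: keep only display positions that fall on the screen.
        if x_min <= X1 <= x_max and y_min <= Y1 <= y_max:
            image.append((color, (X1, Y1)))
    return image
```

Pixels whose projected position M′ falls outside the screen bounds are simply not displayed, matching the edge conditions given later in the detailed description.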
In this embodiment, the detection unit may adopt one camera to obtain an image of the viewer's eyes, and the position information with respect to the viewer's eyes may then be calculated based on the image obtained by that camera. However, the position information calculated from the image obtained by a single camera may be somewhat inaccurate. Thus, in this embodiment, the detection unit may adopt two cameras to obtain images of the viewer's eyes, and the position information with respect to the viewer's eyes may be calculated based on the images obtained by the two cameras, so as to achieve a more accurate result.
Alternatively, when the position information with respect to the viewer's eyes is obtained with one camera, the camera is provided in the middle of an upper edge or a lower edge of a frame of the display screen, within the plane of the display screen.
Alternatively, when the position information with respect to the viewer's eyes is obtained with two cameras, the two cameras are provided at two ends of the upper edge of the frame of the display screen, or at two ends of the lower edge of the frame, respectively; both cameras are provided within the plane of the display screen, and the line connecting the two cameras is substantially parallel to the lower edge of the frame of the display screen.
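The disclosure does not specify how the two camera images yield the eye coordinates. The following is a minimal sketch under common simplifying assumptions: two identical pinhole cameras on a horizontal baseline, with known baseline length and focal length, and depth recovered from disparity. All names and the disparity model are assumptions, not part of the disclosure.

```python
def triangulate_eye(u_left, u_right, v, baseline, focal_px):
    """Estimate eye coordinates (x, y, L) relative to the midpoint of the
    line connecting the two cameras (the detection point).

    u_left, u_right: horizontal pixel offsets of the eyes from each camera's
        optical center (left camera assumed at x = -baseline/2).
    v: vertical pixel offset (identical in both cameras when the baseline
        is horizontal and the cameras are aligned).
    baseline: distance between the two cameras, in scene units.
    focal_px: camera focal length expressed in pixels.
    """
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("eyes must be in front of both cameras")
    L = focal_px * baseline / disparity       # depth from disparity
    x = u_left * L / focal_px - baseline / 2  # shift to the midpoint origin
    y = v * L / focal_px
    return x, y, L
```

In practice a face- or eye-detection step would first locate the eyes in each camera image; the sketch starts from those pixel offsets.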
Hereinafter, the embodiments are described taking as an example the case where the position information with respect to the viewer's eyes is obtained with the two cameras.
In the following, the method of calculating the coordinates of the display position M′ of each pixel M within the stereo image to be displayed on the display screen is described in detail.
Before explaining the method of calculating the coordinates of the display position M′ of each pixel M on the display screen, the coordinate system adopted by the display method of the embodiment is first introduced.
In this embodiment, a 3D coordinate system is first established, wherein the midpoint of the line connecting the two cameras may be adopted as the origin of the coordinate system (hereinafter also referred to as the detection point). It is appreciated that, in other embodiments of the present disclosure, another position, for example the intersection point of the diagonals of the display screen, may be adopted as the origin.
In this 3D coordinate system, an X-axis direction indicates a horizontal direction in a plane of the display screen, a Y-axis direction indicates a vertical direction in the plane of the display screen, and a Z-axis direction indicates a direction perpendicular to the plane of the display screen.
Referring to
As illustrated in
tan ∠AYZ=L/(b2+c)=L/(Y1+d) (1)
Herein, L represents the distance between the position of the eyes and the detection point in the Z-axis direction, b2 represents the distance between the pixel M and the detection point in the Y-axis direction, c represents the distance in the Y-axis direction between the position of the eyes and the pixel M, and d represents the distance in the Y-axis direction between the position of the eyes and the display position M′.
Furthermore, the following equation (2) may be obtained from
tan ∠B=L/d=(L+a)/c (2)
The following equation (3) may be obtained based on the equations (1) and (2):
Y1=(L/tan ∠AYZ)−[L(L/tan ∠AYZ−b2)/(L+a)] (3)
That is, the Y-axis coordinate of the display position M′ of the pixel M on the display screen may be calculated using the above equation (3).
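The derivation can be checked numerically: equation (3) should agree with a direct intersection of the line from the eyes to the pixel M with the screen plane Z = 0. A small sketch with arbitrary example values (not taken from the disclosure):

```python
# Numeric sanity check of equation (3): the display position obtained from the
# similar-triangle relations should match a direct line-plane intersection.
import math

L, a = 2.0, 0.5      # eye depth and pixel depth relative to the screen plane
y, b2 = 0.3, 0.1     # Y-coordinates of the eyes and of pixel M
angle_AYZ = math.atan2(L, y)   # chosen so that tan(angle_AYZ) = L / y

# Equation (3)
Y1 = (L / math.tan(angle_AYZ)) - (L * (L / math.tan(angle_AYZ) - b2) / (L + a))

# Direct intersection: parametrize the line from the eyes (y, L) to M (b2, -a);
# the line crosses the screen plane Z = 0 at parameter t = L / (L + a).
t = L / (L + a)
Y1_direct = y + t * (b2 - y)

assert abs(Y1 - Y1_direct) < 1e-12
```

Both routes give the same Y1, confirming that equation (3) is simply the perspective projection of M onto the screen toward the eyes.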
Herein, if Y1=e1 or Y1=f1, it is indicated that M′ is on an edge of the frame of the display screen; and if Y1<e1 or Y1>f1, it is indicated that M′ is not on the display screen, i.e., the pixel M is not displayed on the display screen, wherein e1 represents the vertical distance between the detection point and the side of the display screen closer to the detection point, and f1 represents the vertical distance between the detection point and the side of the display screen farther from the detection point.
Referring to
X1=(L/tan ∠AXZ)−[L(L/tan ∠AXZ−b1)/(L+a)]
Herein, X1 represents the X-axis coordinate of M′, L represents the distance between the position of the eyes and the detection point in the Z-axis direction, ∠AXZ represents the angle corresponding to the X-axis component of the position of the eyes relative to the detection point, a represents the distance between the pixel M and the detection point in the Z-axis direction, b1 represents the distance between the pixel M and the detection point in the X-axis direction, and Y1 represents the Y-axis coordinate of M′.
Similarly, if X1=e2 or X1=f2, it is indicated that M′ is on an edge of the frame of the display screen; and if X1<e2 or X1>f2, it is indicated that M′ is not on the display screen, i.e., the pixel M is not displayed on the display screen, wherein e2 represents the vertical distance between a first side of the display screen and the detection point, and f2 represents the vertical distance between a second side of the display screen and the detection point.
The above L, ∠AXZ and ∠AYZ are obtained based on the position information of the eyes, and a, b1 and b2 are obtained based on the parameters for the stereo image to be displayed.
Referring to
It can be seen from
tan ∠AXZ=L/x
tan ∠AYZ=L/y
Herein, P (x, y, L) represents position coordinates of the eyes in the position information with respect to the eyes, x represents a distance between the position of the eyes and the detection point in the X-axis direction, y represents a distance between the position of the eyes and the detection point in the Y-axis direction, and L represents a distance between the position of the eyes and the detection point in the Z-axis direction.
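Substituting tan ∠AXZ = L/x and tan ∠AYZ = L/y into the equations for X1 and Y1 reduces them to a plain perspective projection toward the position of the eyes. A sketch verifying that the two forms agree (the function name and the simplified form are illustrative additions, not part of the disclosure; the tangent form assumes x ≠ 0 and y ≠ 0, whereas the simplified form has no such restriction):

```python
import math

def display_position(x, y, L, b1, b2, a):
    """Display position M' of pixel M(b1, b2, -a) as seen from eyes P(x, y, L).

    Computed both via the tangent form of the disclosure's equations and via
    the algebraically simplified perspective form; the two must agree.
    """
    tan_AXZ = L / x
    tan_AYZ = L / y
    X1 = (L / tan_AXZ) - L * (L / tan_AXZ - b1) / (L + a)
    Y1 = (L / tan_AYZ) - L * (L / tan_AYZ - b2) / (L + a)

    # Simplified form: L / tan_AXZ = x and L / tan_AYZ = y, so each
    # coordinate is just the eye coordinate pulled toward the pixel.
    X1_simple = x - L * (x - b1) / (L + a)
    Y1_simple = y - L * (y - b2) / (L + a)
    assert math.isclose(X1, X1_simple) and math.isclose(Y1, Y1_simple)
    return X1, Y1
```

The simplified form makes clear that a, b1, b2 come from the 3D model of the stereo image while x, y, L come from the detected eye position, matching the split stated above.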
In the above embodiment, the distances between the pixel M and the detection point in the X-axis, Y-axis and Z-axis directions are obtained by:
As illustrated in
Herein, those skilled in the art may understand that the detection unit 61 may be a camera, a webcam, a camcorder, and so forth; and the display unit 63 may be a cell phone, a computer, a television, a digital camera, and so forth. However, the present disclosure is not limited thereto.
Furthermore, those skilled in the art may further understand that the processing unit 62 may be implemented as hardware, firmware, software, or a combination thereof, in any computing device (including a processor, a storage medium, etc.) or a network of computing devices, as may be realized by those skilled in the art with basic programming skills under the teaching of the present disclosure.
In this embodiment of the present disclosure, the screen display image corresponding to the visual angle of the viewer's eyes is displayed based on the position information with respect to the eyes, instead of the same screen display image being displayed irrespective of the visual angle, which may present the viewer with a visual experience resembling the real world.
Alternatively, the processing unit 62 is further configured to:
Alternatively, the coordinates (X1,Y1) of the display position M′ of each pixel M within the stereo image to be displayed on the display screen are calculated by the processing unit 62 using the following equations:
X1=(L/tan ∠AXZ)−[L(L/tan ∠AXZ−b1)/(L+a)]
Y1=(L/tan ∠AYZ)−[L(L/tan ∠AYZ−b2)/(L+a)],
In a coordinate system (X, Y, Z), the X-axis direction indicates a horizontal direction in a plane of the display screen, the Y-axis direction indicates a vertical direction in the plane of the display screen, the Z-axis direction indicates a direction perpendicular to the plane of the display screen, and the detection point indicates an origin of the coordinate system and is on the plane of the display screen; and
Alternatively, ∠AXZ and ∠AYZ are calculated by the processing unit 62 using the following equations:
tan ∠AXZ=L/x
tan ∠AYZ=L/y,
Alternatively, the processing unit 62 is further configured to:
Alternatively, the detection unit 61 includes:
Alternatively, the detection unit 61 includes:
The above are merely the preferred embodiments of the present disclosure. It should be appreciated that, a person skilled in the art may make further improvements and modifications without departing from the principle of the present disclosure, and these improvements and modifications shall also fall within the scope of the present disclosure.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
201510058988.1 | Feb. 4, 2015 | CN | national

U.S. Patent Application Publications

Number | Name | Date | Kind
---|---|---|---
20140176676 | Lin | Jun. 2014 | A1
20140192168 | Shimoyama | Jul. 2014 | A1

Foreign Patent Documents

Number | Date | Country
---|---|---
101931823 | Dec. 2010 | CN
102520970 | Jun. 2012 | CN
103354616 | Oct. 2013 | CN
0874303 | Oct. 1998 | EP

Other Publications

First Office Action regarding Chinese Application No. 201510058988.1, dated May 2, 2017. Translation provided by Dragon Intellectual Property Law Firm.

Prior Publication Data

Number | Date | Country
---|---|---
20160227204 A1 | Aug. 2016 | US