The present disclosure relates to a three-dimensional display apparatus, a three-dimensional display system, a head-up display system, and a mobile body.
Conventionally, three-dimensional display apparatuses that display three-dimensional images without the need for a viewer to wear eyeglasses are known. Such a three-dimensional display apparatus includes an optical element configured to transmit a portion of image light emitted from an image display panel to a right eye and another portion of the image light emitted from the image display panel to a left eye. Generally, subpixels of the image display panel have a rectangular shape and are arranged in horizontal and vertical directions. In order to control the image light that reaches the right and left eyes, a parallax barrier having strip-like openings extending in the vertical direction is customarily used as the optical element. However, when the strip-like openings of the parallax barrier extend in the vertical direction and are arranged in the horizontal direction, problems are encountered, such as the occurrence of a moiré pattern that obscures the image. As such, a three-dimensional display apparatus has been suggested that includes an optical element in which strip-shaped parallax barriers extending in a diagonal direction of the subpixels of a display surface are arranged.
A three-dimensional display apparatus according to the present disclosure includes a display surface, an optical element, and a controller. The display surface is formed as a curved surface that does not have curvature in a first direction and has curvature in a plane orthogonal to the first direction. The display surface includes subpixels arranged in a grid pattern along the first direction and a direction in the display surface that is orthogonal to the first direction. The optical element is arranged to follow the curved surface and defines a beam direction of image light emitted from each of the subpixels for each of strip-shaped regions extending in a certain direction in a surface following the curved surface. The controller is configured to acquire a position of a user's eye and change an image displayed by each of the subpixels based on the display surface, the optical element, and the position of the user's eye.
A three-dimensional display system according to the present disclosure includes a detection apparatus and a three-dimensional display apparatus. The detection apparatus is configured to detect a position of a user's eye. The three-dimensional display apparatus includes a display surface, an optical element, and a controller. The display surface is formed as a curved surface that does not have curvature in a first direction and has curvature in a plane orthogonal to the first direction. The display surface includes subpixels arranged in a grid pattern along the first direction and a direction in the display surface that is orthogonal to the first direction. The optical element is arranged to follow the curved surface and defines a beam direction of image light emitted from each of the subpixels for each of strip-shaped regions extending in a certain direction in a surface following the curved surface. The controller is configured to acquire the position of the user's eye and change an image displayed by each of the subpixels based on the display surface, the optical element, and the position of the user's eye.
A head-up display system according to the present disclosure includes a three-dimensional display apparatus. The three-dimensional display apparatus includes a display surface, an optical element, and a controller. The display surface is formed as a curved surface that does not have curvature in a first direction and has curvature in a plane orthogonal to the first direction, and includes subpixels arranged in a grid pattern along the first direction and a direction in the display surface that is orthogonal to the first direction. The optical element is arranged to follow the curved surface and defines a beam direction of image light emitted from each of the subpixels for each of strip-shaped regions extending in a certain direction in a surface following the curved surface. The controller is configured to acquire a position of a user's eye and change an image displayed by each of the subpixels based on the display surface, the optical element, and the position of the user's eye.
A mobile body according to the present disclosure includes a three-dimensional display system. The three-dimensional display system includes a detection apparatus and a three-dimensional display apparatus. The detection apparatus is configured to detect a position of a user's eye. The three-dimensional display apparatus includes a display surface, an optical element, and a controller. The display surface is configured from a curved surface that does not curve along a first direction but curves within a plane orthogonal to the first direction, the display surface including subpixels arranged in a grid pattern along the first direction and a direction that is orthogonal to the first direction within the display surface. The optical element is arranged along the curved surface and defines a beam direction of image light emitted from each of the subpixels for each of strip-shaped regions extending in a certain direction within a plane that is along the curved surface. The controller is configured to acquire the position of the user's eye and change an image displayed by each of the subpixels based on the display surface, the optical element, and the position of the user's eye.
It is desirable to apply three-dimensional display apparatuses as described above to curved display panels, which are in increasing demand due to design requirements.
The present disclosure provides a three-dimensional display apparatus, a three-dimensional display system, a head-up display system, and a mobile body that satisfy the design requirements of a curved display panel and are capable of appropriately performing a three-dimensional display by suppressing the occurrence of a moiré pattern.
According to embodiments of the present disclosure, three-dimensional display can be appropriately performed while accommodating the design requirements of a curved display panel formed by a curved surface.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
A three-dimensional display system 1 according to the embodiments of the present disclosure includes a detection apparatus 2 and a three-dimensional display apparatus 3, as illustrated in
Hereinafter, a configuration of the three-dimensional display system 1 will be described in detail. The detection apparatus 2 illustrated in
The detection apparatus 2 may detect the position of at least one of the left eye and the right eye of the user as the coordinates in a three-dimensional space based on images captured by two or more cameras.
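For illustration only, the sketch below shows one common way such three-dimensional coordinates could be obtained from two cameras: triangulation with a rectified stereo pair. The camera parameters, pixel coordinates, and the helper function are assumptions made for this sketch; the disclosure does not specify how the detection apparatus 2 computes the position.

```python
# Minimal sketch (not necessarily the method used by the detection apparatus 2):
# triangulating an eye position from two parallel, rectified cameras.
# All parameter values below are hypothetical.

def triangulate_eye(u_left, v_left, u_right, focal_px, baseline_m, cx, cy):
    """Return (X, Y, Z) in metres for a pupil at pixel (u_left, v_left) in the
    left image and column u_right in the right image (rectified stereo pair)."""
    disparity = u_left - u_right            # horizontal disparity in pixels
    if disparity <= 0:
        raise ValueError("pupil must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth from similar triangles
    x = (u_left - cx) * z / focal_px        # lateral offset (pinhole model)
    y = (v_left - cy) * z / focal_px        # vertical offset (pinhole model)
    return x, y, z

# Hypothetical calibration: 1000 px focal length, 60 mm baseline,
# principal point at (640, 360).
print(triangulate_eye(u_left=700.0, v_left=380.0, u_right=660.0,
                      focal_px=1000.0, baseline_m=0.06, cx=640.0, cy=360.0))
```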
The detection apparatus 2 does not need to include a camera and can be connected to an external camera. The detection apparatus 2 may include an input terminal configured to receive a signal input from the external camera. The external camera may be directly connected to the input terminal. The external camera may be indirectly connected to the input terminal via a common network. The detection apparatus 2 that does not include a camera may include an input terminal through which the external camera inputs an image signal. The detection apparatus 2 that does not include a camera may detect at least one of positions of the left eye and the right eye of the user based on the image signal input to the input terminal.
The detection apparatus 2 may include, for example, a sensor. The sensor may be an ultrasonic sensor, an optical sensor, or the like. The detection apparatus 2 may detect a position of a user's head using the sensor and detect at least one of positions of the left eye and the right eye of the user based on the position of the head. The detection apparatus 2 may detect at least one of positions of the left eye and the right eye of the user as coordinates in a three-dimensional space using one or more sensors.
The three-dimensional display system 1 does not need to include the detection apparatus 2. When the three-dimensional display system 1 does not include the detection apparatus 2, the controller 7 may include an input terminal configured to receive a signal input from an external detection apparatus. The external detection apparatus may be connected to the input terminal. The external detection apparatus may use an electrical signal or an optical signal as a transmission signal to be transmitted to the input terminal. The external detection apparatus may be indirectly connected to the input terminal via a common network. The controller 7 may receive an input of position coordinates indicating at least one of a position of the left eye and a position of the right eye acquired from the external detection apparatus. The controller 7 may calculate a moving distance of the left eye and the right eye along the horizontal direction based on the position coordinates.
As illustrated in
The emitter 4 irradiates the surface of the display panel 5. The emitter 4 may include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like. In the emitter 4, light emitted by the light source is made uniform in the direction of the surface of the display panel 5 by the light guide plate, the diffusion plate, the diffusion sheet, and the like. The emitter 4 emits the homogenized light towards the display panel 5.
A display surface 51 of the display panel 5 is formed as a curved surface that does not have curvature in a first direction (an x-axis direction) and has a curvature in a plane (a yz plane) orthogonal to the first direction as illustrated in
As illustrated in
The display panel 5 includes the display surface 51 that includes subpixels arranged in a grid pattern along the first in-surface direction and the second in-surface direction. Although the display panel 5 is illustrated as a flat plane in
Each subdivision of the display panel 5 corresponds to one subpixel. The subpixels are arranged in a grid pattern in the first in-surface direction and the second in-surface direction. Each of the subpixels corresponds to one of the colors R (Red), G (Green), and B (Blue). One pixel may be configured as a combination of three subpixels respectively corresponding to R, G, and B. One pixel may be referred to as one image element. The display panel 5 is not limited to a transmitting liquid crystal panel and may be another type of display panel such as an organic EL display panel. When the display panel 5 is a self-luminous display panel, the emitter 4 may be omitted.
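As a small illustration of this arrangement, the sketch below builds a toy subpixel grid in which each column is assigned one of R, G, and B in repeating order, so that three horizontally adjacent subpixels form one pixel. The grid size and color order are hypothetical; the actual layout of the display panel 5 may differ.

```python
# Minimal sketch of a subpixel grid in which three horizontally adjacent
# subpixels (R, G, B) form one pixel. Grid size and color order are
# hypothetical; the actual layout of the display panel 5 may differ.

COLORS = ("R", "G", "B")

def build_subpixel_grid(columns, rows):
    """Return a list of rows; each entry is the color of one subpixel."""
    return [[COLORS[col % 3] for col in range(columns)] for _ in range(rows)]

grid = build_subpixel_grid(columns=12, rows=2)
print(grid[0])            # ['R', 'G', 'B', 'R', 'G', 'B', ...]
print(len(grid[0]) // 3)  # 4 pixels per row in this toy grid
```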
A parallax barrier 6 defines a beam direction of image light emitted from each of the subpixels. The parallax barrier 6 is formed by a curved surface which follows the display surface 51 and is arranged at a certain distance from the display surface 51 as illustrated in
In particular, the parallax barrier 6 includes light shielding surfaces 61 configured to block image light as illustrated in
The transmitting regions 62 are portions configured to transmit light incident on the parallax barrier 6. The transmitting regions 62 may transmit light at a transmittance of a first certain value or more. The first certain value may be, for example, 100% or a value close thereto. The light shielding surfaces 61 are portions configured to block the light incident on the parallax barrier 6. In other words, the light shielding surfaces 61 are configured to block the image displayed on the three-dimensional display apparatus 3. The light shielding surfaces 61 may block light at a transmittance of a second certain value or less. The second certain value may be, for example, 0% or a value close thereto.
The transmitting regions 62 and the light shielding surfaces 61 extend in a certain direction in a surface which follows the display surface 51 and are alternately arranged in a repeating manner in the direction orthogonal to the certain direction. The transmitting regions 62 are configured to define the respective beam directions of image light emitted from the subpixels.
In a case in which a line indicating an edge of the transmitting region 62 extends in the second in-surface direction, a moiré pattern may occur between an aperture pattern of the parallax barrier 6 and a pixel pattern displayed on the display panel 5. When the line indicating the edge of the transmitting region 62 extends in a certain direction having a certain angle other than 0 degrees with respect to the second in-surface direction, the moiré pattern that occurs in a displayed image can be reduced.
The parallax barrier 6 may be configured as a film or a plate-like member having a transmittance lower than the second certain value. In this case, the light shielding surfaces 61 are configured as the film or the plate-like member. The transmitting regions 62 are configured as openings formed in the film or the plate-like member. The film may be made of a resin or any appropriate material. The plate-like member may be made of a resin, a metal, or any appropriate material. The parallax barrier 6 is not limited to being configured as the film or the plate-like member and may be configured as a different type of member. The parallax barrier 6 may include a substrate having a light-shielding property or a light-shielding additive added thereto.
The parallax barrier 6 may be configured as a liquid crystal shutter. The liquid crystal shutter can control light transmittance in accordance with an applied voltage. The liquid crystal shutter may include pixels and control the light transmittance of each of the pixels. The liquid crystal shutter may form a high light-transmittance region or a low light-transmittance region in any shape. In a case in which the parallax barrier 6 is configured as a liquid crystal shutter, the transmitting regions 62 may be areas having a light transmittance of the first certain value or more. In a case in which the parallax barrier 6 is configured as a liquid crystal shutter, the light shielding surfaces 61 may be areas having a light transmittance of the second certain value or less.
When the parallax barrier 6 has the above configuration, the parallax barrier 6 causes image light emitted from some subpixels on the display surface 51 to pass through the transmitting regions 62 and reach the user's right eye. The parallax barrier 6 causes image light emitted from other subpixels to pass through the transmitting regions 62 and reach the user's left eye.
In particular, when image light transmitted through the transmitting regions 62 of the parallax barrier 6 reaches the user's left eye, the user's left eye can see the visible regions 51a corresponding to the transmitting regions 62, as illustrated in
Further, when image light from other subpixels transmitted through the transmitting regions 62 of the parallax barrier 6 reaches the user's right eye, the user's right eye can see the invisible regions 51b, which cannot be seen by the user's left eye. Because image light is blocked by the light shielding surfaces 61 of the parallax barrier 6, the user's right eye cannot see the visible regions 51a that can be seen by the user's left eye. Thus, when the subpixels in the visible regions 51a display the left-eye image and the subpixels in the invisible regions 51b display the right-eye image as described above, the user's right eye sees the right-eye image alone. That is, the user's left eye sees the left-eye image alone, and the user's right eye sees the right-eye image alone. Accordingly, the user recognizes a parallax image, i.e., a stereoscopic image.
The controller 7 is connected to and is configured to control each constituent element of the three-dimensional display system 1. The constituent elements controlled by the controller 7 include the detection apparatus 2 and the display panel 5. The controller 7 is configured as, for example, a processor. The controller 7 may include one or more processors. The processor may include a general-purpose processor configured to read a particular program and perform a particular function, or a specialized processor dedicated to particular processing. The specialized processor may include an application-specific integrated circuit (ASIC: Application Specific Integrated Circuit). The processor may include a programmable logic device (PLD: Programmable Logic Device). The PLD may include an FPGA (Field-Programmable Gate Array). The controller 7 may be configured as a SoC (System-on-a-Chip) or a SiP (System In a Package) in which one or more processors cooperate. The controller 7 may include a memory which stores various information and programs for operating each constituent element of the three-dimensional display system 1. The memory may be configured as, for example, a semiconductor memory. The memory may function as a working memory of the controller 7.
The controller 7 is configured to acquire a position of the user's eyes detected by the detection apparatus 2. The controller 7 is configured to change the image displayed by the subpixels in accordance with the position of the user's eyes. In particular, the controller 7 is configured to change the image displayed by the respective subpixels between the right-eye image and the left-eye image. Here, in order to explain a manner in which the controller 7 is configured to change the image displayed by the subpixels, the display panel 5 and the parallax barrier 6 will be first described in detail.
First, the display panel 5 and the parallax barrier 6 will be described with reference to
As illustrated in
On the display surface 51, the left-eye image is displayed by first subpixel groups Pg1 that include (n×b) number of subpixels (hereinafter, n×b=m) in which the n-number of subpixels are continuously arranged in the first in-surface direction and the b-number of subpixels are continuously arranged in the second in-surface direction. In the example illustrated in
The display surface 51 also includes second subpixel groups Pgr that are adjacent to the first subpixel groups Pg1 in the first in-surface direction and include the m-number of subpixels made up of the n-number of subpixels continuously arranged in the first in-surface direction and the b-number of subpixels continuously arranged in the second in-surface direction. The second subpixel groups Pgr display the right-eye image. In the display surface 51 of the example illustrated in
As described above, the n-number of subpixels repeatedly display the left-eye image in the first in-surface direction, and the n-number of subpixels adjacent thereto repeatedly display the right-eye image in the first in-surface direction.
Accordingly, an image pitch k between images adjacent to each other in the first in-surface direction is expressed by 2n×Hp. The image pitch k is expressed by 2×4×Hp=8×Hp in the example illustrated in
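The repeating left/right arrangement described above lends itself to a short sketch. With the n = 4 and b = 3 of the illustrated example, one subpixel group contains m = 12 subpixels and the image pitch is k = 2n × Hp = 8 × Hp. The horizontal pitch value and the helper function below are illustrative assumptions for the reference eye position, not the controller's actual algorithm.

```python
# Sketch of the repeating left/right arrangement along the first in-surface
# direction: n consecutive subpixels show the left-eye image, the next n show
# the right-eye image, and so on. Illustrative only (reference eye position).

N = 4          # subpixels per group in the first in-surface direction (example value)
B = 3          # subpixels per group in the second in-surface direction (example value)
HP_MM = 0.1    # horizontal subpixel pitch Hp, a hypothetical value in millimetres

M = N * B                        # subpixels per subpixel group (m = n x b = 12)
IMAGE_PITCH_MM = 2 * N * HP_MM   # image pitch k = 2n x Hp = 8 x Hp

def image_for_column(col):
    """Return 'L' for columns in a first (left-eye) group, 'R' otherwise."""
    return "L" if (col // N) % 2 == 0 else "R"

print(M, IMAGE_PITCH_MM)                        # 12 0.8
print([image_for_column(c) for c in range(16)]) # LLLL RRRR LLLL RRRR
```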
The display surface 51 and the parallax barrier 6 as described above are viewed as illustrated in
In the example illustrated in
The parallax barrier 6 is arranged to follow the display surface 51. Thus, the visible regions 51a of the left eye located at the reference position Ps include all of the subpixels P12 to P14, P22 to P24, P32 to P34, P42 to P44, and P52 to P54 in the example illustrated in
Here, a manner in which the controller 7 is configured to change an image displayed by each subpixel in accordance with vertical displacement of an eye detected by the detection apparatus 2 will be described. In the following description, a case in which the controller 7 is configured to change an image based on the displacement of the left eye from the reference position Ps will be described. Note that the controller 7 operates in a similar manner to change an image based on displacement of the right eye from the reference position Ps.
When the user's eyes are displaced in the vertical direction from the reference position Ps as illustrated in
Thus, when the user's eyes are displaced in the vertical direction, the controller 7 is configured to change the image displayed by the subpixels P15, P25, and P35, more than half of each of which is included in the visible regions 51a of the left eye, from the right-eye image to the left-eye image. At this time, the portions of the subpixels P16, P26, and P36 included in the visible regions 51a are smaller than half of the respective subpixels P16, P26, and P36. Thus, the controller 7 does not change the image displayed by the subpixels P16, P26, and P36 and causes these subpixels to maintain the right-eye image.
At this time, the controller 7 is configured to change the image displayed by the subpixels P11, P21, and P31, for which the portions included in the visible regions 51a are reduced, that is, the greater portions thereof become visible to the right eye, from the left-eye image to the right-eye image.
At a position further remote from the vertical reference position Oy, the visible regions 51a include only portions of the subpixels P45 and P55 when the eyes are located at the reference position Ps. However, the subpixels P45 and P55 in their entirety are included in the visible regions 51a when the eyes are located at the displaced position. Thus, the controller 7 is configured to change the images displayed by the subpixels P45 and P55 from the right-eye image to the left-eye image. Further, although the subpixels P46 and P56 are not included in the visible regions 51a when the eyes are located at the reference position Ps, more than half of each of the subpixels P46 and P56 is included in the visible regions 51a when the eyes are located at the displaced position. Thus, the controller 7 is also configured to change the images displayed by the subpixels P46 and P56 from the right-eye image to the left-eye image.
In addition, the controller 7 is configured to change the images displayed by the subpixels P41 and P51, which are no longer included in the visible regions 51a, that is, which are now visible to the right eye, from the left-eye image to the right-eye image. Further, the controller 7 is configured to change the images displayed by the subpixels P42 and P52, for which the portions included in the visible regions 51a are reduced, that is, the greater portions thereof become visible to the right eye, from the left-eye image to the right-eye image.
As described above, the controller 7 is configured to change the images displayed by the subpixels based on the distance from the vertical reference position Oy. In particular, when the user's eyes move in the vertical direction from the reference position Ps, the controller 7 is configured to switch the image displayed by the number of subpixels corresponding to the vertical displacement amount of the eye, out of the group of subpixels arranged in the horizontal direction, between the right-eye image and the left-eye image. More of the subpixels located remote from the vertical reference position Oy of the display surface 51 need to change the displayed image than the subpixels located near the vertical reference position Oy of the display surface 51. In other words, the farther a group of subpixels is located from the vertical reference position Oy in the vertical direction, the shorter the intervals of eye displacement at which the controller 7 needs to change the displayed images. Because the controller 7 controls the images displayed by the subpixels as described above, the occurrence of crosstalk can be reduced in the three-dimensional display apparatus 3 having the display surface 51 formed as the curved surface.
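A minimal sketch of this switching rule, under the simplifying assumption that the controller works from the fraction of each subpixel that lies inside the left-eye visible region 51a, is given below. The coverage values are hypothetical inputs; deriving them from the detected eye position and the barrier geometry is outside this sketch.

```python
# Sketch of the majority-coverage switching rule described above: a subpixel
# displays the left-eye image when more than half of it lies inside the
# left-eye visible region 51a, and the right-eye image otherwise.
# The coverage fractions here are hypothetical inputs.

def assign_images(left_eye_coverage):
    """left_eye_coverage: {subpixel_name: fraction of the subpixel inside 51a}."""
    return {name: ("L" if frac > 0.5 else "R")
            for name, frac in left_eye_coverage.items()}

# Hypothetical coverages after a small vertical eye displacement: subpixels far
# from the vertical reference position Oy have shifted the most.
coverage = {"P15": 0.6, "P16": 0.3,   # near Oy: only P15 crosses the half mark
            "P45": 1.0, "P46": 0.7,   # remote from Oy: both P45 and P46 switch
            "P41": 0.0, "P42": 0.4}   # these become mostly visible to the right eye
print(assign_images(coverage))
```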
Next, the manner in which the controller 7 is configured to change the images displayed by the subpixels based on a displacement of the eyes in the depth direction toward the display surface 51, as detected by the detection apparatus 2, will be described.
First, a preferred viewing distance dy in a case in which the display surface 51 is a flat surface rather than a curved surface will be described with reference to
dy : Bpy = (dy + g) : ky    Equation (1)
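Equation (1) can be rearranged as Bpy = ky × dy / (dy + g), or equivalently dy = g × Bpy / (ky − Bpy). The sketch below evaluates both forms. The symbol readings (dy: preferred viewing distance, g: gap between the display surface 51 and the parallax barrier 6, Bpy: vertical pitch of the transmitting regions 62, ky: vertical image pitch) and the numeric values are assumptions for illustration, since the defining passage is not reproduced in this excerpt.

```python
# Sketch of Equation (1), dy : Bpy = (dy + g) : ky, rearranged two ways.
# Symbol interpretation and numbers are assumptions for illustration only:
#   dy : preferred viewing distance, g : gap between display surface and barrier,
#   Bpy: vertical pitch of the transmitting regions, ky: vertical image pitch.

def barrier_pitch(ky, dy, g):
    """Bpy = ky * dy / (dy + g), from dy * ky = Bpy * (dy + g)."""
    return ky * dy / (dy + g)

def preferred_viewing_distance(bpy, ky, g):
    """dy = g * Bpy / (ky - Bpy), the same relation solved for dy."""
    return g * bpy / (ky - bpy)

ky, dy, g = 0.6, 600.0, 3.0       # hypothetical values in millimetres
bpy = barrier_pitch(ky, dy, g)
print(round(bpy, 5))                                     # ~0.59701
print(round(preferred_viewing_distance(bpy, ky, g), 3))  # recovers ~600.0
```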
In the present embodiment, the display panel 5 is formed as a curved surface that has an arc shape in the yz plane as described above. The display panel 5 formed as the curved surface includes subpixels arranged in regions that are divided at equal intervals in the first in-surface direction and the second in-surface direction.
Thus, when the user's eyes are located at the reference position Ps as illustrated in
In this state, when the viewing distance becomes shorter than the preferred viewing distance dy as illustrated in
Further, when the user's eyes are located at the displaced position in the depth direction, the visible region 51a(2) located remote from the vertical reference position Oy includes two subpixels that display the left-eye image and four subpixels that display the right-eye image. Thus, the crosstalk that occurs in the user's eyes due to image light from the visible region 51a(2) located remote from the vertical reference position Oy is greater than the crosstalk that occurs in the user's eyes due to image light from the visible region 51a(1). As such, the controller 7 causes the four subpixels that are included in the visible region 51a(2) and display the right-eye image to display the left-eye image. Also, the controller 7 causes the subpixels that are no longer included in the visible region 51a, i.e., the subpixels that became visible to the right eye, to change their displays from the left-eye image to the right-eye image. This reduces the number of right-eye images viewed by the left eye and the number of left-eye images viewed by the right eye. Accordingly, the crosstalk that occurs in the user's eyes is reduced.
As described above, when the user's eyes move in the depth direction approaching the display surface 51 from the reference position Ps, the controller 7 is configured to change the image displayed by zero or one subpixel in the visible region 51a(1) located in the vicinity of the vertical reference position Oy. On the other hand, the controller 7 causes the four subpixels in the visible region 51a(2) located remote from the vertical reference position Oy to change the displayed image. That is, the controller 7 is configured to change the image displayed by the number of subpixels corresponding to the displacement amount of the eyes in the depth direction, out of the subpixels arranged in the vertical direction, between the right-eye image and the left-eye image. More of the subpixels located remote from the vertical reference position Oy need to change their displayed images than the subpixels located in the vicinity of the vertical reference position Oy on the display surface 51. In other words, the farther away from the vertical reference position Oy a subpixel group is located, the shorter the intervals of displacement in the depth direction at which the controller 7 needs to change the displayed images. Because the controller 7 controls the images displayed by the subpixels as described above, the occurrence of crosstalk is suppressed in the three-dimensional display apparatus 3 that includes the display surface 51 having a curved shape.
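To give a concrete, if highly simplified, picture of why more of the remote subpixels must change, the sketch below models the display surface in the yz plane as a circular arc and the parallax barrier as a concentric arc at gap g, then traces rays from the eye through one transmitting region to find where its visible region lands before and after the eye moves in the depth direction. The concentric-arc model, the dimensions, and the eye positions are all assumptions for illustration and are not taken from the disclosure.

```python
import math

# Sketch (illustrative assumptions only): the display surface is modelled in the
# yz plane as a circular arc of radius R about the origin, and the parallax
# barrier as a concentric arc at radius R - g on the eye side. Rays are cast
# from the eye through the two edges of one transmitting region and intersected
# with the display arc to obtain the visible-region endpoints.

R_MM = 300.0              # radius of the display arc (hypothetical)
G_MM = 3.0                # gap between barrier arc and display arc (hypothetical)
SLIT_HALF_ANGLE = 0.0005  # angular half-width of one transmitting region (rad)

def point_on_arc(radius, angle):
    """Point (y, z) on an arc of the given radius at the given angle in radians."""
    return radius * math.sin(angle), radius * math.cos(angle)

def hit_display_arc(eye, through, radius):
    """Angle on the display arc hit by the ray from `eye` through `through`."""
    ey, ez = eye
    uy, uz = through[0] - ey, through[1] - ez
    # Solve |eye + t*u|^2 = radius^2 and keep the forward (t > 0) intersection.
    a = uy * uy + uz * uz
    b = 2.0 * (ey * uy + ez * uz)
    c = ey * ey + ez * ez - radius * radius
    t = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return math.atan2(ey + t * uy, ez + t * uz)

def visible_interval(eye, slit_angle):
    """Arc-length interval (mm) on the display seen through one slit."""
    edges = (slit_angle - SLIT_HALF_ANGLE, slit_angle + SLIT_HALF_ANGLE)
    return tuple(R_MM * hit_display_arc(eye, point_on_arc(R_MM - G_MM, a), R_MM)
                 for a in edges)

# Compare one slit near the vertical reference position Oy (angle 0) with one
# remote from it, for an eye on the z axis before and after moving closer to
# the display in the depth direction. The remote slit's visible region shifts
# more, which is why more of the remote subpixels must switch their images.
for slit_angle in (0.0, 0.3):
    before = visible_interval((0.0, -200.0), slit_angle)
    after = visible_interval((0.0, -150.0), slit_angle)
    print(f"slit at {slit_angle} rad: endpoint shift = {after[0] - before[0]:.4f} mm")
```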
Although a case in which an image is changed when the user's eyes move in the direction approaching the display surface 51 has been described above, the image is changed in a similar manner in a case in which the user's eyes move in a direction away from the display surface 51. That is, on the display surface 51, the controller 7 needs to cause more of the subpixels located remote from the vertical reference position Oy to change the display than the subpixels located in the vicinity of the vertical reference position Oy. Thus, the farther away from the vertical reference position Oy in the vertical direction a subpixel group is located, the smaller the displacement amount of the eyes in the depth direction in response to which the controller 7 needs to change the displayed image.
Although the above embodiments have been described as representative examples, it will be apparent to those skilled in the art that various modifications and substitutions can be made within the spirit and scope of the present disclosure. Thus, the above embodiments should not be construed as limiting the present disclosure and may be varied or changed in a variety of manners without departing from the scope of the appended claims. For example, constituent blocks described in the embodiments may be combined into one constituent block, or one constituent block may be subdivided into a plurality of constituent blocks.
Although in the above embodiments the optical element is configured as the parallax barrier 6, this is not restrictive. For example, the optical element included in the three-dimensional display apparatus 3 may be configured as a lenticular lens 9. In this case, the lenticular lens 9 is formed by cylindrical lenses 10 arranged in the x-y plane as illustrated in
Further, the three-dimensional display system 1 may be mounted in a head-up display system 100 as illustrated in
Also, the HUD 100 and the three-dimensional display system 1 may be mounted in a mobile body as illustrated in
The present application is a Continuing Application based on International Application PCT/JP2018/016860 filed on Apr. 25, 2018, which in turn claims priority to Japanese Patent Application No. 2017-087699 filed on Apr. 26, 2017, the entire disclosure of these earlier applications being incorporated herein by reference.