The present disclosure relates to a three-dimensional display device, a three-dimensional display system, a head-up display system, and a movable object.
This application claims priority to and the benefit of Japanese Patent Application No. 2018-149305 (filed on Aug. 8, 2018), the entire contents of which are incorporated herein by reference.
A known three-dimensional (3D) display device includes an optical element that directs a part of light from a display panel to reach a right eye and another part of the light from the display panel to reach a left eye (refer to Patent Literature 1).
The invention provides a three-dimensional display device according to claim 1, a three-dimensional display system according to claim 10, a head-up display system according to claim 14, and a movable object according to claim 15. Further embodiments are described in the dependent claims.
A three-dimensional display device according to a first aspect of the present disclosure includes a display panel, a shutter panel, and a controller. The display panel includes subpixels that display a parallax image including a first image and a second image having parallax between the images (i.e. between the first image and the second image). The shutter panel is configured to define a traveling direction of image light representing the parallax image from the display panel. The controller changes, in a certain time cycle (e.g. after every lapse of a time duration of the certain time cycle and/or with a frequency corresponding to the certain time cycle), areas on the shutter panel in a light transmissive state to transmit the image light with at least a certain transmittance and areas on the shutter panel in a light attenuating state to transmit the image light with a transmittance lower than the transmittance in the light transmissive state. The controller is configured to change the areas in the light transmissive state and the areas in the light attenuating state, and changes the subpixels to display the first image and the second image based on positions of the areas in the light transmissive state. E.g., the controller changes the areas in the light transmissive state and the areas in the light attenuating state, and changes the subpixels to display the first image and the second image based on new/changed positions of the areas in the light transmissive state. E.g., the controller changes the areas in the light transmissive state and the areas in the light attenuating state, and correspondingly changes the subpixels to display the first image and the second image. E.g., the controller changes the areas in the light transmissive state and the areas in the light attenuating state, and changes the subpixels to display the first image and the second image depending on a change of the areas in the light transmissive state and the areas in the light attenuating state.
According to a second aspect, in the three-dimensional display device according to the first aspect, the controller changes at least selected ones of the areas in the light transmissive state to the light attenuating state, and changes selected ones of the areas in the light attenuating state with the same total size as the at least selected ones of the areas (e.g., areas (to be) changed from the light transmissive state to the light attenuating state) to the light transmissive state.
According to a third aspect, in the three-dimensional display device according to the second aspect, the controller changes all the areas in the light transmissive state to the light attenuating state, and changes selected ones of the areas in the light attenuating state with the same total size as the areas in the light transmissive state (e.g., areas (to be) changed from the light transmissive state to the light attenuating state) to the light transmissive state.
According to a fourth aspect, in the three-dimensional display device according to any of the first aspect to the third aspect, the controller is configured to control the shutter panel to cause a difference in a duration for which each of the areas on the display panel is in the light transmissive state (e.g. in a state for displaying a parallax image to be transmitted through areas of the shutter panel in the light transmissive state) to fall within a certain range. E.g., in an exemplary embodiment, the certain range may be less than or equal to 5%, e.g. less than or equal to 4%, e.g. less than or equal to 3%, e.g. less than or equal to 2%, e.g. less than or equal to 1%, e.g. respectively over an operation period of at least 30 minutes, e.g. at least 1 hour.
According to a fifth aspect, in the three-dimensional display device according to any of the first aspect to the fourth aspect, the controller changes the certain time cycle based on at least one of first information associated with performance degradation or a failure factor for the display panel and date and time information.
According to a sixth aspect, in the three-dimensional display device according to the second aspect, the controller determines a size of the at least selected ones of the areas to be changed from the light transmissive state to the light attenuating state based on at least one of the first information associated with performance degradation or a failure factor for the display panel and the date and time information.
According to a seventh aspect, in the three-dimensional display device according to any of the fifth aspect or the sixth aspect, the first information includes (e.g. includes one or more of) a temperature (e.g. a device temperature, e.g. a device temperature of the display panel), a movement direction, and at least one of a latitude and a longitude of the display panel, as well as an air temperature and an illuminance around the display panel, and weather, and the date and time information includes a date and time.
A three-dimensional display system according to an eighth aspect of the present disclosure includes a position detection device and a three-dimensional display device. The position detection device is configured to detect a position of an eye of a user. The three-dimensional display device includes a display panel, a shutter panel, and a controller. The display panel includes subpixels that display a parallax image including a first image and a second image having parallax between the images. The shutter panel is configured to define a traveling direction of image light representing the parallax image from the display panel. The controller changes, in a certain time cycle, areas on the shutter panel in a light transmissive state to transmit the image light with at least a certain transmittance and areas on the shutter panel in a light attenuating state to transmit the image light with a transmittance lower than the transmittance in the light transmissive state. The controller changes the subpixels to display the first image and the second image based on positions of the areas in the light transmissive state.
A head-up display system according to a ninth aspect of the present disclosure includes a three-dimensional display device. The three-dimensional display device includes a display panel, a shutter panel, and a controller. The display panel includes subpixels that display a parallax image including a first image and a second image having parallax between the images. The shutter panel is configured to define a traveling direction of image light representing the parallax image from the display panel. The controller changes, in a certain time cycle, areas on the shutter panel in a light transmissive state to transmit the image light with at least a certain transmittance and areas on the shutter panel in a light attenuating state to transmit the image light with a transmittance lower than the transmittance in the light transmissive state. The controller changes the subpixels to display the first image and the second image based on positions of the areas in the light transmissive state.
A movable object according to a tenth aspect of the present disclosure includes a three-dimensional display device. The three-dimensional display device includes a display panel, a shutter panel, and a controller. The display panel includes subpixels that display a parallax image. The parallax image includes a first image and a second image. The first image and the second image have parallax between them. The shutter panel is configured to define a traveling direction of image light representing the parallax image from the display panel. The controller changes, in a certain time cycle, areas on the shutter panel in a light transmissive state to transmit the image light with at least a certain transmittance and areas on the shutter panel in a light attenuating state to transmit the image light with a transmittance lower than the transmittance in the light transmissive state. The controller changes the subpixels to display the first image and the second image based on positions of the areas in the light transmissive state.
The structures according to the above aspects of the disclosure have less degradation caused by sunlight irradiation.
Such 3D display devices are desired to have less degradation caused by sunlight irradiation.
One or more aspects of the present disclosure are directed to a three-dimensional display device, a three-dimensional display system, a head-up display system, and a movable object that have less degradation caused by sunlight irradiation.
Embodiments of the present disclosure will now be described with reference to the drawings.
A three-dimensional (3D) display system 100 according to one embodiment of the present disclosure includes a position detection device 1, an information detection device 2, and a 3D display device 3.
The position detection device 1 is configured to detect the positions of the user's left eye and right eye and to output the detected positions to the 3D display device 3. The position detection device 1 may include, for example, a camera, and may detect the positions of the left and right eyes from an image of the user's face captured by the camera.
The position detection device 1 may include no camera, and may be connected to an external camera. The position detection device 1 may include an input terminal for receiving signals from the external camera. The external camera may be connected to the input terminal directly. The external camera may be connected to the input terminal indirectly through a shared network. The position detection device 1, which includes no camera, may include the input terminal for receiving image signals from the camera. The position detection device 1, which includes no camera, may use the image signals received through the input terminal to detect the positions of the left and right eyes.
The position detection device 1 may include a sensor. The sensor may be an ultrasonic sensor or an optical sensor. The position detection device 1 may be configured to detect the position of the user's head with the sensor, and the positions of the left and right eyes based on the position of the head. The position detection device 1 may use one sensor or two or more sensors to detect the positions of the left and right eyes as coordinates in a 3D space.
The 3D display system 100 may not include the position detection device 1. When the 3D display system 100 does not include the position detection device 1, the 3D display device 3 may include an input terminal for receiving signals from an external detection device. The external detection device may be connected to the input terminal. The external detection device may use electrical signals or optical signals as transmission signals transmitted to the input terminal. The external detection device may be connected to the input terminal indirectly through a shared network. The 3D display device 3 may receive positional coordinates indicating the positions of the left and right eyes input from the external detection device.
The information detection device 2 is configured to detect first information. The first information is associated with performance degradation or failure factors for a display panel 6. The first information includes, for example, the temperature, the movement direction, and at least either the latitude or longitude of the display panel 6, as well as the air temperature and the illuminance around the display panel 6, and the weather. The information detection device 2 includes one or more components selected from the following sensors: a device temperature sensor 21, an outside air temperature sensor 22, an illuminance sensor 23, a weather information detector 24, a position detector 25, and a direction sensor 26. The area around the display panel 6 may be an area adjacent to the 3D display device 3, or an area defined within a certain distance from the 3D display device 3. The area within the certain distance may be an area outside the display panel 6 that is likely to be under substantially the same temperature, illuminance, and weather as the display panel 6. When the display panel 6 is mounted on the movable object 10, the area around the display panel 6 may be an area inside the movable object 10.
The device temperature sensor 21 is configured to detect the temperature of the display panel 6. The device temperature sensor 21 is configured to output the detected temperature to the 3D display device 3.
The outside air temperature sensor 22 is configured to detect the air temperature around the display panel 6. The outside air temperature sensor 22 is configured to output the detected air temperature to the 3D display device 3.
The illuminance sensor 23 is configured to detect an illuminance around the display panel 6. With the 3D display device 3 mounted on the movable object 10, for example, the illuminance sensor 23 is configured to detect the illuminance around the display panel 6 reachable by sunlight that is reflected on the windshield of the movable object 10. The illuminance sensor 23 is configured to output the detected illuminance to the 3D display device 3.
The weather information detector 24 is configured to detect weather information by receiving the information from an external device through a communication network. The weather information indicates the weather at the location of the display panel 6. The weather information includes, for example, sunny weather, rainy weather, and cloudy weather. The weather information detector 24 is configured to output the detected weather information to the 3D display device 3.
The position detector 25 is configured to detect the position of the information detection device 2 including the position detector 25. The information detection device 2 may be immovable relative to the 3D display device 3. Thus, the position detector 25 may detect the position of the 3D display device 3 by detecting the position of the information detection device 2. The position detector 25 may include a receiver for a global navigation satellite system (GNSS). The position detector 25 is configured to detect its position based on signals transmitted from satellites and received by the GNSS receiver. The GNSS includes satellite navigation systems such as the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), Galileo, and the Quasi-Zenith Satellite System. The position detector 25 is configured to detect the position of the 3D display device 3 and to output the detected position to the 3D display device 3.
The direction sensor 26 is configured to detect the direction in which the display panel 6 moves. The direction sensor 26 is configured to output the detected direction to the 3D display device 3.
The 3D display device 3 includes an obtaining unit 4, an illuminator 5, the display panel 6, a shutter panel 7 as an optical element, and a controller 8.
The obtaining unit 4 may be configured to obtain the positions of the user's eyes detected by the position detection device 1. The obtaining unit 4 may be configured to obtain the first information detected by the information detection device 2. When the 3D display device 3 is mounted on the movable object 10, the obtaining unit 4 may receive information from an engine control unit or an electronic control unit (ECU) of the movable object 10. The information, which is received by the obtaining unit 4 from the ECU, may include lighting information indicating whether the headlights of the movable object 10 are turned on. The headlights may be configured to turn on and off based on the illuminance detected by the illuminance sensor included in the movable object 10. More specifically, the headlights may be configured to turn on when the illuminance is below a threshold, and turn off when the illuminance is equal to or greater than the threshold. Thus, the illuminance of sunlight near the display panel 6 mounted on the movable object 10 is lower when the headlights are turned on than when they are turned off. The obtaining unit 4 can obtain the range of illuminance levels near the display panel 6 based on the lighting information for the headlights.
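Purely as an illustration of the relationship described above between the lighting information and the illuminance near the display panel 6, the following minimal Python sketch infers a range of illuminance levels from whether the headlights are on; the function name and the threshold value are assumptions made for this sketch and are not values given in the embodiment.

```python
# Minimal sketch (assumption): infer a range of illuminance levels near the
# display panel 6 from the headlight lighting information, given the
# threshold used by the movable object 10 to switch its headlights.
# The threshold value is an assumed example.

def illuminance_range_from_headlights(headlights_on, threshold_lx=1000.0):
    """Return (min_lx, max_lx); None stands for an unbounded upper end."""
    if headlights_on:
        return (0.0, threshold_lx)   # headlights on: illuminance below the threshold
    return (threshold_lx, None)      # headlights off: illuminance at or above the threshold


if __name__ == "__main__":
    print(illuminance_range_from_headlights(True))   # -> (0.0, 1000.0)
    print(illuminance_range_from_headlights(False))  # -> (1000.0, None)
```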
The illuminator 5 can illuminate a surface of the display panel 6. The illuminator 5 may include a light source, a light guide plate, a diffuser plate, and a diffusion sheet. The illuminator 5 is configured to emit illumination light using the light source, and to spread the illumination light uniformly over the surface of the display panel 6 using its components including the light guide plate, the diffuser plate, and the diffusion sheet. The illuminator 5 can emit the uniform light toward the display panel 6.
The display panel 6 may be a transmissive liquid crystal display panel. The display panel 6 is not limited to a transmissive liquid crystal display panel but may be another display panel such as an organic electroluminescent (EL) display. When the display panel 6 is a self-luminous display panel, the 3D display device 3 may not include the illuminator 5. The display panel 6 that is a liquid crystal panel will now be described. The display panel 6 includes an active area 61 that is divided into multiple divisional areas.
Each divisional area corresponds to a subpixel. Thus, the active area 61 includes multiple subpixels arranged in a lattice in the horizontal and vertical directions.
Each subpixel corresponds to one of red (R), green (G), and blue (B). A set of three subpixels colored R, G, and B forms a pixel. A pixel may be referred to as a picture element. For example, multiple subpixels forming individual pixels are arranged in the horizontal direction. For example, subpixels having the same color are arranged in the vertical direction.
As described above, multiple subpixels arranged in the active area 61 form subpixel groups Pg. The subpixel groups Pg may include multiple subpixels from multiple pixels. The subpixel groups Pg are repeatedly arranged in the horizontal direction. The subpixel groups Pg are also repeatedly arranged in the vertical direction. The subpixel groups Pg each include subpixels in certain rows and columns. More specifically, the subpixel groups Pg each include (2×n×b) subpixels P1 to P(2×n×b), which are continuously arranged in b row(s) in the vertical direction and in 2×n columns in the horizontal direction.
Each subpixel group Pg is the smallest unit to be controlled by the controller 8 (described later) to display an image. The subpixels P1 to P(2×n×b) included in each subpixel group Pg with the same identification information are controlled by the controller 8 at the same time. For example, the controller 8 can cause the subpixels P1 in all the subpixel groups Pg displaying the left eye image to display the right eye image at the same time when switching the image to be displayed by the subpixels P1 from the left eye image to the right eye image.
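Purely as an illustration of the arrangement and simultaneous control described above, the following minimal Python sketch derives the identification index P1 to P(2×n×b) of a subpixel from its row and column in the active area 61; the function name, the zero-based coordinates, and the example values of n and b are assumptions made for this sketch.

```python
# Minimal sketch (assumption): derive the identification index P1..P(2*n*b)
# of a subpixel from its row and column in the active area, for subpixel
# groups that repeat every (2*n) columns and every b rows.

def subpixel_index(row, col, n, b):
    """Return the 1-based index k such that the subpixel at (row, col)
    is subpixel Pk of its subpixel group Pg."""
    col_in_group = col % (2 * n)      # horizontal position inside the group
    row_in_group = row % b            # vertical position inside the group
    return row_in_group * (2 * n) + col_in_group + 1


if __name__ == "__main__":
    n, b = 6, 1                       # example parameters (assumed values)
    # Subpixels sharing the same index are controlled at the same time,
    # e.g. all P1 subpixels switch from the left eye image to the right
    # eye image together.
    print(subpixel_index(0, 0, n, b))   # -> 1 (P1)
    print(subpixel_index(0, 13, n, b))  # -> 2 (P2, since 13 % 12 = 1)
```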
The shutter panel 7 includes a liquid crystal shutter. The shutter panel 7 has multiple light transmissive areas 71 and multiple light attenuating areas 72. The light transmissive areas 71 transmit the image light with at least a certain transmittance, and the light attenuating areas 72 transmit the image light with a transmittance lower than the transmittance in the light transmissive areas 71.
The light transmissive areas 71 and the light attenuating areas 72 extend in the vertical direction. The light transmissive areas 71 and the light attenuating areas 72 are arranged alternately and repeatedly in the horizontal direction. The shutter panel 7 thus defines the traveling direction of the image light emitted from the subpixels, and defines areas on the active area 61 that are viewable by each of the user's eyes.
The shutter pitch Bp, which is the arrangement interval of the light transmissive areas 71 in the horizontal direction, and the gap g between the shutter panel 7 and the active area 61 are defined to satisfy formula (1) and formula (2) below using an optimum viewing distance d and an interocular distance E.
E:d=(n×Hp):g formula (1)
d:Bp=(d+g):(2×n×Hp) formula (2)
The optimum viewing distance d refers to a distance between the user's right eye or left eye and the shutter panel 7. The direction of a straight line passing through the right eye and the left eye (interocular direction) corresponds to the horizontal direction. The interocular distance E is a distance between the user's right eye and left eye. The interocular distance E may be, for example, a distance of 61.1 to 64.4 mm, which has been calculated through studies performed by the National Institute of Advanced Industrial Science and Technology. Hp is the horizontal length of a subpixel.
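Purely as an illustration, formula (1) and formula (2) can be solved for the gap g and the shutter pitch Bp as in the minimal Python sketch below; the function name and the numerical values of E, d, n, and Hp are assumptions made for this example and are not values given in the embodiment.

```python
# Minimal sketch (assumption): solve formula (1) and formula (2) for the
# gap g and the shutter pitch Bp, given the interocular distance E, the
# optimum viewing distance d, the number n, and the subpixel width Hp.
#
#   E : d  = (n * Hp) : g            ... formula (1)  ->  g  = d * n * Hp / E
#   d : Bp = (d + g) : (2 * n * Hp)  ... formula (2)  ->  Bp = 2 * n * Hp * d / (d + g)

def shutter_geometry(E, d, n, Hp):
    g = d * n * Hp / E
    Bp = 2 * n * Hp * d / (d + g)
    return g, Bp


if __name__ == "__main__":
    # Illustrative values only (assumptions): E = 62.4 mm, d = 700 mm,
    # n = 6, Hp = 0.1 mm.
    g, Bp = shutter_geometry(E=62.4, d=700.0, n=6, Hp=0.1)
    print(f"gap g = {g:.3f} mm, shutter pitch Bp = {Bp:.4f} mm")
```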
In this structure, the shutter panel 7 is configured to transmit image light from selected subpixels in the active area 61 through the light transmissive areas 71 to reach the user's left eye. The shutter panel 7 is configured to transmit image light from other subpixels through the light transmissive areas 71 to reach the user's right eye. An image viewable to the user when the image light reaches each of the user's left and right eyes will now be described in detail.
As described above, the left viewable areas 61aL are areas on the active area 61 that are viewable by the user's left eye through the light transmissive areas 71 of the shutter panel 7. The subpixels included in the left viewable areas 61aL display the left eye image.
The right viewable areas 61aR are areas on the active area 61 that are viewable by the user's right eye through the light transmissive areas 71 of the shutter panel 7. The subpixels included in the right viewable areas 61aR display the right eye image.
Thus, the left eye views the left eye image, and the right eye views the right eye image. As described above, the right eye image and the left eye image have parallax between them. Thus, the user views a 3D image.
The controller 8 may be connected to each of the components of the 3D display system 100, and be configured to control these components. The components controlled by the controller 8 include the position detection device 1 and the display panel 6. The controller 8 may be, for example, a processor. The controller 8 may include one or more processors. The processors may include a general-purpose processor that reads a specific program to perform a specific function, or a processor dedicated to specific processing. The dedicated processor may include an application specific integrated circuit (ASIC). The processor may include a programmable logic device (PLD). The PLD may include a field-programmable gate array (FPGA). The controller 8 may either be a system on a chip (SoC) or a system in a package (SiP) in which one or more processors cooperate with other components. The controller 8 may include a storage unit, and store various items of information or programs to operate each component of the 3D display system 100. The storage unit may be, for example, a semiconductor memory. The storage unit may serve as a work memory for the controller 8.
The controller 8 is configured to control the shutter panel 7. More specifically, the controller 8 is configured to control the shutter panel 7 to arrange the light transmissive areas 71 and the light attenuating areas 72 alternately in the horizontal direction. The controller 8 may be configured to change the light transmissive areas 71 and the light attenuating areas 72 on the shutter panel 7 in every certain time cycle. The certain time cycle is, for example, a period for which continuous irradiation with sunlight is to be avoided in a specific part of the display panel 6. The cycle is determined through, for example, an experiment. The certain time cycle may be 10 seconds, 30 seconds, or a minute.
During a first period, the controller 8 is configured to control the first areas 701 on the shutter panel 7 into the light transmissive state, and is configured to control the parallax image to be displayed on the display panel 6 accordingly.
During a second period, the controller 8 is configured to control the second areas 702 on the shutter panel 7 into the light transmissive state, and is configured to control the parallax image to be displayed on the display panel 6 accordingly.
During a third period, the controller 8 is configured to control the third areas 703 on the shutter panel 7 into the light transmissive state, and is configured to control the parallax image to be displayed on the display panel 6 accordingly.
Similarly, the controller 8 may be configured to control k-th areas on the shutter panel 7 into the light transmissive state during a k-th (k=1 to (2×n)) period. The controller 8 is configured to control the k-th areas on the shutter panel 7 into the light transmissive state, and is configured to control the parallax image to be displayed on the display panel 6 based on the positions of the user's eyes detected by the position detection device 1. More specifically, the controller 8 is configured to cause selected subpixels to display the left eye image emitting image light reaching the position of the user's left eye through the k-th areas in the light transmissive state. For example, the controller 8 is configured to cause at least a certain proportion of subpixels to display the left eye image emitting image light reaching the position of the left eye.
Although the controller 8 is configured to control the first areas 701, the second areas 702, and the third areas 703 to enter the light transmissive state in the stated order, the areas may enter the light transmissive state in another order. For example, the controller 8 may be configured to control the first areas 701 to the (2×n)-th areas into the light transmissive state in any order.
In other words, the controller 8 is configured to change at least selected areas from the light transmissive state to the light attenuating state, and is configured to change selected areas in the light attenuating state with the same total size as the areas changed from the light transmissive state to the light attenuating state to the light transmissive state in every certain time cycle. The controller 8 may be configured to change all areas in the light transmissive state to the light attenuating state, and to change selected areas in the light attenuating state with the same total size as the areas changed from the light transmissive state to the light attenuating state to the light transmissive state.
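Purely as an illustration of this cyclic change, the minimal Python sketch below assumes that one shutter pitch is divided into 2×n shutter regions of which n consecutive regions are in the light transmissive state at a time, and that the transmissive window is shifted by one region in every certain time cycle; the region count, the one-region shift, and the function names are assumptions made for this sketch.

```python
# Minimal sketch (assumption): within one shutter pitch divided into 2*n
# shutter regions, keep n consecutive regions in the light transmissive
# state and shift the transmissive window by one region in every certain
# time cycle (the k-th areas are transmissive during the k-th period).

def transmissive_regions(k, n):
    """Indices (0 .. 2*n-1) of the regions in the light transmissive state
    during the k-th period."""
    return {(k + i) % (2 * n) for i in range(n)}


def changed_regions(k, n):
    """Regions changed when moving from period k to period k+1."""
    current, nxt = transmissive_regions(k, n), transmissive_regions(k + 1, n)
    to_attenuating = current - nxt     # leave the light transmissive state
    to_transmissive = nxt - current    # newly enter the light transmissive state
    return to_attenuating, to_transmissive  # always the same total size


if __name__ == "__main__":
    n = 6
    for k in range(3):
        print(k, sorted(transmissive_regions(k, n)), changed_regions(k, n))
    # The display panel side then reassigns the left eye image and the right
    # eye image to the subpixels visible through the new transmissive regions.
```

Under these assumptions, moving from one period to the next changes exactly one region from the light transmissive state to the light attenuating state and one region in the opposite direction, so the areas changed in each direction have the same total size.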
The controller 8 may cause selected subpixels on the display panel 6 emitting image light through the light transmissive areas 71 to display a black image while causing selected areas on the shutter panel 7 to switch between the light attenuating state and the light transmissive state. The black image is, for example, an image with a certain luminance level. The certain luminance level may be a luminance level with the lowest gradation displayable by the subpixels, or a value corresponding to the luminance level with the gradation equivalent to the lowest gradation. This reduces flickering perceivable to the user's eyes when the position of the image viewable by the user's eyes is changed.
The controller 8 may be configured to change the light transmissive areas 71 and the light attenuating areas 72 when raising the luminance level of an image to be displayed by the display panel 6 to higher than a certain level. In this case, the user perceives a significant change in the luminance of the image itself, and the change in the image position caused by the switching is thus less likely to be perceived as flickering.
The controller 8 may be configured to control the shutter panel 7 to cause a difference in the duration for which each area on the display panel 6 is in the light transmissive state (i.e., displays an image transmitted through areas of the shutter panel 7 in the light transmissive state) to fall within a certain range. The narrower the certain range is set, the smaller the difference in the duration for which each area receives sunlight, and the lower the likelihood that some areas are damaged by receiving more sunlight than others.
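Purely as an illustration of keeping this difference within a certain range, the minimal Python sketch below tracks the accumulated transmissive duration of each shutter region and selects, for the next period, the window phase whose regions have the smallest accumulated duration; the phase-based selection, the cycle length, and the function names are assumptions made for this sketch.

```python
# Minimal sketch (assumption): among the 2*n possible phases of the
# transmissive window (n consecutive regions out of 2*n), select for the
# next period the phase whose regions have the smallest accumulated
# transmissive duration, so that the per-region durations stay close to
# one another.

def phase_regions(phase, n):
    return [(phase + i) % (2 * n) for i in range(n)]


def choose_next_phase(durations, n):
    """durations[i] is the accumulated transmissive time of region i."""
    return min(range(2 * n),
               key=lambda p: sum(durations[i] for i in phase_regions(p, n)))


def run(n, cycle_s, periods):
    durations = [0.0] * (2 * n)
    for _ in range(periods):
        phase = choose_next_phase(durations, n)
        for i in phase_regions(phase, n):
            durations[i] += cycle_s
    return durations


if __name__ == "__main__":
    d = run(n=6, cycle_s=30.0, periods=8)   # 30-second cycle is an assumed value
    print(d, max(d) - min(d))               # the spread stays bounded
```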
The controller 8 may be configured to determine the certain time cycle based on at least one of the first information and date and time information, and to change the light transmissive areas 71 and the light attenuating areas 72 on the shutter panel 7 in every certain time cycle. The controller 8 may obtain the first information from the information detection device 2. The controller 8 mounted on the movable object 10 may obtain the first information based on information transmitted from the ECU included in the movable object 10. The date and time information indicates the current date and time. The controller 8 may obtain the date and time information from an external device through a communication network. The controller 8 may use the date and time information managed by the 3D display device 3.
For example, the controller 8 may determine the certain time cycle based on the device temperature detected by the device temperature sensor 21 included in the information detection device 2. More specifically, the controller 8 may be configured to shorten the certain time cycle as the device temperature received by the obtaining unit 4 from the device temperature sensor 21 is higher. That is, e.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle as/with the device temperature received by the obtaining unit 4 from the device temperature sensor 21 increases/increasing. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a certain relationship of (equation etc.) device temperature and certain time cycle, e.g. in a manner proportional to an increase of the device temperature received by the obtaining unit 4 from the device temperature sensor 21. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a certain table assigning certain time cycles to different device temperatures (ranges).
For example, the controller 8 may determine the certain time cycle based on the outside air temperature detected by the outside air temperature sensor 22 included in the information detection device 2. More specifically, the controller 8 may be configured to shorten the certain time cycle as the outside air temperature received by the obtaining unit 4 is higher. That is, e.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle as/with the outside air temperature received by the obtaining unit 4 increases/increasing. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a certain relationship of (equation etc.) outside air temperature and certain time cycle, e.g. in a manner proportional to an increase of the outside air temperature received by the obtaining unit 4. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a certain table assigning certain time cycles to different outside air temperatures (ranges).
For example, the controller 8 may determine the certain time cycle based on the illuminance detected by the illuminance sensor 23 included in the information detection device 2. More specifically, the controller 8 may be configured to shorten the certain time cycle as the illuminance received by the obtaining unit 4 from the illuminance sensor 23 is higher. That is, e.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle as/with the illuminance received by the obtaining unit 4 from the illuminance sensor 23 increases/increasing. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a certain relationship of (equation etc.) illuminance and certain time cycle, e.g. in a manner proportional to an increase of the illuminance received by the obtaining unit 4 from the illuminance sensor 23. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a certain table assigning certain time cycles to illuminances. The controller 8 may determine the certain time cycle based on the range of illuminance levels based on the lighting information for the headlights obtained by the obtaining unit 4 from the ECU.
For example, the controller 8 may determine the certain time cycle based on the weather detected by the weather information detector 24 included in the information detection device 2. More specifically, the controller 8 may be configured to make the certain time cycle shorter when the weather received by the obtaining unit 4 from the weather information detector 24 is sunny than when the received weather is cloudy or rainy. That is, e.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle for a case in which the weather received by the obtaining unit 4 from the weather information detector 24 is sunny. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten/adjust the certain time cycle using a table assigning certain time cycles to different kinds of weather (as respectively received/receivable by the obtaining unit 4 from the weather information detector 24).
For example, the controller 8 may determine the certain time cycle based on the position detected by the position detector 25 included in the information detection device 2. More specifically, the controller 8 may be configured to shorten the certain time cycle as the latitude indicating the position of the 3D display device 3 including the display panel 6 received by the obtaining unit 4 from the position detector 25 is lower. That is, e.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle as/with the latitude indicating the position of the 3D display device 3 including the display panel 6 received by the obtaining unit 4 from the position detector 25 decreases/decreasing. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a relationship of the latitude indicating the position of the 3D display device 3 including the display panel 6 received by the obtaining unit 4 from the position detector 25 and the certain time cycle. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a table assigning certain time cycles to different latitudes or ranges thereof.
For example, the controller 8 may determine the certain time cycle based on the position detected by the position detector 25 included in the information detection device 2 and the current date. More specifically, when a longitude indicating a position is received by the obtaining unit 4 from the position detector 25, the controller 8 may determine the certain time cycle based on the longitude and the season corresponding to the current date. For example, the controller 8 may be configured to make the certain time cycle shorter when the season associated with the position of the 3D display device 3 including the display panel 6 is determined to be summer based on the longitude and the date than when the season is other than summer. That is, e.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle when the season associated with the position of the 3D display device 3 including the display panel 6 is determined to be summer based on the longitude and the date.
For example, the controller 8 may determine the certain time cycle based on the movement direction of the 3D display device 3 including the display panel 6 detected by the direction sensor 26 included in the information detection device 2. More specifically, the controller 8 may be configured to make the certain time cycle shorter as the movement direction of the 3D display device 3 received by the obtaining unit 4 from the direction sensor 26 includes a greater component in the south direction. That is, e.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle when the movement direction of the 3D display device 3 received by the obtaining unit 4 from the direction sensor 26 includes an increasing component in the south direction. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a certain relationship of a component in the south direction of the movement direction (or a variation thereof) and the certain time cycle. E.g., in an exemplary embodiment, the controller 8 may be configured to shorten the certain time cycle using a table assigning certain time cycles to different components in the south direction of the movement direction of the 3D display device 3. The controller 8 may determine the certain time cycle based on the movement direction and time. For example, the controller 8 may make the certain time cycle longer when the movement direction is a southern direction and the time is nighttime than when the time is daytime.
The controller 8 is not limited to the structure described above, and may determine the certain time cycle using at least one item of information obtained from the information detection device 2 or the ECU.
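Purely as an illustration of how several items of the first information and the date and time information may be combined into one certain time cycle, the minimal Python sketch below scales a base cycle by factors derived from the device temperature, the illuminance, the weather, and the latitude; the base cycle, the thresholds, and the scaling factors are all assumed values, not values given in the embodiment.

```python
# Minimal sketch (assumption): derive the certain time cycle from a base
# cycle and items of the first information / date and time information.
# All thresholds and scaling factors below are assumed for illustration only.

def certain_time_cycle(base_s,
                       device_temp_c=None,
                       illuminance_lx=None,
                       weather=None,
                       latitude_deg=None):
    cycle = base_s
    if device_temp_c is not None and device_temp_c > 40.0:
        cycle *= 0.5                     # hotter display panel -> shorter cycle
    if illuminance_lx is not None and illuminance_lx > 50000.0:
        cycle *= 0.5                     # strong sunlight -> shorter cycle
    if weather == "sunny":
        cycle *= 0.7                     # sunny -> shorter than cloudy or rainy
    if latitude_deg is not None and abs(latitude_deg) < 30.0:
        cycle *= 0.8                     # lower latitude -> shorter cycle
    return cycle


if __name__ == "__main__":
    print(certain_time_cycle(60.0, device_temp_c=45.0, illuminance_lx=80000.0,
                             weather="sunny", latitude_deg=25.0))  # -> 8.4 s
```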
As described above, the controller 8 according to the present embodiment is configured to change the light transmissive areas 71 and the light attenuating areas 72 in every certain time cycle, and is configured to cause each subpixel to display either the left eye image or the right eye image depending on the positions of the user's eyes and the positions of the light transmissive areas 71. The areas irradiated with sunlight transmitted through the light transmissive areas 71 can thus be changed in the active area 61 of the display panel 6. This avoids continuous irradiation of a specific part of the active area 61 with sunlight. This avoids early damage in a specific part of the active area 61, and reduces the frequency of repair or replacement of the display panel 6 due to such early damage in a specific part.
The controller 8 according to one embodiment of the present disclosure is configured to change at least selected ones of the areas in the light transmissive state to the light attenuating state, and is configured to change selected areas in the light attenuating state with the same total size as the areas changed from the light transmissive state to the light attenuating state to the light transmissive state. This structure changes the position of the image perceived by each eye of the user less noticeably than when all the light transmissive areas 71 are changed to the light attenuating areas 72 and all the light attenuating areas 72 are changed to the light transmissive areas 71 on the shutter panel 7. This reduces flickering in the image perceivable to the user's eyes.
The controller 8 according to one embodiment of the present disclosure is configured to change all areas in the light transmissive state to the light attenuating state, and is configured to change selected areas in the light attenuating state with the same total size as the areas changed from the light transmissive state to the light attenuating state to the light transmissive state. This structure allows fewer areas in the display panel 6 to be irradiated with sunlight continuously than a structure in which only some of the areas in the light transmissive state are changed to the light attenuating state and some of the areas in the light attenuating state are changed to the light transmissive state. This avoids early damage in a specific part of the active area 61, and reduces the frequency of repair or replacement of the display panel 6 due to such early damage in a specific part.
The controller 8 according to the present embodiment is configured to control the shutter panel 7 to cause a difference in the duration for which each area on the display panel 6 is in the light transmissive state to fall within a certain range. This prevents a specific part of the active area 61 of the display panel 6 from being irradiated with sunlight for a long time. This avoids early damage in a specific part of the active area 61, and reduces the frequency of repair or replacement of the display panel 6 due to such early damage in a specific part.
The controller 8 according to the present embodiment is configured to change the certain time cycle based on at least either the first information or the date and time information. With this structure, the areas to be irradiated with sunlight can be changed more rapidly when the intensity of sunlight is high than when the intensity of sunlight is low. This structure avoids early damage of, for example, a specific part of the active area 61 irradiated with sunlight during a period with high sunlight intensity. This reduces the frequency of repair or replacement of the display panel 6 due to such early damage in a specific part.
Although the embodiment above is described as a typical example, it will be apparent to those skilled in the art that various modifications and substitutions can be made to the embodiment without departing from the spirit and scope of the present invention. Thus, the above embodiment should not be construed to be restrictive, but may be variously modified within the spirit and scope of the claimed invention. For example, multiple structural blocks described in the above embodiment or examples may be combined into a structural block, or each structural block may be divided.
Although the subpixel groups Pg in the display panel 6 according to the embodiment described above are repeatedly arranged in the horizontal direction and the vertical direction, the disclosure is not limited to this embodiment.
Although the controller 8 according to the above embodiment is configured to change subpixels for displaying the right eye image or the left eye image based on the positions of the user's eyes and the positions of areas in the light transmissive state, the disclosure is not limited to this embodiment. For example, the controller 8 may use the fixed positions of the user's eyes. In this structure, the controller 8 is configured to change subpixels for displaying the right eye image or the left eye image based on the positions of areas in the light transmissive state. In this case, the 3D display system 100 may not include the position detection device 1. The obtaining unit 4 may not obtain the positions of the eyes.
Number | Name | Date | Kind |
---|---|---|---|
6970290 | Mashitani et al. | Nov 2005 | B1 |
20060170764 | Hentschke | Aug 2006 | A1 |
20110157171 | Lin | Jun 2011 | A1 |
20120127286 | Sato et al. | May 2012 | A1 |
20120242723 | Miyake | Sep 2012 | A1 |
20120293500 | Nakahata | Nov 2012 | A1 |
20130201091 | Hung | Aug 2013 | A1 |
20130286168 | Park et al. | Oct 2013 | A1 |
20160325683 | Hayashi | Nov 2016 | A1 |
20180242413 | Hue | Aug 2018 | A1 |
Number | Date | Country |
---|---|---|
102480628 | May 2012 | CN |
2456211 | May 2012 | EP |
H09-138370 | May 1997 | JP |
2001-166259 | Jun 2001 | JP |
2006-520921 | Sep 2006 | JP |
2014-509465 | Apr 2014 | JP |
2018-120189 | Aug 2018 | JP |