This U.S. non-provisional patent application claims priority to and the benefit of Korean Patent Application No. 10-2015-0108263, filed on Jul. 30, 2015, in the Korean Intellectual Property Office, the content of which is hereby incorporated herein by reference in its entirety.
1. Field
One or more aspects of example embodiments of the present invention relate to a display apparatus.
2. Description of the Related Art
In recent years, various display devices, such as liquid crystal displays, organic light emitting displays, electrowetting display devices, plasma display panels, electrophoretic display devices, etc., have been developed. The display devices may be applied to various electronic equipment, such as smart phones, digital cameras, notebook computers, navigation devices, etc.
In general, a display device displays colors using three primary colors: red, green, and blue. The red, green, and blue colors correspond to the spectral sensitivity curves of the three types of cone cells of the human eye, respectively.
The above information disclosed in this Background section is for enhancement of understanding of the background of the invention, and therefore, it may contain information that does not constitute prior art.
One or more aspects of example embodiments of the present invention are directed toward a display apparatus that considers visual characteristics of a user.
According to an embodiment of the present invention, a display apparatus includes: a display panel configured to display a first image having a first image spectrum corresponding to a first animal spectral sensitivity curve of a first cone cell of an animal, and to display a second image having a second image spectrum corresponding to a second animal spectral sensitivity curve of a second cone cell of the animal, in response to output image data, and at least one of the first and second animal spectral sensitivity curves is different from first, second, and third human spectral sensitivity curves of cone cells of a human that perceive red, green, and blue colors, respectively.
The display apparatus may further include a controller configured to output the output image data, and the output image data may include: first primary color image data including information of a first animal primary color corresponding to the first animal spectral sensitivity curve; and second primary color image data including information of a second animal primary color corresponding to the second animal spectral sensitivity curve, and the display panel may be configured to display the first and second images in response to the first and second primary color image data, respectively.
At least one of the first and second animal primary colors may be different from the red, green, and blue colors corresponding to visual characteristics of the human.
At least one of the first and second images may include a non-visible component that is not perceived by the human.
The non-visible component may include an ultraviolet ray and/or a near-ultraviolet ray.
The controller may be configured to generate information corresponding to the non-visible component from input image data.
The controller may be configured to receive input image data, and to convert the input image data to the first primary color image data and the second primary color image data based on the first and second animal primary colors, and the input image data may include information corresponding to the red, green, and blue colors.
At least one of the first and second animal spectral sensitivity curves may include a peak wavelength that is different from peak wavelengths of the first, second, and third human spectral sensitivity curves, and at least one of the first and second animal spectral sensitivity curves may include a full width half maximum that is different from full width half maximums of the first, second, and third human spectral sensitivity curves.
The peak wavelength of the second animal spectral sensitivity curve may be shorter than the peak wavelength of the third human spectral sensitivity curve, the full width half maximum of the second animal spectral sensitivity curve may be wider than the full width half maximum of the third human spectral sensitivity curve, the peak wavelength of the second animal spectral sensitivity curve may be shorter than the peak wavelength of the first animal spectral sensitivity curve, and the peak wavelength of the third human spectral sensitivity curve may be shorter than the peak wavelengths of the first and second human spectral sensitivity curves.
The display panel may include: a first sub-pixel configured to display the first image; and a second sub-pixel configured to display the second image.
The first sub-pixel may include a first color filter having a first transmittance corresponding to the first animal spectral sensitivity curve, and the second sub-pixel may include a second color filter having a second transmittance corresponding to the second animal spectral sensitivity curve.
Center wavelengths of the first and second transmittances may be the same as center wavelengths of the first and second animal spectral sensitivity curves, respectively.
The display apparatus may further include a backlight including: a first light source configured to emit a first light having a first animal primary color corresponding to the first animal spectral sensitivity curve during a first field of a frame period; and a second light source configured to emit a second light having a second animal primary color corresponding to the second animal spectral sensitivity curve during a second field of the frame period, and the display panel may include a liquid crystal layer, the display panel being configured to display the first image during the first field, and to display the second image during the second field.
The first light source may be configured to be turned on during a plurality of first on periods of the first field, and may be configured to be turned off during a first off period between the first on periods, and the second light source may be configured to be turned on during a plurality of second on periods of the second field, and may be configured to be turned off during a second off period between the second on periods.
The frame period may include a first frame period and a second frame period, which may be sequentially arranged, the first frame period may include the first field, the second field, and the first field sequentially arranged, and the second frame period may include the second field, the first field, and the second field sequentially arranged.
The first light may have a brightness lower than a brightness of the second light during the first frame period, and the brightness of the first light may be higher than the brightness of the second light during the second frame period.
The second animal spectral sensitivity curve may include a short wavelength component and a long wavelength component, the second image may include a third image having a third image spectrum corresponding to the short wavelength component, and a fourth image having a fourth image spectrum corresponding to the long wavelength component, and the display panel may be configured to divide the second image into the third and fourth images, and to display the third and fourth images.
The display panel may include a first sub-pixel configured to display the first image, a third sub-pixel configured to display the third image, and a fourth sub-pixel configured to display the fourth image.
The first sub-pixel may include a first color filter configured to transmit the first image, the third sub-pixel may include a third color filter configured to transmit the third image, and the fourth sub-pixel may include a fourth color filter configured to transmit the fourth image.
The third image may include an ultraviolet ray or a near-ultraviolet ray.
The display apparatus may further include a backlight including: a first light source configured to emit a first light having a first animal primary color during a first field of a frame period; a third light source configured to emit a third light having a color corresponding to a third animal spectral sensitivity curve during a second field of the frame period; and a fourth light source configured to emit a fourth light having a color corresponding to a fourth animal spectral sensitivity curve during a third field of the frame period, and the display panel may include a liquid crystal layer, the display panel being configured to display the first image during the first field, to display the third image during the second field, and to display the fourth image during the third field.
The third light source may be configured to emit the third light during a third on period, the fourth light source may be configured to emit the fourth light during a fourth on period, and at least a portion of the third on period may be overlapped with the fourth on period.
A width of the third on period may be the same as a width of the fourth on period, and the third and fourth on periods may be provided concurrently.
According to an embodiment of the present invention, a display apparatus includes a display panel configured to display an image including a non-visible component in response to output image data including information corresponding to the non-visible component that is not perceived by a human.
The non-visible component may include an ultraviolet ray and/or a near-ultraviolet ray.
The display apparatus may further include a backlight source configured to emit a light including the non-visible component, and the display panel may be configured to receive the light and to display the image using the light.
The display panel may include a color filter configured to transmit the non-visible component.
The display apparatus may further include a controller configured to receive input image data that does not include the non-visible component, and to convert the input image data to the output image data.
According to an embodiment of the present invention, a display apparatus includes: a controller configured to map a first color gamut of input image data to a second color gamut to convert the input image data to output image data; and a display panel configured to display a first image having a first animal primary color, and to display a second image having a second animal primary color in response to the output image data, the first color gamut including red, green, and blue colors, the second color gamut including the first and second animal primary colors, and the first animal primary color includes a first spectrum corresponding to a first animal spectral sensitivity curve of a first cone cell of an animal, and the second animal primary color includes a second spectrum corresponding to a second animal spectral sensitivity curve of a second cone cell of the animal.
According to one or more example embodiments of the present invention, animals having cone cells different from those of humans may perceive the same image as an intended image (e.g., a real image) of an object through the image displayed on the display panel.
The above and other aspects and features of the present invention will become readily apparent by reference to the following detailed description when considered in conjunction with the accompanying drawings.
Hereinafter, example embodiments will be described in more detail with reference to the accompanying drawings, in which like reference numbers refer to like elements throughout. The present invention, however, may be embodied in various different forms, and should not be construed as being limited to only the illustrated embodiments herein. Rather, these embodiments are provided as examples so that this disclosure will be thorough and complete, and will fully convey the aspects and features of the present invention to those skilled in the art. Accordingly, processes, elements, and techniques that are not necessary to those having ordinary skill in the art for a complete understanding of the aspects and features of the present invention may not be described. Unless otherwise noted, like reference numerals denote like elements throughout the attached drawings and the written description, and thus, descriptions thereof may not be repeated.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and/or the present specification, and should not be interpreted in an idealized or overly formal sense, unless expressly so defined herein.
Hereinafter, one or more aspects of exemplary embodiments of the present invention will be described in more detail with reference to the accompanying drawings.
Referring to the accompanying drawings, the display apparatus 1000 includes a controller 100, a gate driver 200, a data driver 300, and a display panel 400.
The controller 100 receives a plurality of control signals CS from an external source (e.g., external to the display apparatus 1000 or external to the controller 100). The controller 100 generates a data control signal D-CS (e.g., including an output start signal, a horizontal start signal, etc.), and a gate control signal G-CS (e.g., including a vertical start signal, a vertical clock signal, a vertical clock bar signal, etc.), based on the control signals CS. The data control signal D-CS is applied to the data driver 300, and the gate control signal G-CS is applied to the gate driver 200.
The gate driver 200 sequentially outputs gate signals in response to the gate control signal G-CS provided from the controller 100. The gate signals are applied to the display panel 400.
The data driver 300 receives output image data Idata from the controller 100. The data driver 300 converts the output image data Idata to data voltages in response to the data control signal D-CS provided from the controller 100. The data voltages are applied to the display panel 400.
The display panel 400 includes a plurality of gate lines GL1 to GLn, a plurality of data lines DL1 to DLm, and a plurality of sub-pixels SPX, where n and m are natural numbers.
The gate lines GL1 to GLn extend in a first direction DR1, and are arranged along a second direction DR2. The data lines DL1 to DLm are insulated from the gate lines GL1 to GLn, and cross the gate lines GL1 to GLn. For example, the data lines DL1 to DLm extend in the second direction DR2, and are arranged along the first direction DR1. The first direction DR1 may be perpendicular or substantially perpendicular to the second direction DR2.
The sub-pixels SPX are arranged in a matrix form along the first and second directions DR1 and DR2.
The sub-pixels SPX may be grouped into pixels PX. Each pixel PX displays a unit image, and the display panel 400 has a resolution that may be determined depending on the number of the pixels PX included in the display panel 400.
As an example, two sub-pixels SPX are grouped together for one pixel PX, but the number of the sub-pixels SPX that are grouped together for one pixel PX is not limited to two. For example, three or more sub-pixels SPX may be grouped together for one pixel PX, or each sub-pixel SPX may be defined as one pixel PX.
Each of the sub-pixels SPX is connected to a corresponding data line of the data lines DL1 to DLm, and connected to a corresponding gate line of the gate lines GL1 to GLn.
The display panel 400 is not limited to a specific display panel. That is, various display panels, such as an organic light emitting display panel, a liquid crystal display panel, a plasma display panel, an electrophoretic display panel, an electrowetting display panel, etc., may be used as the display panel 400. Hereinafter, for convenience of explanation, the display panel 400 will be described as a liquid crystal display panel 400.
In general, a display apparatus displays an image using three primary colors. The three primary colors include red, green, and blue colors, and are determined depending on visual characteristics of the human eye for perceiving colors in trichromacy. The red, green, and blue colors are perceived by a first human cone cell (e.g., an L cone cell), a second human cone cell (e.g., an M cone cell), and a third human cone cell (e.g., an S cone cell), respectively.
The accompanying drawings illustrate first to third human spectral sensitivity curves SC_H1, SC_H2, and SC_H3 of the first to third human cone cells, respectively, and first and second animal spectral sensitivity curves SC_A1 and SC_A2 of first and second cone cells of an animal (e.g., a dog), respectively, where at least one of the first and second animal spectral sensitivity curves SC_A1 and SC_A2 is different from the first to third human spectral sensitivity curves SC_H1 to SC_H3.
Although the human and the dog may view the same object, the human and the dog may sense different primary colors from the object, and may process the information about the sensed primary colors differently, thereby perceiving the colors or the object differently due to the difference between their spectral sensitivity curves. Accordingly, even when the dog views an image displayed on a display panel that is designed and driven for the human, the dog does not necessarily perceive the image as being the same as the object. Therefore, the display panel according to an exemplary embodiment of the present invention may be designed and driven, by using the difference between the spectral sensitivity curves, to allow animals other than humans to perceive the displayed image as being the same as the object.
For example, an ultraviolet ray and/or a near-ultraviolet ray may not be perceived by the human (e.g., a naked human eye). In more detail, the human may not have cone cells required to sense the ultraviolet ray and/or the near-ultraviolet ray, and a crystalline lens of the human eye may not transmit the ultraviolet ray and/or the near-ultraviolet ray. In comparison, a crystalline lens of the dog may transmit the ultraviolet ray and/or the near-ultraviolet ray, and the second animal spectral sensitivity curve SC_A2 may be overlapped with areas corresponding to the ultraviolet ray and/or the near-ultraviolet ray. Accordingly, the dog may sense and perceive the ultraviolet ray and/or the near-ultraviolet ray provided from the object. Thus, the display panel according to an exemplary embodiment of the present invention may be designed and driven to display the ultraviolet ray and/or the near-ultraviolet ray.
Hereinafter, a light that is not capable of being sensed by the cone cells of the human and/or not capable of being perceived by the human will be referred to as a non-visible component. For example, the non-visible component may include the ultraviolet ray and/or the near-ultraviolet ray, but is not limited thereto or thereby. That is, the non-visible component may include a light having a wavelength longer than a red wavelength, and may include an infrared ray and/or a near-infrared ray.
The controller 100a receives input image data RGB, and outputs output image data Idata.
According to one or more exemplary embodiments of the present invention, the input image data RGB may include information for colors corresponding to the visual characteristics of the human (e.g., red, green, and blue colors). For example, the input image data RGB may include red, green, and blue data RD, GD, and BD having information for the red, green, and blue colors, respectively. The input image data RGB may not include information corresponding to the non-visible component, since the input image data RGB are generated based on the visual characteristics of the human.
The output image data Idata may include information for colors corresponding to the visual characteristics of one or more animals other than the human. For example, the output image data Idata may include first primary color image data ID1 and second primary color image data ID2. The first and second primary color image data ID1 and ID2 may include information for the first and second animal primary colors, respectively. Since the output image data Idata may be provided for the dog or an animal other than the human, the output image data Idata may include information corresponding to the non-visible component. In this case, the controller 100a generates the information corresponding to the non-visible component based on the input image data RGB, and generates the output image data Idata using the information corresponding to the non-visible component.
The controller 100a generates the output image data Idata based on the input image data RGB. In other words, the controller 100a maps a first color gamut of the input image data RGB to a second color gamut to convert the input image data RGB to the output image data Idata. Here, the first color gamut is defined by the red, green, and blue colors, and the second color gamut is defined by the first and second animal primary colors.
For example, the controller 100a may convert the input image data RGB to the output image data Idata based on the visual characteristics of the human and/or based on the visual characteristics of the dog. In more detail, the controller 100a performs the converting operation based on the first and second animal spectral sensitivity curves SC_A1 and SC_A2.
For example, the controller 100a may generate the first and second primary color image data ID1 and ID2 by using a correlation between the first and second animal spectral sensitivity curves SC_A1 and SC_A2 and the first to third human spectral sensitivity curves SC_H1 to SC_H3, or by analyzing the spectrum of the input image data RGB and using the analyzed result.
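By way of a non-limiting illustration, the gamut-mapping operation described above may be sketched as follows, assuming hypothetical Gaussian approximations of the red, green, and blue primaries and of the first and second animal spectral sensitivity curves SC_A1 and SC_A2; the wavelength range, curve parameters, and resulting conversion matrix are illustrative assumptions only, and an actual controller may instead use the correlation or spectrum-analysis approaches described above.

```python
import numpy as np

# Illustrative sketch only: the actual controller may use a correlation between the
# spectral sensitivity curves or a spectrum analysis of the input image data RGB.
wavelengths = np.arange(360.0, 701.0, 10.0)  # nm, assumed sampling of the near-UV/visible range

def gaussian(center_nm, fwhm_nm):
    sigma = fwhm_nm / 2.355  # convert full width at half maximum to standard deviation
    return np.exp(-0.5 * ((wavelengths - center_nm) / sigma) ** 2)

# Assumed emission spectra of the red, green, and blue primaries of the input image data RGB.
primaries_rgb = [gaussian(610, 60), gaussian(540, 60), gaussian(450, 40)]

# Assumed first and second animal spectral sensitivity curves (SC_A1, SC_A2); SC_A2 is
# placed at a shorter wavelength and made wider than a human S-cone curve, as described above.
sc_a1 = gaussian(555, 90)
sc_a2 = gaussian(430, 110)

# 2x3 conversion matrix: response of each animal cone type to each human primary.
M = np.array([[np.sum(sc * p) for p in primaries_rgb] for sc in (sc_a1, sc_a2)])
M /= M.sum(axis=1, keepdims=True)  # normalize so that full white maps to full drive

def to_animal_primaries(rgb):
    """Map one input pixel (R, G, B) to (ID1, ID2) animal-primary data values."""
    return M @ np.asarray(rgb, dtype=float)

print(to_animal_primaries([1.0, 1.0, 1.0]))  # white input -> [1.0, 1.0]
```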
In addition, the input image data RGB may include information for the first and second animal primary colors. In this case, since the input image data RGB already include information corresponding to the visual characteristics of the dog, the controller 100a may not perform a process of converting the input image data RGB in consideration of the visual characteristics of the dog.
The first and second sub-pixels SPX1 and SPX2 may display first and second images, respectively. The first image may represent the first animal primary color. For example, the first image includes a first image spectrum IS1 corresponding to the first animal spectral sensitivity curve SC_A1.
In addition, the second image may represent the second animal primary color. For example, the second image includes a second image spectrum IS2 corresponding to the second animal spectral sensitivity curve SC_A2.
The first sub-pixel SPX1 receives the first primary color image data ID1, and displays the first image in response to the first primary color image data ID1. The second sub-pixel SPX2 receives the second primary color image data ID2, and displays the second image in response to the second primary color image data ID2. For example, the first and second primary color image data ID1 and ID2 may be provided in the form of data voltages.
In the case where the animal (e.g., the dog) has cone cells different from those of the human, and views an image of an object displayed through a display panel designed and driven in consideration of the visual characteristics of the human, the animal perceives the image differently from the intended image (e.g., the real image) of the object.
However, according to one or more embodiments of the present invention, the pixel PX is applied with the first and second primary color image data ID1 and ID2, which include information corresponding to the first and second animal primary colors, instead of the input image data RGB generated in consideration of the visual characteristics of the human. In addition, when the first and second sub-pixels SPX1 and SPX2 are driven to display the first and second primary color image data ID1 and ID2, respectively, the animal may perceive the same image as the real object's image through the first and second images.
The display panel 400a includes a lower substrate LS, an upper substrate US, a liquid crystal layer LC, and a color filter CF, and a backlight unit 500a is provided under the display panel 400a.
The backlight unit 500a may be located at a rear side of the display panel 400a, and may provide a backlight BL to the rear side of the display panel 400a. The backlight BL includes the first and second animal primary colors. In more detail, the backlight BL includes a light having a wavelength in a range (e.g., of about 360 nm to about 640 nm) in which the first and second animal spectral sensitivity curves SC_A1 and SC_A2 are distributed.
The backlight unit 500a may include a light source for generating the backlight, and an optical sheet for controlling a distribution of the backlight BL.
The liquid crystal layer LC is between the lower substrate LS and the upper substrate US. The upper and lower substrates US and LS respectively include electrodes to form an electrical field in the liquid crystal layer LC. The electrodes correspond to the first and second sub-pixels SPX1 and SPX2. Liquid crystal molecules included in the liquid crystal layer LC are controlled by the electrical field, and thus, a transmittance of the first and second sub-pixels SPX1 and SPX2 with respect to the backlight BL may be controlled.
The color filter CF includes a first color filter CF1 and a second color filter CF2. For example, the first and second color filters CF1 and CF2 are included in the upper substrate US, and form the first and second sub-pixels SPX1 and SPX2, respectively, but the present invention is not limited thereto. The first and second sub-pixels SPX1 and SPX2 may have colors determined by the wavelength-dependent transmittances of the first and second color filters CF1 and CF2, respectively.
In more detail, a first transmittance T1 of the first color filter CF1 may correspond to the first animal spectral sensitivity curve SC_A1. For example, a center wavelength of the first transmittance T1 may be the same or substantially the same as the center wavelength of the first animal spectral sensitivity curve SC_A1.
A second transmittance T2 of the second color filter CF2 may correspond to the second animal spectral sensitivity curve SC_A2. For example, a center wavelength of the second transmittance T2 may be the same or substantially the same as the center wavelength of the second animal spectral sensitivity curve SC_A2.
As described above, the display panel 400a may display the first and second images corresponding to the first and second primary color image data ID1 and ID2 to be spatially divided by using the first and second color filters CF1 and CF2.
The backlight BL may include a first peak P1 and a second peak P2. The first peak P1 may have a center wavelength of about 380 nm to about 483 nm, and may have a full width half maximum of about 5 nm to about 50 nm. The second peak P2 may have a center wavelength of about 480 nm to about 580 nm (e.g., wavelengths of normal yellow/green/red), and may have a full width half maximum of about 5 nm to about 50 nm.
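The two-peak backlight spectrum described above can be illustrated with a simple model, assuming each peak is approximately Gaussian; the specific center wavelengths and full width half maximum values below are arbitrary examples chosen from the stated ranges.

```python
import numpy as np

# Illustrative model only: the backlight BL is approximated as the sum of two Gaussian
# peaks P1 and P2 whose center wavelengths and full widths at half maximum fall in the
# ranges given above.
wavelengths = np.arange(350, 701, 1)  # nm

def peak(center_nm, fwhm_nm, height=1.0):
    sigma = fwhm_nm / 2.355
    return height * np.exp(-0.5 * ((wavelengths - center_nm) / sigma) ** 2)

# Example values chosen from the stated ranges (assumptions, not fixed values).
p1 = peak(center_nm=440, fwhm_nm=30)   # P1: 380-483 nm, FWHM 5-50 nm
p2 = peak(center_nm=530, fwhm_nm=40)   # P2: 480-580 nm, FWHM 5-50 nm
backlight_bl = p1 + p2

# Sanity check: the modeled spectrum has exactly two local maxima.
maxima = np.where((backlight_bl[1:-1] > backlight_bl[:-2]) &
                  (backlight_bl[1:-1] > backlight_bl[2:]))[0] + 1
print(wavelengths[maxima])  # [440, 530]
```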
The display apparatus 2000 includes a display panel 400b and a backlight unit 500b.
The display panel 400b includes a transmissive pixel TPX. The transmissive pixel TPX has the same or substantially the same structure and function as those of the first and second sub-pixels SPX1 and SPX2, except that the transmissive pixel TPX does not include the color filter. Thus, repeated details thereof will be omitted.
Since the transmissive pixel TPX does not include the color filter, the color of the light is not changed while passing through the transmissive pixel TPX.
The backlight unit 500b includes a first light source LS1 for emitting a first light L1, and a second light source LS2 for emitting a second light L2. The first light L1 may include the first animal primary color. For example, the first light L1 may include a first light spectrum IL1 corresponding to the first animal spectral sensitivity curve SC_A1. The center wavelength of the first light spectrum IL1 may be the same or substantially the same as the center wavelength of the first animal spectral sensitivity curve SC_A1.
The second light L2 may include the second animal primary color. For example, the second light L2 may include a second light spectrum IL2 corresponding to the second animal spectral sensitivity curve SC_A2. The center wavelength of the second light spectrum IL2 may be the same or substantially the same as the center wavelength of the second animal spectral sensitivity curve SC_A2. In this case, the second image spectrum IS2 may include components corresponding to the non-visible components.
Hereinafter, the time division driving operation of the display apparatus will be described in more detail with reference to the accompanying drawings.
During the first field F1, the transmissive pixel TPX receives the first primary color image data ID1. Accordingly, the transmissive pixel TPX has a transmittance corresponding to the first primary color image data ID1.
In addition, the first light source LS1 emits the first light L1 during the first field F1. In more detail, the first light L1 is provided during a first on period OP1 defined in the first field F1.
As a result, the first light L1 passing through the transmissive pixel TPX has a brightness that is adjusted by the transmittance of the transmissive pixel TPX, and the transmissive pixel TPX displays the first image representing the first animal primary color using the first light L1.
The transmissive pixel TPX receives the second primary color image data ID2 during the second field F2. Therefore, the transmissive pixel TPX has a transmittance corresponding to the second primary color image data ID2.
In addition, the first light source LS1 does not emit the first light L1 during the second field F2, and the second light source LS2 emits the second light L2 during the second field F2. In more detail, the second light L2 is provided during a second on period OP2 defined in the second field F2.
As a result, the second light L2 passing through the transmissive pixel TPX has a brightness that is adjusted by the transmittance of the transmissive pixel TPX, and the transmissive pixel TPX displays the second image representing the second animal primary color using the second light L2.
As described above, the display panel 400b displays the first and second images corresponding to the first and second primary color image data ID1 and ID2 after dividing the first and second images in time using the first and second fields F1 and F2.
In addition, since the transmissive pixel TPX does not include the color filter, the transmissive pixel TPX transmits the first and second lights L1 and L2 without loss or substantially without loss of the first and second lights L1 and L2, which may occur when a color filter is used. Thus, a light efficiency of the display apparatus 2000 may be improved.
A frequency of the frame period FR may be determined depending on the visual characteristics of the animal (e.g., the dog). In general, a critical frequency of the animal may be different from a critical frequency of the human. In more detail, since the critical frequency of the animal may be higher than about 60 Hz, which is the critical frequency of the human, the animal may perceive flickering of the image from frame to frame when the display apparatus displays the image at a frequency of about 60 Hz.
Thus, according to some embodiments of the present invention, the frame period FR may have a frequency that is higher than the critical frequency of the animal. For example, the frame period FR may be provided at a frequency of about 80 Hz, and each of the first and second fields F1 and F2 may be provided at a frequency of about 160 Hz.
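The relationship between the frame frequency and the field frequency mentioned above follows from dividing the frame period into its fields; a brief worked example, assuming two fields of equal duration per frame, is shown below.

```python
# Worked example of the field timing described above, assuming two fields per frame
# of equal duration; the division of the frame period is a design choice.
frame_rate_hz = 80.0
fields_per_frame = 2

frame_period_ms = 1000.0 / frame_rate_hz               # 12.5 ms
field_period_ms = frame_period_ms / fields_per_frame   # 6.25 ms
field_rate_hz = frame_rate_hz * fields_per_frame       # 160 Hz

print(frame_period_ms, field_period_ms, field_rate_hz)
```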
The first light source LS1 is turned on during a plurality of first on periods OP1 of the first field F1, and is turned off during a first off period OFF1 between the first on periods OP1. For example, the first field F1 may include two first on periods OP1.
A rising period RP, a display period DP, and a falling period FP are defined in the first field F1 according to the transmittance of the transmissive pixel TPX. During the display period DP, the transmittance of the transmissive pixel TPX corresponds to the first primary color image data ID1 applied to the transmissive pixel TPX. During the rising period RP, the transmittance of the transmissive pixel TPX increases to the transmittance level of the display period DP. During the falling period FP, the transmittance of the transmissive pixel TPX decreases from the transmittance level of the display period DP to a transmittance level corresponding to a zero grayscale (e.g., a zero gray level).
The first on periods OP1 are provided during the display period DP of the first field F1, and are spaced in time from each other by the first off period OFF1. Accordingly, the first light L1 is provided during the display period DP (e.g., only during the display period DP), and the transmissive pixel TPX displays a grayscale (e.g., a gray level) corresponding to the first primary color image data ID1.
The first image is displayed during the first on periods OP1, and is not displayed during the first off period OFF1. That is, the first image is provided to the display panel in a time-divided manner within the first field F1, and thus, a flicker phenomenon may be reduced.
The second light source LS2 is turned on during a plurality of second on periods OP2 of the second field F2, and turned off during a second off period OFF2 of the second field F2. For example, the second field F2 may include two second on periods OP2. The second on periods OP2 are provided during the display period DP of the second field F2, and are spaced in time from each other by the second off period OFF2.
The second image is displayed during the second on periods OP2, and not displayed during the second off period OFF2. As a result, the transmissive pixel TPX displays a grayscale (e.g., a gray level) corresponding to the second primary color image data ID2, and thus, the flicker phenomenon may be reduced.
As an example, the first and second on periods OP1 and OP2 may have the same or substantially the same width as each other, and the first and second off periods OFF1 and OFF2 may have the same or substantially the same width as each other. However, the present invention is not limited thereto.
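A minimal timing sketch of this field-sequential backlight driving is given below; the two on periods per field and the specific millisecond values are illustrative assumptions, not values taken from the embodiment.

```python
# Timing sketch of the field-sequential backlight driving described above, assuming two
# on periods of equal width per field; the placement and widths of the rising, display,
# and falling periods are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class FieldTiming:
    rising_ms: float
    display_ms: float
    falling_ms: float
    on_period_ms: float  # width of each on period inside the display period

def backlight_schedule(field: FieldTiming, source: str, t0_ms: float):
    """Return (source, start, end) tuples for the two on periods of one field."""
    display_start = t0_ms + field.rising_ms
    # Two on periods separated by one off period, all inside the display period.
    off_ms = field.display_ms - 2 * field.on_period_ms
    first_on = (source, display_start, display_start + field.on_period_ms)
    second_on = (source, first_on[2] + off_ms, display_start + field.display_ms)
    return [first_on, second_on]

# 6.25 ms per field (160 Hz), split into assumed rising/display/falling periods.
field = FieldTiming(rising_ms=1.0, display_ms=4.0, falling_ms=1.25, on_period_ms=1.5)
frame = (backlight_schedule(field, "LS1 (first field F1)", t0_ms=0.0) +
         backlight_schedule(field, "LS2 (second field F2)", t0_ms=6.25))
for source, start, end in frame:
    print(f"{source}: on from {start:.2f} ms to {end:.2f} ms")
```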
The first frame period FR1 includes the first field F1, the second field F2, and the first field F1, which are sequentially arranged. The second frame period FR2 includes the second field F2, the first field F1, and the second field F2, which are sequentially arranged.
The transmissive pixel TPX receives the first primary color image data ID1 during each of the first fields F1 of the first frame period FR1. Accordingly, the transmissive pixel TPX has the transmittance corresponding to the first primary color image data ID1 during the first fields F1. In addition, the first light source LS1 emits the first light L1 during the first fields F1 of the first frame period FR1. The first light L1 has a first brightness LM1 during the first frame period FR1.
The first brightness LM1 of the first light L1 passing through the transmissive pixel TPX during the first field F1 of the first frame period FR1 is controlled by the transmittance of the transmissive pixel TPX. The transmissive pixel TPX displays the first image corresponding to the first frame period FR1 during the two first fields F1 using the first light L1.
The transmissive pixel TPX receives the first primary color image data ID1 during the first field F1 of the second frame period FR2. Therefore, the transmissive pixel TPX has the transmittance corresponding to the first primary color image data ID1 during the first field F1 of the second frame period FR2. In addition, the first light source LS1 emits the first light L1 during the first field F1 of the second frame period FR2. The first light L1 has a second brightness LM2 during the second frame period FR2.
The second brightness LM2 of the first light L1 passing through the transmissive pixel TPX during the first field F1 of the second frame period FR2 is controlled by the transmittance of the transmissive pixel TPX. The transmissive pixel TPX displays the first image corresponding to the second frame period FR2 during the one first field F1 of the second frame period FR2 using the first light L1.
The first brightness LM1 allows the first image corresponding to the first frame period FR1 to be displayed during the two first fields F1 of the first frame period FR1, and the second brightness LM2 allows the first image corresponding to the second frame period FR2 to be displayed during the one first field F1 of the second frame period FR2. As an example, the second brightness LM2 may be two times greater than the first brightness LM1, but the present invention is not limited thereto.
The transmissive pixel TPX receives the second primary color image data ID2 during the second field F2 of the first frame period FR1. Thus, the transmissive pixel TPX has the transmittance corresponding to the second primary color image data ID2 during the second field F2 of the first frame period FR1. In addition, the second light source LS2 emits the second light L2 during the second field F2. The second light L2 has a third brightness LM3 during the first frame period FR1.
The third brightness LM3 of the second light L2 passing through the transmissive pixel TPX during the second field F2 of the first frame period FR1 is controlled by the transmittance of the transmissive pixel TPX. The transmissive pixel TPX displays the second image corresponding to the first frame period FR1 during the one second field F2 of the first frame period FR1 using the second light L2.
The transmissive pixel TPX receives the second primary color image data ID2 during each of the second fields F2 of the second frame period FR2. Accordingly, the transmissive pixel TPX has the transmittance corresponding to the second primary color image data ID2 during the second fields F2 of the second frame period FR2. In addition, the second light source LS2 emits the second light L2 during the second fields F2 of the second frame period FR2. The second light L2 has a fourth brightness LM4 during the second frame period FR2.
As a result, the fourth brightness LM4 of the second light L2 passing through the transmissive pixel TPX during the second fields F2 of the second frame period FR2 is controlled by the transmittance of the transmissive pixel TPX. The transmissive pixel TPX displays the second image corresponding to the second frame period FR2 during the two second fields F2 using the second light L2.
The fourth brightness LM4 allows the second image corresponding to the second frame period FR2 to be displayed during the two second fields F2 of the second frame period FR2, and the third brightness LM3 allows the second image corresponding to the first frame period FR1 to be displayed during the one second field F2 of the first frame period FR1. As an example, the fourth brightness LM4 may be half of the third brightness LM3, but the present invention is not limited thereto.
In addition, the first brightness LM1 may be smaller than the third brightness LM3, and the fourth brightness LM4 may be smaller than the second brightness LM2.
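The brightness relationships described above (LM2 being about twice LM1, and LM4 being about half of LM3) can be understood as keeping the integrated light of each animal primary constant per frame; a small sketch, with an arbitrary per-frame target value, is shown below.

```python
# Sketch of the brightness compensation described above: the brightness of each light
# source is scaled inversely to the number of fields it occupies in a frame, so that the
# integrated light per frame stays constant. LM2 = 2*LM1 and LM4 = LM3/2 follow from the
# 2:1 and 1:2 field counts; the absolute target value is an arbitrary assumption.
TARGET_PER_FRAME = 100.0  # assumed integrated-brightness target per primary per frame

def per_field_brightness(fields_in_frame: int) -> float:
    return TARGET_PER_FRAME / fields_in_frame

# First frame period FR1: fields F1, F2, F1  -> first light in 2 fields, second in 1.
LM1 = per_field_brightness(2)   # 50.0
LM3 = per_field_brightness(1)   # 100.0
# Second frame period FR2: fields F2, F1, F2 -> first light in 1 field, second in 2.
LM2 = per_field_brightness(1)   # 100.0  (= 2 * LM1)
LM4 = per_field_brightness(2)   # 50.0   (= LM3 / 2)

assert LM2 == 2 * LM1 and LM4 == LM3 / 2 and LM1 < LM3 and LM4 < LM2
print(LM1, LM2, LM3, LM4)
```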
The second animal spectral sensitivity curve SC_A2 may be divided into the short wavelength component and the long wavelength component. The short wavelength component may have a spectrum SC_A3, and the long wavelength component may have a spectrum SC_A4.
The controller 100b receives input image data RGB, and outputs output image data Idata. The output image data Idata includes the first primary color image data ID1 and the second primary color image data ID2.
As described above, the second primary color image data ID2 includes information corresponding to the second animal primary color. For example, the second primary color image data ID2 includes the third primary color image data ID3 having information corresponding to the short wavelength component and the fourth primary color image data ID4 having information corresponding to the long wavelength component.
The third primary color image data ID3 may include information corresponding to the non-visible component. As described above, the non-visible component may include the ultraviolet ray and/or the near-ultraviolet ray. In addition, the third primary color image data ID3 may include only information corresponding to the non-visible component, and the fourth primary color image data ID4 may include only information corresponding to the blue color.
The controller 100b generates the output image data Idata based on the input image data RGB. For example, the controller 100b converts the input image data RGB to the output image data Idata based on the visual characteristics of the human and/or the visual characteristics of the animal (e.g., the dog). In more detail, the controller 100b may convert the input image data RGB to the output image data Idata based on the first and second animal spectral sensitivity curves SC_A1 and SC_A2.
The controller 100b may generate the third and fourth primary color image data ID3 and ID4 by using a correlation between the spectrum SC_A3 of the short wavelength component, the spectrum SC_A4 of the long wavelength component, the first and second animal spectral sensitivity curves SC_A1 and SC_A2, and the first to third human spectral sensitivity curves SC_H1 to SC_H3, or by analyzing the spectrum of the input image data RGB and using the analyzed result.
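As one hypothetical illustration of dividing the second animal primary into the short and long wavelength components, the split weights below are derived from assumed Gaussian spectra standing in for SC_A3 and SC_A4; an actual controller 100b may instead use the correlation or spectrum-analysis approaches described above.

```python
import numpy as np

# Hypothetical sketch: split one second-primary data value ID2 into a short wavelength
# portion ID3 (which may include a near-ultraviolet contribution) and a long wavelength
# portion ID4, using weights proportional to the areas of assumed component spectra.
wavelengths = np.arange(360.0, 561.0, 5.0)  # nm, assumed range covering near-UV through blue

def gaussian(center_nm, fwhm_nm):
    sigma = fwhm_nm / 2.355
    return np.exp(-0.5 * ((wavelengths - center_nm) / sigma) ** 2)

sc_a3 = gaussian(400, 60)   # assumed short wavelength component (may include near-UV)
sc_a4 = gaussian(470, 60)   # assumed long wavelength component (blue)

area3, area4 = np.sum(sc_a3), np.sum(sc_a4)
w3, w4 = area3 / (area3 + area4), area4 / (area3 + area4)

def split_id2(id2_value):
    """Divide one second-primary data value ID2 into (ID3, ID4)."""
    return w3 * id2_value, w4 * id2_value

print(split_id2(0.8))
```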
The third sub-pixel SPX3, the fourth sub-pixel SPX4, and the first sub-pixel SPX1 may form one pixel PX′. The first sub-pixel SPX1 has been described above.
The third and fourth sub-pixels SPX3 and SPX4 display third and fourth images, respectively.
The third image may represent the short wavelength component. For example, the third image has a third image spectrum IS3 corresponding to the spectrum SC_A3 of the short wavelength. In addition, the fourth image may represent the long wavelength component. For example, the fourth image has a fourth image spectrum IS4 corresponding to the spectrum SC_A4 of the long wavelength.
The third sub-pixel SPX3 receives the third primary color image data ID3, and displays the third image in response to the third primary color image data ID3. The fourth sub-pixel SPX4 receives the fourth primary color image data ID4, and displays the fourth image in response to the fourth primary color image data ID4. As an example, the third and fourth primary color image data ID3 and ID4 may be provided in the form of data voltages.
As described above, the third and fourth primary color image data ID3 and ID4 including the information corresponding to the second animal primary color, and the first primary color image data ID1 including the information corresponding to the first animal primary color, are provided to the pixel PX′ instead of the input image data RGB generated in consideration of the visual characteristics of the human. In addition, when the first, third, and fourth sub-pixels SPX1, SPX3, and SPX4 are driven to display the first, third, and fourth primary color image data ID1, ID3, and ID4, respectively, the animal may perceive the same image as the real object's image through the first, third, and fourth images.
The display apparatus 3000 includes a display panel 400c. The display panel 400c includes a lower substrate LS, an upper substrate US, a liquid crystal layer LC between the lower and upper substrates LS and US, and a color filter.
The upper and lower substrates US and LS include electrodes to form an electric field in the liquid crystal layer LC. The electrodes correspond to the first, third, and fourth sub-pixels SPX1, SPX3, and SPX4. Liquid crystal molecules included in the liquid crystal layer LC are controlled by the electric field, and thus, transmittances of the first, third, and fourth sub-pixels SPX1, SPX3, and SPX4 are controlled.
The color filter includes a first color filter CF1, a third color filter CF3, and a fourth color filter CF4. As an example, the first, the third, and fourth color filters CF1, CF3, and CF4 are included in the upper substrate US, but the present invention is not limited thereto. The first, third, and fourth sub-pixels SPX1, SPX3, and SPX4 may have colors determined by the transmittance of the first, third, and fourth color filters CF1, CF3, and CF4, respectively.
In more detail, a third transmittance T3 of the third color filter CF3 may correspond to the spectrum SC_A3 of the short wavelength component.
A fourth transmittance T4 of the fourth color filter CF4 may correspond to the spectrum SC_A4 of the long wavelength component.
As described above, the display panel 400c may display the first, third, and fourth images to be spatially divided by using the first, third, and fourth color filters CF1, CF3, and CF4, respectively.
The display apparatus 4000 includes a display panel 400d and a backlight unit 500c.
The display panel 400d includes the transmissive pixel TPX. The transmissive pixel TPX is the same or substantially the same as that described above.
The backlight unit 500c includes a third light source LS3 for emitting a third light L3, and a fourth light source LS4 for emitting a fourth light L4. In addition, the backlight unit 500c includes the first light source LS1.
The third light L3 has a color corresponding to the short wavelength component. For example, the third light L3 has a third light spectrum IL3 corresponding to the spectrum SC_A3 of the short wavelength component. A center wavelength of the third light spectrum IL3 may be the same or substantially the same as the center wavelength of the spectrum SC_A3 of the short wavelength component. As an example, the third light L3 includes the non-visible component (e.g., the ultraviolet ray and/or the near-ultraviolet ray).
In addition, the fourth light L4 has a color corresponding to the long wavelength component. For example, the fourth light L4 has a fourth light spectrum IL4 corresponding to the spectrum SC_A4 of the long wavelength component. A center wavelength of the fourth light spectrum IL4 may be the same or substantially the same as the center wavelength of the spectrum SC_A4 of the long wavelength component.
Hereinafter, the time division driving operation of the display apparatus will be described in more detail with reference to the accompanying drawings.
During the first field F1, the transmissive pixel TPX receives the first primary color image data ID1. Accordingly, the transmissive pixel TPX has the transmittance corresponding to the first primary color image data ID1.
In addition, the first light source LS1 emits the first light L1 during the first field F1. The first light L1 is provided during a first on period OP1 defined in the first field F1.
As a result, the first light L1 passing through the transmissive pixel TPX has a brightness that is adjusted by the transmittance of the transmissive pixel TPX, and the transmissive pixel TPX displays the first image using the first light L1.
The transmissive pixel TPX receives the third primary color image data ID3 during the second field F2. Therefore, the transmissive pixel TPX has the transmittance corresponding to the third primary color image data ID3.
In addition, each of the first and fourth light sources LS1 and LS4 is turned off during the second field F2, and the third light source LS3 emits the third light L3 during the second field F2. In more detail, the third light L3 is provided during a third on period OP3 defined in the second field F2. As an example, the third on period OP3 may have the same or substantially the same width as that of the first on period OP1.
The transmissive pixel TPX receives the fourth primary color image data ID4 during the third field F3. Thus, the transmissive pixel TPX has the transmittance corresponding to the fourth primary color image data ID4.
In addition, each of the first and third light sources LS1 and LS3 is turned off during the third field F3, and the fourth light source LS4 emits the fourth light L4 during the third field F3. In more detail, the fourth light L4 is provided during a fourth on period OP4 defined in the third field F3. As an example, the fourth on period OP4 may have the same or substantially the same width as that of the third on period OP3.
As described above, the display panel 400d displays the first, third, and fourth images corresponding to the first, third, and fourth primary color image data ID1, ID3, and ID4, respectively, after dividing the first, third, and fourth images in time using the first to third fields F1 to F3.
In addition, since the transmissive pixel TPX does not include the color filter, the transmissive pixel TPX transmits the first, third, and fourth lights L1, L3, and L4 without the loss of light that may be caused by a color filter. Accordingly, a light efficiency of the display apparatus 4000 may be improved.
To prevent or substantially prevent the flicker phenomenon from being perceived, the frame period FR may have a frequency higher than that of the critical frequency of the animal (e.g., the dog). For example, the frame period FR may have a frequency of about 80 Hz, and each of the first to third fields F1 to F3 may have a frequency of about 240 Hz.
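Extending the earlier worked example to three fields per frame (again assuming equal field durations) gives the field rate stated above.

```python
# Three-field timing for the driving scheme described above (assumed equal field durations).
frame_rate_hz = 80.0
fields_per_frame = 3
frame_period_ms = 1000.0 / frame_rate_hz               # 12.5 ms
field_period_ms = frame_period_ms / fields_per_frame   # ~4.17 ms
field_rate_hz = frame_rate_hz * fields_per_frame       # 240 Hz
print(frame_period_ms, round(field_period_ms, 2), field_rate_hz)
```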
According to another driving method, the frame period FR of the display apparatus 4000 may include the first field F1 and the second field F2.
During the first field F1, the transmissive pixel TPX receives the first primary color image data ID1. Accordingly, the transmissive pixel TPX has the transmittance corresponding to the first primary color image data ID1. In addition, the first light source LS1 emits the first light L1 during the first field F1.
The transmissive pixel TPX receives the fourth primary color image data ID4 during the second field F2. Therefore, the transmissive pixel TPX has the transmittance corresponding to the fourth primary color image data ID4. In addition, the third and fourth light sources LS3 and LS4 emit the third and fourth lights L3 and L4, respectively, during the second field F2. For example, the third and fourth on periods OP3 and OP4 may be provided concurrently (e.g., simultaneously or at the same point in time). That is, the third and fourth lights L3 and L4 may be concurrently (e.g., simultaneously) emitted.
However, the present invention is not limited thereto. For example, the third on period OP3 and the fourth on period OP4 may only partially overlap each other.
The short wavelength component and the long wavelength component have a high correlation. Therefore, the transmissive pixel TPX and the third and fourth light sources LS3 and LS4 may be driven such that the short and long wavelength components of the second animal primary color are displayed together during the second field F2.
In addition, the transmissive pixel TPX receives the fourth primary color image data ID4 during the second field F2, as described above.
Example embodiments have been described with reference to the accompanying drawings. In the drawings, the relative sizes of elements, layers, and regions may be exaggerated for clarity. Spatially relative terms, such as “beneath,” “below,” “lower,” “under,” “above,” “upper,” and the like, may be used herein for ease of explanation to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or in operation, in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” or “under” other elements or features would then be oriented “above” the other elements or features. Thus, the example terms “below” and “under” can encompass both an orientation of above and below. The device may be otherwise oriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein should be interpreted accordingly.
It will be understood that, although the terms “first,” “second,” “third,” etc., may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section described below could be termed a second element, component, region, layer or section, without departing from the spirit and scope of the present invention.
It will be understood that when an element or layer is referred to as being “on,” “connected to,” or “coupled to” another element or layer, it can be directly on, connected to, or coupled to the other element or layer, or one or more intervening elements or layers may be present. In addition, it will also be understood that when an element or layer is referred to as being “between” two elements or layers, it can be the only element or layer between the two elements or layers, or one or more intervening elements or layers may also be present.
The terminology used herein is for the purpose of describing particular embodiments and is not intended to be limiting of the present invention. As used herein, the singular forms “a” and “an” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and “including,” when used in this specification, specify the presence of the stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
As used herein, the term “substantially,” “about,” and similar terms are used as terms of approximation and not as terms of degree, and are intended to account for the inherent variations in measured or calculated values that would be recognized by those of ordinary skill in the art. Further, the use of “may” when describing embodiments of the present invention refers to “one or more embodiments of the present invention.” As used herein, the terms “use,” “using,” and “used” may be considered synonymous with the terms “utilize,” “utilizing,” and “utilized,” respectively. Also, the term “exemplary” is intended to refer to an example or illustration.
The electronic or electric devices and/or any other relevant devices or components according to embodiments of the present invention described herein may be implemented utilizing any suitable hardware, firmware (e.g. an application-specific integrated circuit), software, or a combination of software, firmware, and hardware. For example, the various components of these devices may be formed on one integrated circuit (IC) chip or on separate IC chips. Further, the various components of these devices may be implemented on a flexible printed circuit film, a tape carrier package (TCP), a printed circuit board (PCB), or formed on one substrate. Further, the various components of these devices may be a process or thread, running on one or more processors, in one or more computing devices, executing computer program instructions and interacting with other system components for performing the various functionalities described herein. The computer program instructions are stored in a memory which may be implemented in a computing device using a standard memory device, such as, for example, a random access memory (RAM). The computer program instructions may also be stored in other non-transitory computer readable media such as, for example, a CD-ROM, flash drive, or the like. Also, a person of skill in the art should recognize that the functionality of various computing devices may be combined or integrated into a single computing device, or the functionality of a particular computing device may be distributed across one or more other computing devices without departing from the spirit and scope of the exemplary embodiments of the present invention.
Although exemplary embodiments of the present invention have been described, it should be understood that the present invention is not limited to these exemplary embodiments, and that various changes and modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the present invention as set forth in the following claims, and their equivalents.