This application claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2015-0127716, filed on Sep. 9, 2015, in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference herein in its entirety.
Exemplary embodiments of the inventive concept relate to a display device and a method of driving the same.
Melatonin is a hormone that is secreted from a human body and serves as a biological clock. For example, when night comes, melatonin is secreted to inform parts of the body that night is approaching. When melatonin is secreted, sleep is induced.
When morning comes and light enters the eyes, melatonin secretion is suppressed and a human begins to wake. For example, when a wavelength ranging from about 464 nm to about 470 nm is recognized by a human body, melatonin secretion is suppressed. In other words, a human body discriminates between night and day depending on the recognition of a wavelength in the range from about 464 nm to about 470 nm. Generally, since people recognize light having a central wavelength ranging from about 440 nm to about 495 nm as blue light, a wavelength ranging from about 464 nm to about 470 nm is considered blue light.
Therefore, when an audience views an image via an electronic device such as a television (TV), a smartphone, or a personal computer (PC), in the case where light of a wavelength ranging from about 464 nm to about 470 nm is emitted from the electronic device, secretion of the audience's melatonin may be suppressed. As a result, the normal sleeping pattern of a person in the audience may be disturbed.
According to an exemplary embodiment of the inventive concept, a display device includes: a display unit; a plurality of pixels disposed in the display unit, each pixel including first and second blue sub-pixels; and a driving mode controller configured to set a driving mode to one of a first driving mode in which both of the first and second blue sub-pixels emit light, and a second driving mode in which one of the first and second blue sub-pixels emits light, wherein the first blue sub-pixel emits light of a first frequency, and the second blue sub-pixel emits light of a second frequency different from the first frequency.
A central wavelength of the light emitted from the first blue sub-pixel is longer than a central wavelength of the light emitted from the second blue sub-pixel.
The central wavelength of the light emitted from the first blue sub-pixel is in a range from about 464 nm to about 470 nm, and the central wavelength of the light emitted from the second blue sub-pixel is in a range from about 440 nm to about 464 nm.
The second driving mode includes a (2-1)-th driving mode in which the first blue sub-pixel emits light, and a (2-2)-th driving mode in which the second blue sub-pixel emits light.
The driving mode controller is configured to determine a current time as day or night, set the driving mode to one of the first driving mode and the (2-1)-th driving mode when the current time is day, and set the driving mode to the (2-2)-th driving mode when the current time is night.
The driving mode controller is configured to determine a current location as an outdoor space or an indoor space, set the driving mode to the first driving mode when the current location is the outdoor space, and set the driving mode to one of the (2-1)-th driving mode and the (2-2)-th driving mode when the current location is the indoor space.
The device further includes: an output signal generator configured to receive an input image signal, and generate an output signal based on the input image signal, wherein the output signal generator generates a first output signal to allow the first and second blue sub-pixels to emit light when the driving mode is the first driving mode, generates a second output signal to allow the first blue sub-pixel to emit light when the driving mode is the (2-1)-th driving mode, and generates a third output signal to allow the second blue sub-pixel to emit light when the driving mode is the (2-2)-th driving mode.
When the input image signal comprises first and second red image signals, first and second green image signals, and first and second blue image signals, the output signal generator: generates an output red image signal corresponding to the first and second red image signals, generates an output green image signal corresponding to the first and second green image signals, generates first and second output blue image signals corresponding to the first and second blue image signals when the driving mode is the first driving mode, generates the first output blue image signal corresponding to the first and second blue image signals when the driving mode is the (2-1)-th driving mode, and generates the second output blue image signal corresponding to the first and second blue image signals when the driving mode is the (2-2)-th driving mode.
The device further includes: a source driver configured to receive the output red image signal, the output green image signal, and the first and second output blue image signals, and apply data signals to the plurality of pixels, wherein each of the plurality of pixels comprises a red sub-pixel and a green sub-pixel, and the source driver applies a data signal generated based on the output red image signal to the red sub-pixel, applies a data signal generated based on the output green image signal to the green sub-pixel, applies a data signal generated based on the first output blue image signal to the first blue sub-pixel, and applies a data signal generated based on the second output blue image signal to the second blue sub-pixel.
The output signal generator is configured to perform gamma correction by using a first gamma value when the driving mode is the first driving mode, by using a second gamma value when the driving mode is the (2-1)-th driving mode, and by using a third gamma value when the driving mode is the (2-2)-th driving mode.
Each of the plurality of pixels further comprises at least two sub-pixels that emit colors different from blue.
Each of the plurality of pixels comprises a red sub-pixel and a green sub-pixel.
Each of the plurality of pixels comprises a first sub-pixel group comprising the red sub-pixel and the green sub-pixel arranged in a first direction, and a second sub-pixel group comprising the first and second blue sub-pixels arranged in the first direction, and the display unit comprises a first sub-pixel row in which the first sub-pixel group is arranged in the first direction, and a second sub-pixel row in which the second sub-pixel group is arranged in the first direction.
The display unit comprises the first and second sub-pixel rows arranged in a second direction substantially perpendicular to the first direction, the first sub-pixel row includes: a (1-1)-th sub-pixel row in which the red sub-pixel and the green sub-pixel are repeatedly arranged in sequence, and a (1-2)-th sub-pixel row in which the green sub-pixel and the red sub-pixel are repeatedly arranged in sequence, and the second sub-pixel row includes: a (2-1)-th sub-pixel row in which the first blue sub-pixel and the second blue sub-pixel are repeatedly arranged in sequence, and a (2-2)-th sub-pixel row in which the second blue sub-pixel and the first blue sub-pixel are repeatedly arranged in sequence.
According to an exemplary embodiment of the inventive concept, a method of driving a display device including a display unit, a plurality of pixels disposed in the display unit, each pixel including first and second blue sub-pixels, and a driving mode controller configured to set a driving mode to one of a first driving mode in which both of the first and second blue sub-pixels emit light, a (2-1)-th driving mode in which the first blue sub-pixel emits light, and a (2-2)-th driving mode in which the second blue sub-pixel emits light, wherein the first blue sub-pixel emits light of a first frequency, and the second blue sub-pixel emits light of a second frequency different from the first frequency, the method includes: determining a current time as day or night; determining a current position as an outdoor space or an indoor space; and setting the driving mode to the first driving mode when the current time is day and the current position is the outdoor space, setting the driving mode to the (2-1)-th driving mode when the current time is day and the current position is the indoor space, and setting the driving mode to the (2-2)-th driving mode when the current time is night.
The method further includes: receiving an input image signal; and generating an output signal based on the input image signal, wherein the generating of the output signal comprises: generating a first output signal to allow the first and second blue sub-pixels to emit light when the driving mode is the first driving mode, generating a second output signal to allow the first blue sub-pixel to emit light when the driving mode is the (2-1)-th driving mode, and generating a third output signal to allow the second blue sub-pixel to emit light when the driving mode is the (2-2)-th driving mode.
When the input image signal comprises first and second red image signals, first and second green image signals, and first and second blue image signals, the generating of the output signal includes: generating an output red image signal corresponding to the first and second red image signals; generating an output green image signal corresponding to the first and second green image signals; generating first and second output blue image signals corresponding to the first and second blue image signals when the driving mode is the first driving mode, generating the first output blue image signal corresponding to the first and second blue image signals when the driving mode is the (2-1)-th driving mode, and generating the second output blue image signal corresponding to the first and second blue image signals when the driving mode is the (2-2)-th driving mode.
The method further includes: after the generating of the output signal, applying a data signal generated based on the output red image signal to a red sub-pixel, applying a data signal generated based on the output green image signal to a green sub-pixel, applying a data signal generated based on the first output blue image signal to the first blue sub-pixel, and applying a data signal generated based on the second output blue image signal to the second blue sub-pixel.
The generating of the output signal includes: performing gamma correction by using a first gamma value when the driving mode is the first driving mode, by using a second gamma value when the driving mode is the (2-1)-th driving mode, and by using a third gamma value when the driving mode is the (2-2)-th driving mode.
According to an exemplary embodiment of the inventive concept, a display device includes: a display unit including a plurality of pixels, at least one of the pixels including a first blue sub-pixel and a second blue sub-pixel; and a controller configured to drive at least one of the first and second blue sub-pixels based on a driving mode, the driving mode being based on a current time of day and a current location of the display device.
The features of the aforementioned exemplary embodiments may be embodied by using a system, a method, a computer program, or a combination of a system, a method, and a computer program.
The above and other features of the inventive concept will become more apparent by describing in detail exemplary embodiments thereof, with reference to the accompanying drawings in which:
Exemplary embodiments of the inventive concept will now be described below in detail together with the drawings. However, the inventive concept is not limited to the below exemplary embodiments and may be implemented in various forms.
As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Like reference numerals in the drawings may denote like or corresponding elements, and thus, a repeated description thereof will be omitted.
Referring to
The display device 100 may be a liquid crystal display apparatus, an organic light-emitting display device, a flexible display, a three-dimensional (3-D) display, an electrophoretic display, etc. The inventive concept is not limited thereto and various electronic devices that may provide visual information by emitting light may be the display device 100. Hereinafter, the case where the display device 100 is an organic light-emitting display device is described as an example.
The display device 100 may display an image via a pixel P. The display device 100 may be, for example, an electronic device such as a smartphone, a tablet personal computer (PC), a notebook PC, a monitor, or a television (TV), or may be a part of an electronic device that displays an image.
A pixel P may include a plurality of sub-pixels that respectively display a plurality of colors to display various colors. Throughout the specification, a pixel P mainly denotes one sub-pixel. However, exemplary embodiments are not limited thereto and a pixel P may denote one unit pixel including a plurality of sub-pixels. In other words, even when it is described that one pixel P exists in the present specification, it may be construed that one sub-pixel exists and that a plurality of sub-pixels forming one unit pixel exist.
A pixel P may include a light-emitting device and a pixel circuit. The pixel circuit may receive a driving voltage and a data signal, and output a driving current to the light-emitting device. In this case, the driving voltage may include a first driving voltage and a second driving voltage. The first driving voltage may be a driving voltage having a relatively high level, and the second driving voltage may be a driving voltage having a relatively low level. The level of the driving voltage supplied to each pixel P may correspond to the difference between the levels of the first driving voltage and the second driving voltage.
The display device 100 may receive a plurality of image frames from the outside. The plurality of image frames may be image frames that allow one moving image to be displayed when a plurality of image frames are sequentially displayed. Each of the plurality of image frames may include an input image signal IIS. The input image signal IIS may include information regarding luminance of light emitted via a pixel P, and a number of bits of an input image signal IIS may be determined depending on a predetermined step of brightness. For example, in the case where a number of steps of brightness of light emitted via a pixel P is 256, an input image signal IIS may be an 8-bit digital signal. In the case where a darkest gray scale that may be displayed via the display unit 120 is a first step, and a brightest gray scale that may be displayed via the display unit 120 is a 256-th step, an input image signal IIS corresponding to the first step may be 0 and an input image signal IIS corresponding to the 256-th step may be 255. The darkest gray scale that may be displayed via the display unit 120 may be referred to as a minimum gray scale, and the brightest displayable gray scale may be referred to as a maximum gray scale. A number of steps of brightness of light emitted via a pixel P may be determined as various numbers such as 64, 256, and 1024.
The controller 110 may be connected to the display unit 120, the gate driver 130, and the source driver 140. The controller 110 may receive an input image signal IIS and output first control signals CON1 to the gate driver 130. The first control signals CON1 may include a horizontal synchronization signal HSYNC. The first control signals CON1 may include control signals which the gate driver 130 uses for outputting scan signals SCAN1 to SCANm synchronized with a horizontal synchronization signal HSYNC. The controller 110 may output second control signals CON2 to the source driver 140.
The controller 110 may output an output image signal OIS to the source driver 140. The second control signals CON2 may include control signals which the source driver 140 uses for outputting data signals DATA1 to DATAn corresponding to the output image signal OIS. The output image signal OIS may include image information used for generating the data signals DATA1 to DATAn. The output image signal OIS may be image data generated by correcting an input image signal IIS received from the outside.
The display unit 120 may include a plurality of pixels, a plurality of scan lines each being connected to pixels located in one row from among the plurality of pixels, and a plurality of data lines each being connected to pixels located in one column from among the plurality of pixels. For example, as illustrated in
The gate driver 130 may output scan signals SCAN1 to SCANm to the scan lines. The gate driver 130 may output scan signals SCAN1 to SCANm in synchronization with a vertical synchronization signal. The pixel P may receive scan signal SCANa as shown in
The source driver 140 may output data signals DATA1 to DATAn to the data lines in synchronization with scan signals SCAN1 to SCANm. The source driver 140 may output data signals DATA1 to DATAn proportional to input image data to the data lines. The pixel P may receive data signal DATAb as shown in
Generally, an electronic device that displays an image does so by using a plurality of sub-pixels that respectively emit different light. Such an electronic device may include, for example, sub-pixels respectively emitting light of red, green, and blue colors. Among these, blue light may include a large amount of light having a wavelength ranging from about 464 nm to about 470 nm. Light having a wavelength ranging from about 464 nm to about 470 nm suppresses melatonin secretion, which can negatively impact a viewer's normal sleeping pattern.
The display device 100 according to an exemplary embodiment of the inventive concept may not adversely impact a viewer's normal sleeping pattern. For example, the display device 100 according to an exemplary embodiment of the inventive concept may include two kinds of blue sub-pixels respectively having different central wavelengths. For example, a central wavelength of first blue light may be longer than a central wavelength of second blue light. For example, the display device 100 according to an exemplary embodiment of the inventive concept may include a first blue sub-pixel that emits the first blue light having a central wavelength ranging from about 464 nm to about 470 nm, and a second blue sub-pixel that emits the second blue light having a central wavelength ranging from about 440 nm to about 464 nm. For example, the first blue sub-pixel may emit light blue light, and the second blue sub-pixel may emit dark blue light.
Since light emitted from the first blue sub-pixel has a central wavelength ranging from about 464 nm to about 470 nm, the light may suppress melatonin secretion of a viewer. In addition, since light emitted from the second blue sub-pixel has a central wavelength ranging from about 440 nm to about 464 nm, which is separated from the band ranging from about 464 nm to about 470 nm, the light emitted from the second blue sub-pixel may not suppress melatonin secretion. Therefore, the light emitted from the second blue sub-pixel induces sleep in a viewer, and the light emitted from the first blue sub-pixel keeps a viewer awake. Accordingly, the display device 100 according to an exemplary embodiment of the inventive concept may wake up a viewer or induce sleep in a viewer by driving the two blue sub-pixels in a particular driving mode.
Referring to
The controller 110 according to the present exemplary embodiment may correspond to one or more processors or include one or more processors. Accordingly, the controller 110 may be incorporated into another hardware device, such as a microprocessor or a general-purpose computer system.
Referring to
The driving mode controller 111 according to an exemplary embodiment of the inventive concept may set a driving mode of the display unit 120. A method of setting the driving mode by the driving mode controller 111 is described with reference to
The output signal generator 112 may generate an output image signal OIS to be applied to the display unit 120. The output signal generator 112 may receive an input image signal IIS from the outside, and generate an output image signal OIS based on the input image signal IIS. The output signal generator 112 may output an output image signal OIS to the source driver 140 to allow a data voltage corresponding to the output image signal OIS to be applied to the display unit 120 via the source driver 140. A method of generating an output image signal OIS by the output signal generator 112 is described with reference to
The flowchart illustrated in
Referring to
After that, the display device 100 may perform an operation (operation S200) of driving at least a portion of the first blue sub-pixel and the second blue sub-pixel depending on the set driving mode. In other words, the display device 100 may induce a viewer's sleep or wakefulness based on a current time and a current location, and output an image at a brightness suitable for the viewing environment. This driving may be performed by the controller 110, the source driver 140, and the display unit 120 of the display device 100.
An example of determining the driving mode of the display device 100 and driving the display device 100 according to the driving mode is described with reference to
Referring to
After that, the display device 100 may perform an operation (operation S120) of determining whether a current time is day or night. If the current time is day, the display device 100 may perform an operation (operation S130) of determining whether a current location is an outdoor space.
When the current time is day and the current location is an outdoor space, the display device 100 may perform an operation (operation S140) of setting the driving mode to the first driving mode. When the current time is day and the current location is an indoor space, the display device 100 may perform an operation (operation S150) of setting the driving mode to the (2-1)-th driving mode. When the current time is night, the display device 100 may perform an operation (operation S160) of setting the driving mode to the (2-2)-th driving mode.
Operations S110 to S160 may be included in operation S100, and performed by the driving mode controller 111 of the display device 100.
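The decision logic of operations S110 to S160 can be sketched as follows. This is a minimal illustration only; the function name, the boolean inputs, and the string mode labels are hypothetical and not part of the embodiment.

```python
def set_driving_mode(is_day: bool, is_outdoor: bool) -> str:
    """Select a driving mode from the current time and location (operations S120-S160)."""
    if not is_day:
        return "2-2"   # night: only the second (shorter-wavelength) blue sub-pixel
    if is_outdoor:
        return "1"     # day + outdoors: both blue sub-pixels for maximum brightness
    return "2-1"       # day + indoors: only the first (about 464-470 nm) blue sub-pixel
```

For example, a daytime outdoor reading would select the first driving mode, while any nighttime reading would select the (2-2)-th driving mode regardless of location.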
When the driving mode is set to the first driving mode, the display device 100 may perform an operation (operation S210) of generating an output signal that allows both first and second blue sub-pixels to emit light. In addition, when the driving mode is set to the (2-1)-th driving mode, the display device 100 may perform an operation (operation S220) of generating an output signal that allows the first blue sub-pixel to emit light. In addition, when the driving mode is set to the (2-2)-th driving mode, the display device 100 may perform an operation (operation S230) of generating an output signal that allows the second blue sub-pixel to emit light.
In other words, when the current time is day, the display device 100 may provide a wake-up effect by allowing the first blue sub-pixel to emit light and thus suppressing melatonin secretion of a viewer. When the current time is night, in order not to disturb a viewer's sleeping pattern, the display device 100 may display a blue color by using only the second blue sub-pixel. In addition, when the current time is day and the current location is an outdoor space, since ambient illuminance is relatively high, the display device 100 may increase brightness of light output from the display device 100 by allowing both the first and second blue sub-pixels to emit light. When the current time is day and the current location is an indoor space, since ambient illuminance is relatively low, the display device 100 may display the blue color by using only the first blue sub-pixel.
Operations S210 to S230 may be included in operation S200, and performed by the output signal generator 112 of the display device 100.
An example of generating an output image signal OIS to drive the display device 100 according to a determined driving mode is described with reference to
Referring to
After that, the output signal generator 112 may perform an operation (operation S400) of performing de-gamma correction on each of a red image signal, a green image signal, and a blue image signal to convert a received RGB signal into a numerical value linearly representing the intensity of light.
For example, a change of a gray level in an RGB signal may not linearly coincide with a change of the intensity of light actually recognized by a human being. In other words, the intensity of light in the case where a gray level is 100 may not be double the intensity of light in the case where a gray level is 50. Therefore, a process of changing a linear numerical value into a gray level depending on a change of the intensity of light is used. Such a process is referred to as gamma correction. An RGB signal received from the outside may be in a gamma-corrected state.
Here, the present exemplary embodiment may include a process of calculating an average of two different sub-pixel values. For example, the output signal generator 112 may perform an operation of changing each of a red image signal, a green image signal, and a blue image signal included in a received RGB signal into a linear numerical value depending on a change of the intensity of light. This process is referred to as de-gamma correction.
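The de-gamma conversion described above, and the inverse conversion applied again in operation S600, can be sketched as follows. This sketch assumes a simple power-law gamma of 2.2 and 8-bit gray levels; the actual gamma value and encoding are device-specific, and the function names are hypothetical.

```python
GAMMA = 2.2  # assumed display gamma; the actual value is device-specific

def de_gamma(level: int, max_level: int = 255) -> float:
    """Convert a gamma-encoded gray level into a value linear in light intensity (operation S400)."""
    return (level / max_level) ** GAMMA

def re_gamma(linear: float, max_level: int = 255) -> int:
    """Convert a linear light-intensity value back into a gray level (operation S600)."""
    return round(linear ** (1.0 / GAMMA) * max_level)
```

Consistent with the observation above, `de_gamma(100)` is more than double `de_gamma(50)`: a doubled gray level does not correspond to a doubled light intensity.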
After that, the output signal generator 112 may perform an operation (operation S500) of applying sub-pixel rendering to a numerical value that expresses a change of the intensity of light by using a linear change. For example, in the case where each of the pixels has three sub-pixels of a red sub-pixel, a green sub-pixel, and a blue sub-pixel, a received RGB signal may be a signal having gray levels respectively corresponding to the sub-pixels. In this case, the display device 100 according to the present exemplary embodiment may be designed such that one pixel includes four sub-pixels of a red color, a green color, a first blue color, and a second blue color. In addition, in the case where one pixel includes three sub-pixels of a red color, a green color, and a blue color, the display device 100 according to the present exemplary embodiment may be designed to express, by using only one pixel, image information to be expressed by using two pixels. In this case, assuming that information included in two successive input signals includes first and second red image signals, first and second green image signals, and first and second blue image signals, the output signal generator 112 may generate four signals of an output red image signal, an output green image signal, a first output blue image signal, and a second output blue image signal by using six signals. In this case, the output signal generator 112 may calculate an output red image signal by using an average value of the first and second red image signals, and calculate an output green image signal by using an average value of the first and second green image signals. In addition, the output signal generator 112 may calculate the first and second output blue image signals by using different methods depending on whether the driving mode is the first driving mode, the (2-1)-th driving mode, or the (2-2)-th driving mode.
For example, when the driving mode is the first driving mode, the output signal generator 112 may use the value of the first blue image signal directly as the first output blue image signal, and may use the value of the second blue image signal directly as the second output blue image signal. In other words, since the image signals representing a blue color from among the input image signals correspond one-to-one to the blue sub-pixels in use in the display device 100, the output signal generator 112 may directly use an input image signal as an output image signal.
In addition, when the driving mode is the (2-1)-th driving mode, the output signal generator 112 may calculate the first output blue image signal by using an average value of the first and second blue image signals. In other words, since the (2-1)-th driving mode uses only the first blue sub-pixel and does not use the second blue sub-pixel, the output signal generator 112 may display the first blue image signal and the second blue image signal by using the above method based on the first blue sub-pixel. In this case, the output signal generator 112 may output 0 as a pixel value corresponding to the second blue sub-pixel.
In addition, when the driving mode is the (2-2)-th driving mode, the output signal generator 112 may calculate the second output blue image signal by using an average value of the first and second blue image signals. In other words, since the (2-2)-th driving mode uses only the second blue sub-pixel and does not use the first blue sub-pixel, the output signal generator 112 may display the first blue image signal and the second blue image signal by using the above method based on the second blue sub-pixel. In this case, the output signal generator 112 may output 0 as a pixel value corresponding to the first blue sub-pixel.
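The mode-dependent mapping of operation S500 described above can be sketched as follows, assuming the six inputs are already linear (de-gammaed) values. The function name and mode labels are hypothetical.

```python
def render_pixel(ri1, ri2, gi1, gi2, bi1, bi2, mode):
    """Map six linear input values onto four sub-pixel outputs (operation S500).

    mode: "1" (both blues), "2-1" (first blue only), or "2-2" (second blue only).
    """
    ro = (ri1 + ri2) / 2   # output red: average of the two red inputs
    go = (gi1 + gi2) / 2   # output green: average of the two green inputs
    if mode == "1":
        bo1, bo2 = bi1, bi2                 # one-to-one mapping onto the two blue sub-pixels
    elif mode == "2-1":
        bo1, bo2 = (bi1 + bi2) / 2, 0.0     # second blue sub-pixel stays dark
    else:
        bo1, bo2 = 0.0, (bi1 + bi2) / 2     # first blue sub-pixel stays dark
    return ro, go, bo1, bo2
```

In every mode the red and green outputs are the averages of their two inputs; only the distribution of the blue average between the two blue sub-pixels changes.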
After that, the output signal generator 112 may perform an operation (operation S600) of applying a gamma to each of the intensity value of red light, the intensity value of green light, the intensity value of first blue light, and the intensity value of second blue light, so that the sub-pixel rendering-applied intensity values of the red, green, first blue, and second blue light are expressed as gray levels again.
In the case of expressing a first red image signal as RI1, a second red image signal as RI2, a first green image signal as GI1, a second green image signal as GI2, a first blue image signal as BI1, a second blue image signal as BI2, an output red image signal as RO, an output green image signal as GO, a first output blue image signal as BO1, and a second output blue image signal as BO2, operations S400 to S600 may be expressed by Equations 1 to 9 below.
For example, in the first driving mode, the (2-1)-th driving mode, and the (2-2)-th driving mode, an output red image signal and an output green image signal may be expressed by Equations 1 and 2 below.
RO=(RI1+RI2)/2 Equation 1
GO=(GI1+GI2)/2 Equation 2
In addition, an example of a process of calculating first and second output blue image signals in the first driving mode may be expressed by Equations 3 and 4 below.
BO1=BI1 Equation 3
BO2=BI2 Equation 4
In addition, an example of a process of calculating first and second output blue image signals in the (2-1)-th driving mode may be expressed by Equations 5 and 6 below.
BO1=(BI1+BI2)/2 Equation 5
BO2=0 Equation 6
In addition, an example of a process of calculating first and second output blue image signals in the (2-2)-th driving mode may be expressed by Equations 7 and 8 below.
BO1=0 Equation 7
BO2=(BI1+BI2)/2 Equation 8
After that, the output signal generator 112 may perform an operation (operation S700) of performing various image processes on the red, green, first blue, and second blue image signals. Various algorithms such as color enhancement, edge enhancement, noise filtering, and dithering may be applied to these image processes, and the display device 100 may perform an image process by applying various image processing algorithms besides the above algorithms.
After that, the output signal generator 112 may output a generated output red image signal, output green image signal, first output blue image signal, and second output blue image signal (operation S800).
Through this method, the display device 100 according to the present exemplary embodiment may suppress secretion of a viewer's melatonin and thus provide a wake-up effect during the daytime, and may not hinder secretion of a viewer's melatonin and thus induce sleeping at night by driving two blue sub-pixels depending on the driving mode. In addition, the display device 100 may provide an image of a brightness suitable for a location to an audience (e.g., person or persons viewing the display device 100) by determining whether the audience's view location is an indoor space or an outdoor space.
Referring to
In this case, each pixel P may include a first sub-pixel group SPG1 including a red sub-pixel R and a green sub-pixel G, and a second sub-pixel group SPG2 including a first blue sub-pixel B1 and a second blue sub-pixel B2. The same sub-pixel groups may be disposed in the same sub-pixel row. In other words, all sub-pixel groups disposed in the row in which the illustrated first sub-pixel group SPG1 is disposed may be first sub-pixel groups SPG1, and all sub-pixel groups disposed in the row in which the illustrated second sub-pixel group SPG2 is disposed may be second sub-pixel groups SPG2. A row in which the first sub-pixel groups SPG1 are disposed may be referred to as a first sub-pixel row SPR1, and a row in which the second sub-pixel groups SPG2 are disposed may be referred to as a second sub-pixel row SPR2. In the display device 100 according to the present exemplary embodiment, the first sub-pixel row SPR1 and the second sub-pixel row SPR2 may be disposed alternately in a second direction perpendicular to a first direction. In other words, as illustrated in
In the first sub-pixel group SPG1, sub-pixels may be disposed in the order of a red sub-pixel R and a green sub-pixel G from the left, or may be disposed in the order of a green sub-pixel G and a red sub-pixel R from the left. In the second sub-pixel group SPG2, sub-pixels may be disposed in the order of a first blue sub-pixel B1 and a second blue sub-pixel B2 from the left, or may be disposed in the order of a second blue sub-pixel B2 and a first blue sub-pixel B1 from the left. In this case, all sub-pixel groups included in the same sub-pixel row may have the same arrangement. In other words, in the case where one first sub-pixel group SPG1 included in one first sub-pixel row SPR1 includes sub-pixels disposed in the order of a red sub-pixel R and a green sub-pixel G from the left, all first sub-pixel groups SPG1 included in the one first sub-pixel row SPR1 may be first sub-pixel groups SPG1 each including sub-pixels disposed in the order of a red sub-pixel R and a green sub-pixel G from the left.
Here, the first sub-pixel row SPR1 including the first sub-pixel groups SPG1 in which sub-pixels are disposed in the order of a red sub-pixel R and a green sub-pixel G from the left may be referred to as a (1-1)-th sub-pixel row SPR1-1, and the first sub-pixel row SPR1 including the first sub-pixel groups SPG1 in which sub-pixels are disposed in the order of a green sub-pixel G and a red sub-pixel R from the left may be referred to as a (1-2)-th sub-pixel row SPR1-2. The second sub-pixel row SPR2 including the second sub-pixel groups SPG2 in which sub-pixels are disposed in the order of a first blue sub-pixel B1 and a second blue sub-pixel B2 from the left may be referred to as a (2-1)-th sub-pixel row SPR2-1, and the second sub-pixel row SPR2 including the second sub-pixel groups SPG2 in which sub-pixels are disposed in the order of a second blue sub-pixel B2 and a first blue sub-pixel B1 from the left may be referred to as a (2-2)-th sub-pixel row SPR2-2. In this case, the (1-1)-th sub-pixel row SPR1-1 and the (1-2)-th sub-pixel row SPR1-2 may be arranged alternately. In addition, the (2-1)-th sub-pixel row SPR2-1 and the (2-2)-th sub-pixel row SPR2-2 may be arranged alternately. For example, as illustrated in
Through this arrangement method, one pixel P may include four sub-pixels, that is, a red sub-pixel R, a green sub-pixel G, a first blue sub-pixel B1, and a second blue sub-pixel B2, and these pixels P may be arranged in the display unit 120.
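The alternating-row layout described above can be sketched as follows: first sub-pixel rows (SPR1-1/SPR1-2, alternating R,G and G,R groups) and second sub-pixel rows (SPR2-1/SPR2-2, alternating B1,B2 and B2,B1 groups) interleave in the second direction. The function name and the row/group parameters are illustrative, not from the patent.

```python
# Sketch of the described sub-pixel layout: rows alternate between first
# sub-pixel rows (SPR1) and second sub-pixel rows (SPR2), and within each
# kind the left-to-right order of the pair alternates from one occurrence
# to the next ((1-1)/(1-2) and (2-1)/(2-2) rows).

def subpixel_grid(rows, groups_per_row):
    """Build a grid of sub-pixel labels following the described layout."""
    grid = []
    for r in range(rows):
        if r % 2 == 0:                       # first sub-pixel row (SPR1)
            pair = ("R", "G") if (r // 2) % 2 == 0 else ("G", "R")
        else:                                # second sub-pixel row (SPR2)
            pair = ("B1", "B2") if (r // 2) % 2 == 0 else ("B2", "B1")
        grid.append(list(pair) * groups_per_row)
    return grid
```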
Referring to
In other words, sub-pixels may be arranged in one of the ways illustrated in
The display device 100 and the method of driving the same according to the exemplary embodiments of the inventive concept may provide a wake-up or sleeping-inducing effect to a user depending on a time and a place. In addition, the display device 100 and the method of driving the same according to the exemplary embodiments of the inventive concept may provide a wake-up or sleeping-inducing effect to a user by making the central wavelengths of light emitted from two blue sub-pixels different from each other in the case where each of the pixels included in the display device 100 includes one red sub-pixel, one green sub-pixel, and two blue sub-pixels.
The exemplary embodiments of the inventive concept may be embodied in the form of computer program(s) executable through various components on a computer, and the computer program(s) may be recorded on a non-transitory computer-readable recording medium. In this case, examples of the non-transitory computer-readable recording medium include magnetic recording media such as hard disks, floppy disks, and magnetic tapes, optical recording media such as compact disc read-only memories (CD-ROMs) and digital video disks (DVDs), magneto-optical recording media such as floptical disks, and hardware devices such as ROMs, random access memories (RAMs), and flash memories that are configured to store and execute program commands. Furthermore, the non-transitory computer-readable recording medium may include an intangible medium embodied in a transmittable form on a network, and may be, for example, a medium embodied in the form of software or an application and transmittable and distributable via a network.
Examples of the computer programs include machine language codes that may be generated by a compiler, and high-level language codes that may be executed by a computer by using an interpreter.
The methods according to exemplary embodiments of the inventive concept are not necessarily limited to the described order of operations. For example, certain operations may be performed out of order.
While the inventive concept has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the inventive concept as defined by the following claims.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
10-2015-0127716 | Sep 2015 | KR | national
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
9406277 | Letourneur | Aug 2016 | B1 |
9867962 | Mok | Jan 2018 | B2 |
20050162360 | Ishihara | Jul 2005 | A1 |
20060152525 | Woog | Jul 2006 | A1 |
20070013717 | Kempf | Jan 2007 | A1 |
20070296741 | Park | Dec 2007 | A1 |
20080224968 | Kashiwabara | Sep 2008 | A1 |
20090009453 | Furihata | Jan 2009 | A1 |
20090281604 | De Boer | Nov 2009 | A1 |
20100123731 | Morimoto | May 2010 | A1 |
20100264850 | Yamamoto | Oct 2010 | A1 |
20110273494 | Jun | Nov 2011 | A1 |
20120105517 | Yang | May 2012 | A1 |
20120147065 | Byun | Jun 2012 | A1 |
20120256938 | So | Oct 2012 | A1 |
20130075579 | Chen | Mar 2013 | A1 |
20130328842 | Barnhoefer | Dec 2013 | A1 |
20140071030 | Lee | Mar 2014 | A1 |
20140098143 | Lee et al. | Apr 2014 | A1 |
20140118385 | Buckley | May 2014 | A1 |
20150009194 | Kim | Jan 2015 | A1 |
20150009242 | Park | Jan 2015 | A1 |
20150062892 | Krames | Mar 2015 | A1 |
20150070337 | Bell | Mar 2015 | A1 |
20150194101 | Yaras | Jul 2015 | A1 |
20150332628 | Ren | Nov 2015 | A1 |
20150348468 | Chen | Dec 2015 | A1 |
20160005362 | Chen | Jan 2016 | A1 |
20160121073 | Mok | May 2016 | A1 |
20160129218 | Mok | May 2016 | A1 |
20160189670 | Kim | Jun 2016 | A1 |
20160243379 | Hommes | Aug 2016 | A1 |
20160293139 | Kwon | Oct 2016 | A1 |
20170061842 | Cok | Mar 2017 | A1 |
20170193880 | Lee | Jul 2017 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
101984487 | Mar 2011 | CN
10-2011-0123531 | Nov 2011 | KR
10-1146992 | May 2012 | KR
10-2014-0035239 | Mar 2014 | KR
10-2014-0044568 | Apr 2014 | KR
Prior Publication Data

Number | Date | Country
---|---|---
20170069290 A1 | Mar 2017 | US