Pursuant to 35 U.S.C. §119(a), this application claims the benefit of earlier filing and right of priority to Korean Patent Application No. 10-2011-0041335 filed on May 2, 2011, the contents of which are hereby incorporated by reference herein in their entirety.
1. Field
Embodiments of the present invention may relate to a mobile terminal and a method of controlling the same, enabling the application of a backlight dimming algorithm according to an application.
2. Related Art
An active matrix type liquid crystal display displays an image by using a thin film transistor (hereinafter, referred to as a “TFT”) as a switching device. The liquid crystal display is advantageous over a cathode ray tube (CRT) in terms of miniaturization, thus rapidly replacing the CRT in the application fields of portable information devices, office machines, and computers as well as televisions.
The liquid crystal display is not a self-emissive device and thus requires a backlight unit for light emission to a liquid crystal display panel. The liquid crystal display is being utilized as a display device for a portable information device. To extend the useful life of a battery in the portable information device, the liquid crystal display needs to achieve a reduction in power consumption. Power consumed by the light sources of a backlight unit and by the inverter circuit for driving the light sources is almost half of the overall power consumed by the liquid crystal display. Therefore, it is effective to reduce power consumption in the backlight unit in order to reduce the power consumption of the liquid crystal display.
As for a method of reducing power consumption in the backlight unit, a backlight dimming method is most widely used. The backlight dimming method may be divided into a local backlight dimming method and a global backlight dimming method.
In the local backlight dimming method, a display area is divided into a plurality of blocks, and the backlight luminance of each block is individually controlled, while in the global backlight dimming method, the backlight luminance is collectively lowered for the entire display area. The local backlight dimming method allows the luminance of a display area to be locally controlled within a single frame period, thus contributing to improved static contrast and reduced power consumption; however, it requires a complex algorithm and hardware and is applicable only to a direct type light emitting diode (LED) backlight unit, which allows for easy luminance control of each block. Meanwhile, the global backlight dimming method has low algorithm and hardware complexity and is easily applicable to any type of backlight unit.
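The global dimming idea described above can be sketched in a few lines: the backlight level for a whole frame is derived from the frame's peak gray level, and pixel data is boosted to compensate for the dimmed backlight. This is a minimal illustrative sketch, not the algorithm claimed by this disclosure; the function name and 8-bit scale are assumptions.

```python
# Hypothetical sketch of global backlight dimming on one frame of gray levels.
def global_dimming(frame, max_level=255):
    """Return (backlight_ratio, compensated_frame) for one frame."""
    peak = max(max(row) for row in frame)   # brightest pixel in the frame
    if peak == 0:
        return 0.0, frame                   # black frame: backlight fully off
    ratio = peak / max_level                # lower the backlight to the peak level
    # boost pixel data so perceived brightness is preserved under the dim backlight
    compensated = [[min(max_level, round(p / ratio)) for p in row] for row in frame]
    return ratio, compensated

ratio, out = global_dimming([[64, 128], [32, 16]])
# the backlight is driven at roughly half power while the brightest
# pixel is restored to full scale by data compensation
```

Local dimming would repeat this per block of the display area, which is why it needs per-block luminance control in the backlight hardware.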
Accordingly, one object of the present invention is to address the above-noted and other drawbacks of the related art.
Another object of the present invention is to provide a mobile terminal and a method of controlling the same, enabling the application of a different backlight dimming algorithm depending on an application.
Also, another object of the present invention is to provide a mobile terminal and a method of controlling the same, enabling the selective application of a dimming algorithm in association with an input image during the execution of a specific application.
To accomplish the objects of the present invention, according to an aspect of the present invention, there is provided a mobile terminal including: a liquid crystal display panel; a backlight unit emitting backlight to the liquid crystal display panel; and a controller selecting a dimming algorithm from among a plurality of dimming algorithms according to an application, and controlling luminance of the backlight unit, wherein, when an input image having a luminance value falling outside a preset reference range is displayed on the liquid crystal display panel during execution of a specific application, the controller controls luminance of the backlight unit by applying a dimming algorithm which is different from the selected dimming algorithm according to the specific application.
According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method comprising: when a specific application is called for, selecting a dimming algorithm according to the specific application, and controlling luminance of a backlight unit emitting backlight to a liquid crystal display panel of the mobile terminal; and when an input image having a luminance value falling outside a preset reference range is displayed on the liquid crystal display panel during execution of the specific application, controlling luminance of the backlight unit by applying a dimming algorithm which is different from the dimming algorithm selected according to the specific application.
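The control flow summarized in the two aspects above can be sketched as follows: a dimming algorithm is selected per application, but when an input image's luminance falls outside a preset reference range during execution, a different algorithm is applied instead. All table entries, the range, and the fallback name are hypothetical placeholders, not values from this disclosure.

```python
# Hypothetical per-application dimming selection with an out-of-range override.
APP_DIMMING = {"video_player": "cdf_dimming", "e_book": "fixed_dimming"}
REFERENCE_RANGE = (40, 220)        # assumed preset luminance reference range (gray levels)
FALLBACK_ALGORITHM = "global_dimming"

def choose_dimming(app, avg_luminance):
    """Return the dimming algorithm name for this app and input image."""
    selected = APP_DIMMING.get(app, "default_dimming")
    low, high = REFERENCE_RANGE
    if not (low <= avg_luminance <= high):
        # out-of-range image: apply an algorithm different from the selected one
        return FALLBACK_ALGORITHM
    return selected
```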
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principle of the invention. In the drawings:
Objects, advantages, and features of the present invention will be clarified through the following embodiments described with reference to the accompanying drawings. Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. In the drawings, the same reference numerals will be used throughout to designate the same or like elements. Also, detailed descriptions with regard to well known functions and configurations, which may obscure the substance of the present invention, will be omitted.
Hereinafter, a mobile terminal related to the present invention will be described in more detail with reference to the accompanying drawings. The suffixes ‘module’, ‘unit’ and ‘part’ may be used for elements in order to facilitate the disclosure. Significant meanings or roles may not be given to the suffixes themselves and it is understood that the ‘module’, ‘unit’ and ‘part’ may be used together or interchangeably.
Embodiments of the present invention may be applicable to various types of terminals. Examples of such terminals may include mobile phones, smart phones, laptop computers, digital broadcast terminals, personal digital assistants, portable multimedia players (PMP) and/or navigators.
The mobile terminal 100 includes a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180 and a power supply 190. Other embodiments and arrangements may also be provided.
The components will now be described.
The wireless communication unit 110 may be configured with one or more modules enabling wireless communication between the mobile terminal 100 and a wireless communication system or between the mobile terminal 100 and a network where the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a position-location module 115.
The broadcast receiving module 111 may receive a broadcast signal and/or broadcast associated information from an external broadcast managing server via a broadcast channel.
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast managing server may refer to a server that generates a broadcast signal and/or broadcast associated information and transmits it or a server that receives a generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may be a TV broadcast signal, a radio broadcast signal, and/or a data broadcast signal. The broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
Examples of broadcast associated information may include information associated with a broadcast channel, a broadcast program, a broadcast service provider, etc. The broadcast associated information may be provided via a mobile communication network. In this case, the broadcast associated information may be received by the mobile communication module 112.
The broadcast associated information may be in various forms. For example, broadcast associated information may include an electronic program guide (EPG) of digital multimedia broadcasting (DMB) and an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).
The broadcast receiving module 111 may receive broadcast signals transmitted from various types of broadcast systems. In particular, the broadcast signals may be received by using a digital broadcasting system, such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), a data broadcasting system known as media forward link only (MediaFLO) or integrated services digital broadcast-terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable for the above digital broadcasting systems as well as other broadcast systems providing broadcast signals.
The broadcast signals and/or broadcast associated information received through the broadcast receiving module 111 may be stored in the memory 160.
The mobile communication module 112 may transmit and receive wireless signals to and from at least one of a base station, an external terminal, and a server. The signals may represent audio, video, multimedia, control signaling, data, and the like.
The wireless Internet module 113 refers to a module for wireless Internet access. This wireless Internet module 113 may be internally or externally coupled to the mobile terminal 100. Suitable technologies for wireless Internet may include, but are not limited to, WLAN (Wireless LAN)(Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), and/or HSDPA (High Speed Downlink Packet Access).
The short-range communication module 114 refers to a module for short-range communications. Suitable technologies for short-range communication may include, but are not limited to, Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra-wideband (UWB), ZigBee, and the like.
The position-location module 115 may identify or otherwise obtain a location of the mobile terminal 100. The position-location module 115 may obtain information regarding a location by using a Global Navigation Satellite System (GNSS). Here, the GNSS is a term used to explain wireless navigation satellite systems that revolve around the earth to send reference signals allowing predetermined types of wireless navigation receivers to determine their locations on or in the vicinity of the surface of the earth. Non-limiting examples of the GNSS may include the Global Positioning System (GPS) in the United States, Galileo in Europe, the Global Orbiting Navigational Satellite System (GLONASS) in Russia, COMPASS in China, the Quasi-Zenith Satellite System in Japan, etc.
For example, the position-location module 115 may be a GPS module, a representative example of the GNSS. The GPS module may precisely calculate current 3-dimensional position information based on longitude, latitude and altitude by calculating distance information and precise time information from at least three satellites and then by applying triangulation to the calculated information. Location and time information may be calculated using three satellites, and errors of the calculated position and time information may then be corrected using another satellite. The GPS module may calculate speed information by continuously calculating a real-time current location.
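The distance-based position fix described above can be illustrated in two dimensions: given three reference points and measured distances, subtracting the circle equations pairwise yields a linear system for (x, y). Real GPS solves the 3-D case with pseudoranges and a receiver clock error from four satellites; this sketch only shows the geometric idea, and all coordinates and names are hypothetical.

```python
# Illustrative 2-D position fix from three known points and three distances.
def trilaterate_2d(p1, p2, p3, d1, d2, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # linearize by subtracting circle 1 from circles 2 and 3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1          # solve the 2x2 system by Cramer's rule
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```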
Referring to
The image frames processed in the camera 121 may be stored in the memory 160 or externally sent through the wireless communication unit 110. According to a configuration aspect of the terminal, two or more cameras 121 may be provided.
The microphone 122 may receive an external audio signal while the mobile terminal is in a particular mode, such as a phone call mode, a recording mode and/or a voice recognition mode, and then process it into electrical audio data. In the phone call mode, the processed audio data may be converted into a form transmittable through the mobile communication module 112. The microphone 122 may be provided with various noise removing algorithms (or noise canceling algorithms) to remove noise generated in the course of receiving the external audio signal.
The user input unit 130 may generate input data responsive to user manipulation. Examples of the user input unit 130 may include a keypad, a dome switch, a touchpad (e.g., static pressure/capacitance), a jog wheel and/or a jog switch.
The sensing unit 140 may provide status measurements of various aspects of the mobile terminal 100. For example, the sensing unit 140 may detect current states of the mobile terminal 100 such as an open/close status (or state) of the mobile terminal 100, a position of the mobile terminal 100 or a component of the mobile terminal 100, a presence or absence of user contact with the mobile terminal 100, and/or an orientation or acceleration/deceleration of the mobile terminal 100, and then generate a sensing signal for controlling the operation of the mobile terminal 100. For example, the mobile terminal 100 may be configured as a slide-type mobile terminal. In such a configuration, the sensing unit 140 may sense whether a sliding portion of the mobile terminal 100 is open or closed. The sensing unit 140 may also sense presence or absence of power provided by the power supply 190, presence or absence of a coupling or other connection between the interface unit 170 and an external device, etc. The sensing unit 140 may include a proximity sensor 141.
The output unit 150 may generate output relevant to the senses of sight, hearing, touch and/or the like. The output unit 150 may include a display 151, an audio output module 152, an alarm 153, a haptic module 154 and/or the like.
The display 151 may display (output) information processed by the terminal 100. For example, in the case that the terminal is in a call mode, the display 151 may display a user interface (UI) or a graphic user interface (GUI) associated with the call. If the mobile terminal 100 is in a video communication mode or a photograph mode, the display 151 may display a photographed and/or received picture, a UI or a GUI.
The display 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a 3-dimensional display.
The display 151 may have a transparent or light-transmittive type configuration to enable an external environment to be seen through. This may be called a transparent display. A transparent LCD may be an example of a transparent display. A backside structure of the display 151 may also have the light-transmittive type configuration. In this configuration, a user may see an object located behind the terminal body through the area occupied by the display 151 of the terminal body.
At least two displays 151 may also be provided. For example, a plurality of displays may be provided on a single face of the terminal 100, either integrated into one body or spaced apart from each other. Alternatively, each of a plurality of displays may be provided on a different face of the terminal 100.
If the display 151 and a sensor for detecting a touch action (hereafter a touch sensor) are constructed in a mutual-layered structure (hereafter a touchscreen), the display 151 may be used as an input device as well as an output device. For example, the touch sensor may include a touch film, a touch sheet, a touchpad and/or the like.
The touch sensor may convert a pressure applied to a specific portion of the display 151 or a variation of electrostatic capacity generated from a specific portion of the display 151 to an electric input signal. The touch sensor may detect a pressure of a touch as well as a position and size of the touch.
If a touch input is provided to the touch sensor, signal(s) corresponding to the touch input may be transferred to a touch controller. The touch controller may process the signal(s) and then transfer corresponding data to the controller 180. The controller 180 may therefore know which portion of the display 151 is touched.
Referring to
Examples of the proximity sensor may include a transmittive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a radio frequency oscillation proximity sensor, an electrostatic capacity proximity sensor, a magnetic proximity sensor, an infrared proximity sensor and/or the like.
If the touchscreen is an electrostatic type, the proximity sensor may detect proximity of a pointer using a variation of an electric field according to the proximity of the pointer. In this case, the touchscreen (touch sensor) may be classified as a proximity sensor.
An action in which a pointer approaches the touchscreen without contacting the touchscreen may be called a proximity touch. An action in which a pointer actually touches the touchscreen may be called a contact touch. The location of the touchscreen proximity-touched by the pointer may be the position of the pointer that vertically opposes the touchscreen when the pointer performs the proximity touch.
The proximity sensor may detect a proximity touch and/or a proximity touch pattern (e.g., a proximity touch distance, a proximity touch duration, a proximity touch position, a proximity touch shift state, etc.). Information corresponding to the detected proximity touch action and/or the detected proximity touch pattern may be outputted to the touchscreen.
The audio output module 152 may output audio data that is received from the wireless communication unit 110 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode and/or the like. The audio output module 152 may output audio data stored in the memory 160. The audio output module 152 may output an audio signal relevant to a function (e.g., a call signal receiving sound, a message receiving sound, etc.) performed by the mobile terminal 100. The audio output module 152 may include a receiver, a speaker, a buzzer and/or the like. Also, the audio output module 152 may output audio data through an earphone jack 116. A user can listen to a sound by connecting an earphone to the earphone jack 116.
The alarm 153 may output a signal for announcing an event occurrence of the mobile terminal 100. An event occurring in the mobile terminal 100 may include one of a call signal reception, a message reception, a key signal input, a touch input and/or the like. The alarm 153 may output a signal for announcing an event occurrence by way of vibration or the like as well as a video signal or an audio signal. The video signal may be outputted via the display 151. The audio signal may be outputted via the audio output module 152.
The haptic module 154 may bring about various haptic effects that can be sensed by a user. Vibration is a representative example for the haptic effect brought about by the haptic module 154. Strength and pattern of the vibration generated from the haptic module 154 may be controllable. For example, vibrations differing from each other may be outputted in a manner of being synthesized together or may be sequentially outputted.
The haptic module 154 may generate various haptic effects including a vibration, an effect caused by such a stimulus as a pin array vertically moving against a contact skin surface, a jet power of air via outlet, a suction power of air via inlet, a skim on a skin surface, a contact of an electrode, an electrostatic power and the like, and/or an effect by hot/cold sense reproduction using an endothermic or exothermic device as well as the vibration.
The haptic module 154 may provide the haptic effect via direct contact. The haptic module 154 may enable a user to experience the haptic effect via muscular sense of a finger, an arm and/or the like. Two or more haptic modules 154 may be provided according to a configuration of the mobile terminal 100.
The memory 160 may store a program for operations of the controller 180. The memory 160 may temporarily store input/output data (e.g., phonebook, message, still picture, moving picture, etc.). The memory 160 may store data of vibration and sound in various patterns outputted in case of a touch input to the touchscreen.
The memory 160 may include at least one of a flash memory, a hard disk, a multimedia card micro type memory, a card type memory (e.g., SD memory, XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and/or the like. The mobile terminal 100 may operate in association with a web storage that performs the storage function of the memory 160 on the Internet.
The interface unit 170 may serve as a passage to external devices connected to the mobile terminal 100. The interface unit 170 may receive data from an external device. The interface unit 170 may be supplied with power, which may then be delivered to elements within the mobile terminal 100. The interface unit 170 may enable data to be transferred to an external device from an inside of the mobile terminal 100. The interface unit 170 may include a wire/wireless headset port, an external charger port, a wire/wireless data port, a memory card port, a port for coupling to a device having an identity module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port and/or the like.
The identity module may be a chip or card that stores various kinds of information for authenticating use of the mobile terminal 100. The identity module may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and/or the like. A device provided with the above identity module (hereafter an identity device) may be manufactured in the form of a smart card. The identity device may be connected to the mobile terminal 100 via the port.
The interface unit 170 may serve as a passage for supplying power to the mobile terminal 100 from a cradle connected to the mobile terminal 100. The interface unit 170 may also serve as a passage for delivering various command signals, inputted from the cradle by a user, to the mobile terminal 100. Various command signals inputted from the cradle, or the power, may work as a signal for recognizing that the mobile terminal 100 is correctly loaded in the cradle.
The controller 180 may control overall operations of the mobile terminal 100. For example, the controller 180 may perform control and processing relevant to a voice call, a data communication, a video conference and/or the like. The controller 180 may have a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or may be configured separate from the controller 180.
The controller 180 may perform pattern recognizing processing for recognizing a handwriting input performed on the touchscreen as a character and/or recognizing a picture drawing input performed on the touchscreen as an image.
The power supply 190 may receive an external or internal power and then supply the power required for operations of the respective elements under control of the controller 180.
Embodiments of the present invention explained in the following description may be implemented within a recording medium that can be read by a computer or a computer-like device using software, hardware or combination thereof.
According to the hardware implementation, arrangements and embodiments may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors and electrical units for performing other functions. In some cases, embodiments may be implemented by the controller 180.
For a software implementation, arrangements and embodiments described herein may be implemented with separate software modules, such as procedures and functions, each of which may perform one or more of the functions and operations described herein. Software codes may be implemented with a software application written in any suitable programming language and may be stored in memory such as the memory 160, and may be executed by a controller or processor, such as the controller 180.
As shown in
The body may include a case (casing, housing, cover, etc.) that forms an exterior of the terminal. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electric/electronic parts may be provided in a space between the front case 101 and the rear case 102. A middle case may be further provided between the front case 101 and the rear case 102.
The cases may be formed by injection molding of synthetic resin or may be formed of a metal such as stainless steel (STS), titanium (Ti) or the like, for example.
The display 151, the audio output unit 152, the camera 121, user input units 130/131/132, the microphone 122, the interface unit 170 and the like may be provided on the terminal body, and more particularly on the front case 101.
The display 151 may occupy most of a main face of the front case 101. The audio output module 152 and the camera 121 may be provided at an area adjacent to one end portion of the display 151, while the user input unit 131 and the microphone 122 may be provided at another area adjacent to the other end portion of the display 151. The user input unit 132 and the interface unit 170 may be provided on lateral sides of the front and rear cases 101 and 102.
The user input unit 130 may receive a command for controlling an operation of the mobile terminal 100. The user input unit 130 may include a plurality of manipulating units 131 and 132. The manipulating units 131 and 132 may be called a manipulating portion and may adopt any mechanism of a tactile manner that enables a user to perform a manipulation action by experiencing a tactile feeling.
Content inputted by the first manipulating unit 131 or the second manipulating unit 132 may be diversely set. For example, a command such as start, end, scroll and/or the like may be inputted to the first manipulating unit 131. A command for a volume adjustment of sound outputted from the audio output unit 152, a command for a switching to a touch recognizing mode of the display 151 or the like may be inputted to the second manipulating unit 132.
As shown in
For example, the camera 121 may have a lower number of pixels to capture and transmit a picture of user's face for a video call, while the camera 121′ may have a greater number of pixels for capturing a general subject for photography without transmitting the captured subject. Each of the cameras 121 and 121′ may be installed on the terminal body to be rotated and/or popped up.
A flash 123 and a mirror 124 may be additionally provided adjacent to the camera 121′. The flash 123 may project light toward a subject in case of photographing the subject using the camera 121′. If a user attempts to take a picture of the user (self-photography) using the camera 121′, the mirror 124 may enable the user to view a user's face reflected by the mirror 124.
An additional audio output unit 152′ may be provided on the backside of the terminal body. The additional audio output unit 152′ may implement a stereo function together with the audio output unit 152 shown in
A broadcast signal receiving antenna 124 may be additionally provided at the lateral side of the terminal body as well as an antenna for communication or the like. The antenna 124 may be considered a portion of the broadcast receiving module 111 shown in
The power supply 190 for supplying a power to the mobile terminal 100 may be provided to the terminal body. The power supply 190 may be built within the terminal body. Alternatively, the power supply 190 may be detachably connected to the terminal body.
The touchpad 135 may be activated by interconnecting with the display 151 of the front case 101. The touchpad 135 may be provided behind the display 151, in parallel with it. The touchpad 135 may have a size equal to or smaller than that of the display 151.
Referring to
The liquid crystal display panel 10 includes a liquid crystal layer between two glass substrates. A lower glass substrate of the liquid crystal display panel 10 includes a plurality of data lines 14, a plurality of gate lines 15 intersecting the data lines 14, thin film transistors (TFTs) formed at the crossings of the data lines 14 and the gate lines 15, pixel electrodes respectively connected to the TFTs, and a storage capacitor Cst. Due to the intersecting arrangements of the data lines 14 and the gate lines 15, pixels are arranged on the liquid crystal display panel 10 in a matrix configuration.
An upper glass substrate of the liquid crystal display panel 10 includes a black matrix, a color filter, and a common electrode. In a vertical field driving mode such as a Twisted Nematic (TN) mode and a Vertical Alignment (VA) mode, the common electrode is formed on the upper glass substrate; however, in a horizontal field driving mode such as an In-Plane Switching (IPS) mode and a Fringe Field Switching (FFS) mode, it is formed on the lower glass substrate. A polarizer is attached to each of the upper and lower glass substrates of the liquid crystal display panel 10, and an alignment layer for setting the pre-tilt angle of liquid crystals is formed on each inner surface coming into contact with the liquid crystals.
The timing controller 11 receives digital video data RGB of an input image and timing signals Vsync, Hsync, DE, and DCLK from an external system board. The timing signals include a vertical synchronizing signal Vsync, a horizontal synchronizing signal Hsync, a data enable signal DE, a dot clock signal DCLK, and/or the like. The timing controller 11 generates timing control signals DDC and GDC for controlling the operating timing of the source driver 12 and the gate driver 13 on the basis of the timing signals Vsync, Hsync, DE, and DCLK from the external system board. The system board or the timing controller 11 inserts an interpolation frame between frames of an input image signal inputted at a frame frequency of 60 Hz and multiplies the frequencies of the gate timing control signal GDC and the source timing control signal DDC so that the operations of the source driver 12 and the gate driver 13 can be controlled at a frame frequency of 60×N Hz, where N is a positive integer not less than 2.
The timing controller 11 supplies digital video data RGB of an input image, inputted from the external system board, to the dimming controller 30, and delivers digital video data R′G′B′, modulated by the dimming controller 30, to the source driver 12.
The source driver 12 latches the digital video data R′G′B′ under control of the timing controller 11. The source driver 12 converts the digital video data R′G′B′ into a positive/negative analog data voltage by using a positive/negative gamma compensation voltage and supplies it to the data lines 14.
The gate driver 13 sequentially outputs gate pulses (or scan pulses) to select pixels to which data is to be written. The gate pulses are sequentially supplied to the gate lines 15 in synchronization with the data voltage being supplied to the data lines 14.
The backlight unit 20 is disposed under the liquid crystal display panel 10 and emits backlight to the liquid crystal display panel 10. The backlight unit 20 includes a plurality of light sources, and may be implemented as a direct type backlight unit or an edge type backlight unit. The light sources of the backlight unit 20 may include one or more types of light sources among Hot Cathode Fluorescent Lamps (HCFL), Cold Cathode Fluorescent Lamps (CCFL), External Electrode Fluorescent Lamps (EEFL), and Light Emitting Diodes (LED).
The light source driver 21 controls the luminance of the light sources of the backlight unit 20 by using Pulse Width Modulation (PWM). The PWM signal controls the ratio between the turn-on and turn-off periods of the light sources, and its duty ratio (%) is varied according to a dimming value (DIM) inputted from the dimming controller 30.
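The mapping from dimming value to duty ratio can be sketched as follows; this is a minimal illustration, and the 8-bit resolution of the dimming value (0 to 255) is an assumption not stated in the text.

```python
def pwm_duty_ratio(dim, dim_max=255):
    """Map a dimming value DIM (0..dim_max) to a PWM duty ratio in percent.

    The duty ratio is the fraction of each PWM period during which the
    light sources are turned on; lowering DIM lowers the backlight
    luminance. dim_max=255 assumes an 8-bit dimming value (hypothetical).
    """
    if not 0 <= dim <= dim_max:
        raise ValueError("dimming value out of range")
    return 100.0 * dim / dim_max
```

For example, a dimming value of 128 yields a duty ratio of roughly 50%, halving the average backlight luminance.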
The dimming controller 30 receives digital video data (RGB) of an input image, namely, pixel data, calculates the histogram of the input image, and calculates a CDF or a CCDF on the basis of equation 1 or 2. The dimming controller 30 calculates an area of the input image on the basis of the CDF or CCDF and compares the area value with an area threshold value preset in a lookup table, thus determining a final MLD value satisfying an image-quality index of a predetermined level or higher. The dimming controller 30 generates a dimming value (DIM) for controlling the backlight luminance to the final MLD value or less. Furthermore, the dimming controller 30 calibrates the pixel data of the input image using a preset calibration algorithm so as to calibrate the transmittance of the liquid crystal display panel 10 according to the reduction in the backlight luminance, thereby generating modulated data R′G′B′, and supplies the data R′G′B′ to the timing controller 11. The dimming controller 30 may be configured within the timing controller 11.
Meanwhile, the dimming controller 30 may be configured separately from the controller 180 or incorporated into the controller 180.
Referring to
The MLD determination part 31 receives pixel data RGB of an input image. The MLD determination part 31 calculates a histogram of the input image, calculates an area while increasing MLD_i on the basis of the CDF or CCDF of the histogram, and compares the area value with an area threshold value preset in a lookup table, thus determining a final MLD value satisfying an image-quality index of a predetermined level or higher.
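The MLD search described above can be sketched as follows. Equations 1 and 2 and the exact area metric are not reproduced in this text, so the CCDF-based area used here (the fraction of pixels that would be clipped at a candidate MLD_i) is an assumption, as is the single scalar threshold standing in for the lookup table.

```python
def determine_final_mld(pixels, area_threshold=0.01, levels=256):
    """Sketch of the MLD determination: build a histogram of the input
    image, then raise the candidate clipping point MLD_i until the 'area'
    (here: the CCDF at MLD_i, i.e. the fraction of pixels that would be
    clipped) falls at or below a preset area threshold."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    for mld in range(levels):
        # CCDF(mld): fraction of pixels with a gray level above mld
        clipped_fraction = sum(hist[mld + 1:]) / total
        if clipped_fraction <= area_threshold:
            return mld
    return levels - 1
```

Returning the smallest MLD_i that satisfies the threshold maximizes the power saving while keeping the amount of clipped image content below the preset limit.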
The backlight luminance controller 33 controls the light source driver 21 and lowers the backlight luminance by K/MIL(=255), where K denotes a clipped point of the final MLD value. The image calibration part 32 increases the pixel data RGB of the input image by MIL(=255)/K so that the reduction in the backlight luminance is compensated for by the increase in the transmittance of the liquid crystal display panel 10.
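The compensation step above can be sketched as follows; the 8-bit gray scale and round-then-clip behavior are assumptions for illustration.

```python
MAX_GRAY = 255  # written as 'MIL(=255)' in the text

def compensate(pixels, k):
    """Sketch of the backlight/data compensation: the backlight luminance
    is scaled down by k/255 while each pixel value is scaled up by 255/k,
    so the lower backlight is offset by higher panel transmittance.
    Pixel values above the clipping point k saturate at 255."""
    backlight_scale = k / MAX_GRAY
    gain = MAX_GRAY / k
    out = [min(MAX_GRAY, round(p * gain)) for p in pixels]
    return backlight_scale, out
```

For example, with K=128 the backlight is driven at about half luminance, a mid-gray pixel of 120 is boosted to 239, and pixels at or above the clipping point saturate at 255.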
The method of controlling a mobile terminal according to an example embodiment of the present invention can be implemented in the mobile terminal 100 described above with reference to
The mobile terminal 100 may include both the controller 180 and the dimming controller 30 or only the controller 180, and the controller 180 may perform all the functions performed by the dimming controller 30, described above with reference to
The controller 180 may call for a specific application stored in the memory 160 in operation S100.
The memory 160 may store various applications including the specific application.
Furthermore, the memory 160 may store a database defining a matching relationship between the application and a dimming algorithm. For example, the specific application may be matched with a specific dimming algorithm among a plurality of dimming algorithms stored in the database.
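The matching database can be sketched as a simple lookup; the application names and algorithm identifiers below are hypothetical, chosen only to illustrate the matching relationship, and the quality-preserving default is an assumption.

```python
# Hypothetical matching database: application names and algorithm
# identifiers are illustrative, not taken from the specification.
DIMMING_DB = {
    "message": "second",       # image quality relatively less important
    "gallery": "first",        # image quality relatively important
    "still_viewer": "fourth",  # discontinuous images, minimum saving ensured
}

def algorithm_for(app, db=DIMMING_DB, default="first"):
    """Look up the dimming algorithm matched with an application;
    fall back to a default when the application has no entry."""
    return db.get(app, default)
```

When an application is called for, one lookup yields the dimming algorithm to apply for its execution.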
A first dimming algorithm shown in
A second dimming algorithm shown in
A third dimming algorithm shown in
A fourth dimming algorithm shown in
That is, the fourth dimming algorithm makes conversion from the first dimming algorithm to the second or third dimming algorithm on the basis of the preset maximum luminance.
The first to fourth dimming algorithms described above are applicable to Content Adapted Brightness Control (CABC). CABC refers to a method of controlling luminance according to the content (input image).
According to technical aspects of the present invention, the first to fourth dimming algorithms may be used differently depending on an application. That is, when an application is changed, a dimming algorithm being used is also changed.
The database stored in the memory 160 may define a matching relationship between various algorithms and various applications.
The matching relationship between dimming algorithms and applications may be determined in due consideration of preset criteria including the importance of image quality.
For example, the first dimming algorithm does not cause any deterioration in image quality; however, it fails to reduce power consumption when full white (Gray 255) is present in any pixel data of a frame image.
Also, for example, the second dimming algorithm achieves at least the minimum power-saving effect in every image. However, the second dimming algorithm has a limitation in that deterioration in image quality occurs as the power-saving effect is boosted.
Thus, the first dimming algorithm may be applied to an application where image quality is considered relatively important, while the second dimming algorithm may be applied to an application where image quality is considered relatively less important.
The third dimming algorithm may also be applied to an application where image quality is considered relatively less important.
In general, a yellowish shift in skin color and color degradation occur when calibration is performed in an RGB domain. Thus, the third dimming algorithm applies the CABC image calibration only to Y, which is the brightness component in a YCbCr domain. When an image is calibrated using Y, only the brightness is changed and the chrominance components are left unchanged. Therefore, as compared with the second dimming algorithm, where calibration is performed in an RGB domain, the third dimming algorithm does not bring about skin-color degradation or color degradation.
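The Y-only calibration can be sketched as follows; the BT.601 full-range conversion coefficients are an assumption, since the specification does not state which RGB-to-YCbCr conversion is used.

```python
def calibrate_y_only(r, g, b, gain):
    """Sketch of the third algorithm's calibration: convert an RGB pixel
    to YCbCr (BT.601 full-range coefficients assumed), scale only the
    luma Y by the compensation gain, leave Cb/Cr untouched, and convert
    back. Because chrominance is unchanged, hue is preserved."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    y = min(255.0, y * gain)  # brighten luma only, with clipping
    r2 = y + 1.402 * cr
    g2 = y - 0.344136 * cb - 0.714136 * cr
    b2 = y + 1.772 * cb
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r2), clamp(g2), clamp(b2)
```

A neutral gray pixel stays neutral after calibration, which is exactly the property that prevents the yellowish shift in skin tones.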
Meanwhile, according to the fourth dimming algorithm, conversion from the first dimming algorithm to the second or third dimming algorithm occurs on the basis of the preset maximum luminance, which can thus ensure the minimum power-saving effect. Here, the preset maximum luminance may be arbitrarily set in due consideration of a desired minimum power-saving effect.
However, the fourth dimming algorithm may be applicable only to discontinuous images, such as those of a still-image viewer, because if the algorithm conversion occurs in a video in which images change continuously, a viewer may perceive the change on the screen.
The database defining the matching relationship between applications and dimming algorithms may be determined in due consideration of predetermined criteria such as importance of image quality, power-saving degree, and/or the like.
The controller 180 may choose a dimming algorithm corresponding to the specific application in operation S110.
For example, referring to
Considering that the message application does not necessarily require high image quality, the second or third dimming algorithm may be set in advance to be applied thereto as shown in
When the message application depicted in
In calling for and executing the specific application, the controller 180, in operation S120, may control the luminance of the backlight unit 20 according to the dimming algorithm chosen in operation S110.
For example, in
In operation S130, the controller 180, while executing the specific application, may determine whether or not a luminance value of an input image falls within a reference luminance range corresponding to the dimming algorithm chosen in operation S110.
When it is determined in operation S130 that the luminance value of the input image falls outside the corresponding reference luminance range, the controller 180 may control the luminance of the backlight unit 20 by applying a different dimming algorithm corresponding to the luminance value of the input image in operation S140.
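Operations S130 and S140 can be sketched as follows. The choice of the first (quality-preserving) algorithm as the fallback is an assumption based on the picture-attachment example that follows; the reference luminance range is represented here by a simple (low, high) pair.

```python
def select_runtime_algorithm(app_algorithm, image_luminance, lum_range):
    """Sketch of operations S130/S140: while an application runs under its
    matched dimming algorithm, check whether the input image's luminance
    value stays inside that algorithm's reference luminance range; if it
    falls outside, switch to a different algorithm for that image."""
    lo, hi = lum_range
    if lo <= image_luminance <= hi:
        return app_algorithm  # stay with the matched algorithm
    return "first"            # quality-preserving fallback (assumed)
```

This keeps the per-application matching as the default while still letting an individual input image (e.g., an attached picture) override it.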
For example, a user may want to attach a picture to a message that the user is writing. Herein, since techniques for attaching a picture to a message are known, a detailed description thereof will be omitted.
In general, when a picture is attached to a message, the controller 180 may display the attached picture 40 on the display 151 in entirety or in the form of a preview image as shown in
In this case, when a luminance value of the attached picture 40 goes out of a reference luminance range corresponding to the second or third dimming algorithm, the controller 180 may control the luminance of the backlight unit 20 by applying the first dimming algorithm thereto.
The application of the first dimming algorithm prevents degradation in image quality with respect to the attached picture 40. However, if the second or third dimming algorithm rather than the first dimming algorithm is continuously used, the possibility of degradation in image quality of the attached picture 40 is significantly increased.
Here, it is assumed that a user has deleted the attached picture 40 because the user does not like the attached picture 40.
Since the attached picture 40 has been deleted, the controller 180 stops the application of the first dimming algorithm and resumes the application of the second or third dimming algorithm which was initially matched with the message application.
The above example embodiment of the present invention has used the message application by way of example, but it should be noted that technical aspects of the invention are not limited to the message application.
The matching relationship between the dimming algorithms and applications, exemplified in
Of late, mobile terminals and electronic devices such as smart phones and tablet PCs have used 3G or 4G communication networks to support functions for freely downloading and installing a variety of applications.
When a user downloads a specific application and installs it in the mobile terminal 100, the controller 180 may acquire information regarding image-quality characteristics of the specific application at the time of initial downloading thereof (or at the time of the first execution after downloading), select a dimming algorithm conforming to the acquired information regarding image-quality characteristics from among various dimming algorithms (e.g., the first to fourth dimming algorithms), match the selected dimming algorithm with the specific application, and store it in the database.
The controller 180 may acquire the information regarding image-quality characteristics by using information provided by the specific application (e.g., header information or attribute information).
Also, the controller 180 may acquire the information regarding image-quality characteristics by directly scanning an image frame of a specific screen included in the specific application in units of pixels.
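The pixel-level scan can be sketched as follows. Which statistics the controller 180 actually derives is not specified, so the two shown here (mean gray level and presence of full white) are assumptions, chosen to match the first dimming algorithm's full-white caveat described earlier.

```python
def scan_image_characteristics(pixels):
    """Sketch of install-time scanning: derive simple image-quality
    characteristics from one frame of the application's screen, scanned
    in units of pixels (gray levels 0..255 assumed)."""
    mean_gray = sum(pixels) / len(pixels)
    has_full_white = any(p == 255 for p in pixels)
    return {"mean_gray": mean_gray, "has_full_white": has_full_white}
```

A screen containing full white (Gray 255), for instance, would defeat the first algorithm's power saving, so such a characteristic could steer the matching toward another algorithm.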
Furthermore, the controller 180 may provide a user interface for the acquisition of the information regarding image-quality characteristics at the time of downloading or installing the specific application, and acquire the information regarding image-quality characteristics of the specific application through the user interface.
For example, referring to
A user may select a desired dimming algorithm by using the user's knowledge or information provided to the user interface 45. The selected dimming algorithm may be changed or reconfigured by the user.
Furthermore, when the fourth dimming algorithm is applied for a specific application or screen, the controller 180 may provide a user with a user interface allowing the user to set the minimum power saving strength. Here, when the user increases the minimum power saving strength, the reference point of the maximum luminance is lowered, which in turn increases the degradation in image quality proportionally.
Also, as described above, the fourth dimming algorithm may be applied to an application where no change in an input image occurs on a time axis, such as a still image viewer, but this is not always the case.
For example, in the case of high-speed search such as flicking, the fourth dimming algorithm is applicable even to an application where changes in an input image occur on the time axis. This is possible because the high-speed search such as flicking renders the input image discontinuous on the time axis, so that a change in the image caused by algorithm conversion is hard to recognize.
Meanwhile, a communication module may be additionally configured in the liquid crystal display (corresponding to the display 151) described with reference to
For example, referring to
The mobile terminal and the method of controlling the mobile terminal according to the present invention have the following effects.
According to the present invention, a different backlight dimming algorithm is applied according to an application, thus allowing for backlight dimming reflecting characteristics of the application.
Also, according to the present invention, a dimming algorithm is selectively applied in association with an input image during execution of a specific application, thereby allowing for backlight dimming reflecting characteristics of an input image even within the specific application.
The above-described method of controlling a mobile terminal according to the present invention can be written as a computer program and implemented in digital computers that execute the program using a computer readable recording medium.
The method of controlling a mobile terminal according to embodiments of the present invention can be executed through software. The software can include code segments that perform required tasks. Programs or code segments can also be stored in a processor readable medium and transmitted.
The computer readable recording medium includes all types of recording devices storing data readable by computer systems. Examples of the computer readable recording medium include ROM, RAM, CD-ROM, DVD±ROM, DVD-RAM, magnetic tapes, floppy disks, hard disks, and optical data storage devices. The computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.
Number | Date | Country | Kind |
---|---|---|---
10-2011-0041335 | May 2011 | KR | national |