This application claims priority from and the benefit of Korean Patent Application No. 10-2014-0160949, filed on Nov. 18, 2014, which is hereby incorporated by reference for all purposes as if fully set forth herein.
1. Field
Exemplary embodiments of the present invention relate to a touch display device, and in particular, relate to a low-latency touch display device having reduced display lag between a touch sensor and an image display unit, and a driving method thereof.
2. Discussion of the Background
A data flow of a touch display device including a touch screen panel (TSP) will be generally described.
A user's touch input is received through a touch sensor, and the touch input is processed by an application processor (AP). An operating system and an application are operated by the application processor. Processed touch information is combined with a still image or a motion picture and is then output to an image display unit.
During this process, at least 100 ms passes before the user's touch input is visibly output. In general, a user can perceive a time lag of about 30 ms or more. As a result, when the above-noted data processing is performed, the user perceives the touch reaction as delayed.
Therefore, it is important to minimize the display lag from the touch input to a touch information output.
To minimize the display lag, a technique has been developed that immediately visualizes a user's touch input and outputs it with a predetermined default value. The technique reduces the display lag because the touch input is visualized immediately, without first undergoing processing by the application processor.
The technique generally reduces the display lag by overlaying the image on the touch input.
However, the lag-reducing technique cannot be used in all cases. Depending on the type of touch information, processing by the application processor may still be necessary, so the display lag is not reduced in every case.
The above information disclosed in this Background section is only for enhancement of understanding of the background of the inventive concept, and, therefore, it may contain information that does not form the prior art that is already known in this country to a person of ordinary skill in the art.
Exemplary embodiments of the present invention disclose a touch display device for immediately applying a user's touch input to a display frame.
Additional features of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention.
An exemplary embodiment of the present invention discloses a touch display device, including: an image display unit including a plurality of pixels; a touch sensor for sensing a user's touch input; a first memory for storing data per display frame; and a controller for, when a pixel group including at least one pixel is selected according to a position of a touch input sensed by the touch sensor and data that correspond to the pixel group are not read from the first memory, updating the data that correspond to the pixel group based on the data caused by the touch input.
An exemplary embodiment of the present invention also discloses a method for driving a touch display device, including: receiving a user's touch input; selecting a pixel group including at least one pixel according to a position of the touch input; when data that correspond to the pixel group are not read from a first memory, updating the data that correspond to the pixel group in a present display frame; when data that correspond to the pixel group are read from the first memory, updating the data that correspond to the pixel group in a subsequent display frame; and reading the data of the present display frame and outputting the same as an image by an image display unit.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
The accompanying drawings, which are included to provide a further understanding of the inventive concept, and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the inventive concept, and, together with the description, serve to explain principles of the inventive concept.
In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of various exemplary embodiments. It is apparent, however, that various exemplary embodiments may be practiced without these specific details or with one or more equivalent arrangements. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring various exemplary embodiments.
In the accompanying figures, the size and relative sizes of layers, films, panels, regions, etc., may be exaggerated for clarity and descriptive purposes. Also, like reference numerals denote like elements.
When an element or layer is referred to as being “on,” “connected to,” or “coupled to” another element or layer, it may be directly on, connected to, or coupled to the other element or layer or intervening elements or layers may be present. When, however, an element or layer is referred to as being “directly on,” “directly connected to,” or “directly coupled to” another element or layer, there are no intervening elements or layers present. For the purposes of this disclosure, “at least one of X, Y, and Z” and “at least one selected from the group consisting of X, Y, and Z” may be construed as X only, Y only, Z only, or any combination of two or more of X, Y, and Z, such as, for instance, XYZ, XYY, YZ, and ZZ. Like numbers refer to like elements throughout. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Although the terms first, second, etc. may be used herein to describe various elements, components, regions, layers, and/or sections, these elements, components, regions, layers, and/or sections should not be limited by these terms. These terms are used to distinguish one element, component, region, layer, and/or section from another element, component, region, layer, and/or section. Thus, a first element, component, region, layer, and/or section discussed below could be termed a second element, component, region, layer, and/or section without departing from the teachings of the present disclosure.
Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper,” and the like, may be used herein for descriptive purposes, and, thereby, to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the drawings. Spatially relative terms are intended to encompass different orientations of an apparatus in use, operation, and/or manufacture in addition to the orientation depicted in the drawings. For example, if the apparatus in the drawings is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. Furthermore, the apparatus may be otherwise oriented (e.g., rotated 90 degrees or at other orientations), and, as such, the spatially relative descriptors used herein interpreted accordingly.
The terminology used herein is for the purpose of describing particular embodiments and is not intended to be limiting. As used herein, the singular forms, “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Moreover, the terms “comprises,” “comprising,” “includes,” and/or “including,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, components, and/or groups thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Referring to
The touch sensor 100 performs touch sensing through a sensor. The sensor may be classified, according to its sensing method, as a resistive type, a capacitive type, an electro-magnetic (EM) type, or an optical type, but is not limited thereto.
The image display unit 200 may be any type of display device for outputting still images or motion pictures recognizable by viewers, such as a plasma display, a liquid crystal display (LCD), a light emitting diode (LED) display, or an organic light emitting diode (OLED) display.
The image display unit 200 includes a plurality of pixels PX. The pixels PX are arranged in a matrix form.
The touch sensor 100 and the image display unit 200 may be integrally formed.
The first memory 400 includes any type of memory device. The memory device may be a volatile memory device and/or a non-volatile memory device. A volatile memory device loses its stored data when power is no longer supplied, and may be, for example, a static RAM (SRAM), a dynamic RAM (DRAM), or a synchronous DRAM (SDRAM).
In the present specification, a DDR-SDRAM available for simultaneous reading and writing will be described as a reference, but other configurations are contemplated. For example, in the case of a memory that is inappropriate for performing simultaneous reading and writing, exemplary embodiments of the present invention are realizable by using an additional buffer (i.e., a second memory).
A memory may serve a wide range of roles in the device; the first memory 400 according to an exemplary embodiment of the present invention may be used as a frame buffer. Hence, the first memory 400 may store a display frame to be output to the image display unit 200.
The second memory may be a line buffer that stores the data of the display frame of the first memory for one or more lines at a time. The image display unit 200 then reads data not from the first memory 400 but from the second memory, and outputs the data as an image.
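The frame-buffer/line-buffer arrangement described above can be sketched as follows. This is a minimal illustration, assuming a small pixel grid and a one-line buffer; all names (`frame_buffer`, `line_buffer`, `output_frame`) are hypothetical rather than taken from the specification.

```python
# Hypothetical sketch of the first memory (frame buffer) feeding a second
# memory (line buffer) from which the image display unit reads.
WIDTH, HEIGHT = 8, 8  # assumed pixel grid

# first memory: frame buffer holding one gray value per pixel
frame_buffer = [[0] * WIDTH for _ in range(HEIGHT)]

# second memory: line buffer holding the line currently being output
line_buffer = [0] * WIDTH

def output_frame(frame_buffer, emit_line):
    """Copy the frame buffer into the line buffer one row at a time;
    the image display unit reads each row from the line buffer."""
    for row in range(HEIGHT):
        line_buffer[:] = frame_buffer[row]   # write one line into the second memory
        emit_line(row, list(line_buffer))    # display reads from the line buffer
```

Reading from the line buffer rather than the frame buffer is what permits the frame buffer to be written while output is in progress, even with a memory that cannot read and write simultaneously.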
The respective data configuring the display frame may correspond to the respective pixels of the image display unit. The data may indicate gray values of the pixels. The correspondence relationship may depend on the types of display device and the associated rendering. For ease of description, a case where the data matches the pixels respectively will be referred to. In this case, positions of the data may be determined by positions of the pixels.
A controller 300 may update the data stored in the first memory according to a condition. The condition represents whether the data corresponding to a pixel group have been read from the first memory 400; equivalently, it may indicate whether the data corresponding to the first pixel have been read from the first memory 400. As described above, the second memory rather than the first memory may be used as the reference.
The updated data may represent data that correspond to the pixel group. For example, when the user generates a touch through a writing trace, the data is updated by overlaying the writing trace data caused by a touch input on the existing image data of the pixel group.
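The overlay update described above can be sketched as follows, assuming the writing trace is drawn as a single gray value that replaces the existing image data of the pixel group. The trace color and the replace-style blending rule are assumptions, not taken from the specification.

```python
# Illustrative sketch of overlaying writing-trace data on the existing
# image data of a selected pixel group in the frame buffer.
TRACE_GRAY = 0  # assumed pen color: black

def overlay_trace(frame_buffer, pixel_group):
    """Update the frame-buffer data for each (row, col) pixel in the
    selected pixel group so the writing trace overlays the image data."""
    for (row, col) in pixel_group:
        frame_buffer[row][col] = TRACE_GRAY  # trace data replaces image data
    return frame_buffer
```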
The controller 300 may include a driver IC, a touch IC, a timing controller, a gate driver, a source driver, an application processor, and a memory for processing an image. This is merely exemplary and the invention is not limited thereto.
Respective quadrangles divided by horizontal and vertical dotted lines of
The ranges sensed when a touch induced by the user's finger is detected on the touch sensor 100 are referred to as touch inputs 110 and 111. Referring to
In the exemplary embodiment of
In this instance, in the case of the touch input 110, a pixel group configured with four pixels PX1, PX2, PX3, and PX4 is selected.
A memory reading position 121 signifies a position where data of a display frame are read from the first memory 400.
A memory reading start position 120 signifies the position of the first data read in the display frame.
In the exemplary embodiment of
In the case of the touch input 110, the pixels PX1 and PX2 that are read first with reference to the first direction may be referred to as first pixels.
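The distinction between pixels whose data have and have not been read can be sketched as follows, assuming the data is read row by row from the top of the frame in the first direction. Function and variable names are illustrative, not from the specification.

```python
# Sketch of the read-position check: have the first pixels of a pixel
# group (the topmost row, read earliest) already passed the memory
# reading position?
def first_pixel_already_read(pixel_group, reading_row):
    """pixel_group: set of (row, col) pixels; reading_row: row of the
    current memory reading position. Returns True when the first pixels
    lie at or above the reading position, i.e. their data is read."""
    first_row = min(row for row, _col in pixel_group)
    return first_row <= reading_row
```

Under this sketch, a pixel group whose first pixels lie below the current reading position has not yet been read, so its data can still be updated in the present display frame.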
Referring to
A touch frame and a display frame will be described later with reference to the exemplary embodiment of
A case in which the touch input 110 is performed will now be described. When the touch frame in which the touch input 110 is generated is finished, a touch frame ready pulse (Ready) may occur.
When the touch frame ready pulse (Ready) is generated, the memory reading position 121 is a third row as shown in
When the data that correspond to the selected pixel group are updated before being read, the updated data are read, and the user's touch is thus immediately applied in the present display frame.
As described above, the present exemplary embodiment will be described with reference to the DDR-SDRAM that is simultaneously readable and writable. However, in the case of a memory which is not simultaneously readable and writable, exemplary embodiments of the present invention may be realizable by using, for example, an additional buffer (i.e., a second memory).
In comparative embodiments, data of a display frame that has started being read is not updated, so a touch input received during the present display frame is applied in the next display frame. In exemplary embodiments of the present invention, however, a display device driven at 60 Hz may reduce the display lag to less than 16.67 ms (the length of one frame).
Referring to
When the touch frame ready pulse (Ready) occurs, the memory reading position 121 is the third row as shown in
The data that corresponds to the selected pixel group is updated in the subsequent display frame.
In a like manner of the exemplary embodiment of
The touch frame is driven independently of the display frame. For convenience of description, the period of the touch frame is described as being shorter than that of the display frame in exemplary embodiments of the present invention. For example, the touch frame may be driven at 90 to 100 Hz while the display frame is driven at 60 Hz. Therefore, at least one touch frame occurs within each display frame.
The touch frame signifies a unit of time to scan a region of the touch sensor 100. Hence, a first touch frame, a second touch frame, and a third touch frame appear for convenience of explanation as if they are separated in a spatial manner in
The first touch frame, the second touch frame, and the third touch frame in
The pixel groups respectively correspond to the touch frames. In the example shown in
Portions of the writing trace where the user's touch input is not directly sensed are interpolated.
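Such interpolation might be realized, for example, by linear interpolation between two consecutive sensed touch points; the linear rule and the fixed sampling step here are assumptions, not taken from the specification.

```python
# Hypothetical linear interpolation between two sensed touch points, so
# the rendered writing trace has no visible gaps between touch samples.
def interpolate_points(p0, p1, steps):
    """Return the intermediate points between touch samples p0 and p1,
    splitting the segment into `steps` equal parts."""
    (x0, y0), (x1, y1) = p0, p1
    return [
        (x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
        for i in range(1, steps)
    ]
```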
In the exemplary embodiment of
The first pixels of the pixel group of the first touch frame are pixels 112 and 112′, the first pixel of the pixel group of the second touch frame is pixel 113, and the first pixels of the pixel group of the third touch frame are pixels 114 and 114′.
When the data corresponding to the above-noted first pixels is not yet read from the first memory 400, the data corresponding to the pixel group is updated in the present display frame.
When the data corresponding to the above-noted first pixels is already read from the first memory 400, the data corresponding to the pixel group is updated in the next display frame.
Referring to
When a touch frame ready pulse (Ready 1) of the first touch frame is generated, the memory reading position 121 is the second row as shown in
Therefore, the data that corresponds to the pixel group of the first touch frame is updated in the present display frame.
When a touch frame ready pulse (Ready 2) of the second touch frame is generated, a memory reading position 122 is the fourth row as shown in
Therefore, the data that correspond to the pixel group of the second touch frame are updated in the present display frame.
When a touch frame ready pulse (Ready 3) of the third touch frame is generated, a memory reading position 123 is the eighth row as shown in
Therefore, the data that corresponds to the pixel group of the third touch frame is updated in the subsequent display frame.
A user's touch input is sensed by the touch sensor 100 (S610).
The rates of the display frame and the touch frame may vary according to the display refresh rate and the touch report rate of the touch sensor 100. The touch frame may be driven in synchronization with a horizontal synchronization period Hsync or a vertical synchronization period Vsync.
It is determined whether the data of the pixel is read from a memory (S620). As described in the above-noted exemplary embodiments, when the corresponding touch frame is ready, the first pixel of the pixel group may be located after the current memory reading position with reference to the first direction in which the data is read. That is, the first pixel may not yet have been read from the memory.
In this case, it may be determined that the output of the image display unit 200 can be immediately updated when the pixel group of the corresponding touch frame is updated in the present display frame. Therefore, the modified data is applied to the present display frame (S630).
However, when the first pixel is already read from the memory and the pixel group of the corresponding touch frame is updated in the present display frame, it may be determined that the output of the image display unit 200 is not immediately updated. Hence, the modified data is updated not in the present display frame but in the next display frame (S640).
The present display frame that is updated in S630 or that is not updated in S640 is output by the image display unit (S650).
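The driving method of steps S610 through S650 can be sketched as follows, assuming row-major reading of the frame buffer and simple per-pixel gray values. All data structures and names are illustrative, not taken from the specification.

```python
# Hedged sketch of one pass of the driving method S610-S650: touch
# updates are applied to the present display frame when their first
# pixel has not yet been read, and deferred to the next frame otherwise.
def drive_frame(frame_buffer, next_frame_updates, touch_events, reading_row):
    """frame_buffer: 2-D list of gray values; next_frame_updates: list of
    ((row, col), value) deferred from the previous frame; touch_events:
    list of (pixel_group, value) sensed this frame (S610); reading_row:
    current memory reading position. Returns the frame and new deferrals."""
    deferred = []
    # first apply updates deferred from the previous display frame
    for (row, col), value in next_frame_updates:
        frame_buffer[row][col] = value
    for pixel_group, value in touch_events:
        first_row = min(r for r, _c in pixel_group)  # S620: read-position check
        if first_row > reading_row:                  # not yet read
            for (row, col) in pixel_group:           # S630: update present frame
                frame_buffer[row][col] = value
        else:                                        # S640: defer to next frame
            deferred.extend((rc, value) for rc in pixel_group)
    return frame_buffer, deferred                    # S650: frame is output
```

Calling `drive_frame` once per display frame, and feeding each call's `deferred` list into the next call's `next_frame_updates`, mimics the present-frame/next-frame selection described above.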
According to exemplary embodiments, controllers, such as timing controllers, gate drivers, and/or data drivers, and the like, may be implemented via one or more general purpose and/or special purpose components, such as one or more discrete circuits, digital signal processing chips, integrated circuits, application specific integrated circuits, microprocessors, processors, programmable arrays, field programmable arrays, instruction set processors, and/or the like.
According to exemplary embodiments, the processes described herein to facilitate image signal processing and the display of images via image display unit 200 may be implemented via software, hardware (e.g., general processor, Digital Signal Processing (DSP) chip, an Application Specific Integrated Circuit (ASIC), Field Programmable Gate Arrays (FPGAs), etc.), firmware, or a combination thereof. In this manner, the display device including image display unit 200 may include or otherwise be associated with one or more memories including code (e.g., instructions) configured to cause the image display unit 200 to perform one or more of the processes and/or features described herein.
The memories described herein may be any non-transitory medium that participates in providing code/instructions to the one or more software, hardware, and/or firmware for execution. Such memories may take many forms, including but not limited to non-volatile media and volatile media. Non-volatile media include, for example, optical or magnetic disks. Volatile media include dynamic memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, a CD-RW, a DVD, any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
The accompanying drawings and the exemplary embodiments of the present invention are only examples of the present invention, and are used to describe the present invention but do not limit the scope of the present invention as defined by the following claims. Thus, it will be understood by those of ordinary skill in the art that various modifications and equivalent embodiments may be made. Therefore, the technical scope of the present invention may be defined by the technical idea of the following claims.
Although certain exemplary embodiments and implementations have been described herein, other embodiments and modifications will be apparent from this description. Accordingly, the inventive concept is not limited to such embodiments, but rather to the broader scope of the presented claims and various obvious modifications and equivalent arrangements.
Number | Date | Country | Kind |
---|---|---|---
10-2014-0160949 | Nov 2014 | KR | national |