The present disclosure relates to a method for providing feedback in response to a user input on a touch panel, and a terminal implementing the same.
Currently, a terminal, such as a smart phone or a tablet PC, is generally equipped with a touch panel. The terminal displays objects on a screen, and when a user touches a certain object among the displayed objects, the terminal detects the touch through the touch panel and may perform a corresponding function in response to the touch. For example, when a certain text is touched in a webpage, another webpage that is linked to the text may be displayed on the screen.
Accordingly, there is a need for an improved method for providing a visual feedback in response to a hovering of a touch input means over an object, and a terminal implementing the same.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
Currently, in the related art, a visual effect for an object before a touch is not provided. Therefore, when objects are packed densely, it may be difficult to select the object that the user needs. That is, a function may be executed when an unintended object is touched by the user. In addition, the user may not know which function is related to an object. In this case, the corresponding function may not be executed until the user recognizes which function is related to the object. Meanwhile, another object may be displayed over the object. For example, an electronic document may be displayed over a webpage. That is, a certain object of the webpage may not be displayed due to the electronic document. Thus, that object may not be executed until the display of the electronic document is terminated.
Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a method for providing a visual feedback in response to a hovering of a touch input means (e.g., a finger or a pen) over an object, and a terminal implementing the same.
In accordance with an aspect of the present disclosure, a method for providing a feedback is provided. The method includes detecting a hovering of a touch input means with respect to an object displayed on a screen, and providing a visual feedback corresponding to a distance between the object and the touch input means in response to the hovering.
In accordance with another aspect of the present disclosure, a method for providing a feedback is provided. The method includes displaying an object on a screen, detecting a hovering of a touch input means with respect to the object, and providing a visual feedback related to a function corresponding to the object in response to the hovering.
In accordance with another aspect of the present disclosure, a method for providing a feedback is provided. The method includes displaying an image on an object, detecting a hovering of a touch input means with respect to the object, displaying the object on the image in response to the hovering, detecting a touch gesture of the touch input means corresponding to the object in a state in which the object is displayed on the image, and performing a function of the object in response to the touch gesture.
In accordance with another aspect of the present disclosure, a terminal is provided. The terminal includes a display unit configured to display an object on a screen, a touch panel installed on the screen of the display unit, and a controller configured to control the display unit and the touch panel, wherein the controller detects a hovering of a touch input means corresponding to the object from the touch panel, and controls the display unit to provide a visual feedback corresponding to a distance between the object and the touch input means in response to the hovering.
In accordance with another aspect of the present disclosure, a terminal is provided. The terminal includes a display unit configured to display an object on a screen, a touch panel installed on the screen of the display unit, and a controller configured to control the display unit and the touch panel, wherein the controller detects a hovering of a touch input means corresponding to the object from the touch panel, and provides a visual feedback related to a function corresponding to the object in response to the hovering.
In accordance with another aspect of the present disclosure, a terminal is provided. The terminal includes a display unit configured to display an image on an object, a touch panel installed on a screen of the display unit, and a controller configured to control the display unit and the touch panel, wherein the controller detects a hovering of a touch input means corresponding to the object from the touch panel, and controls the object displayed on the image in response to the hovering.
Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
A terminal according to the present disclosure has a touch panel. In addition, the terminal according to the present disclosure may be a portable terminal, a terminal that is installed in a vehicle, or a computer for business or home use. Specifically, the terminal according to the present disclosure may be a smart phone, a tablet PC, a notebook PC, a digital camera, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a media player (for example, an MP3 player), a navigation terminal, a desktop PC, and the like. In addition, the terminal according to the present disclosure may be a home appliance (e.g., a refrigerator, a TV, or a washing machine) having a display unit, a touch panel installed on the display unit, and a controller controlling the same.
When the touch input means (e.g., a pen point, a fingertip, a capacitive touch pen, an electromagnetic touch pen, and the like) approaches the screen, for example, within 10 cm, the terminal may detect a proximity of the touch input means. For example, when a distance (e.g., a depth) between the two is within 5 cm, the terminal may detect a movement and a direction of the touch input means. For example, when the distance between the two is less than 3 cm, the terminal can detect a position of the touch input means. Likewise, the state in which the touch input means approaches the screen closely enough for the terminal to detect its position may be referred to as hovering. Here, 3 cm is merely an example value, and the detection of hovering is not defined by this value. For example, the value may vary according to the performance of the terminal.
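A minimal sketch of the distance-dependent detection stages described above follows. It is not part of the original disclosure; the threshold constants, state names, and the detection_state function are hypothetical illustrations of how a terminal might classify the approach of the touch input means.

```python
# Hypothetical sketch of distance-dependent detection zones.
# The thresholds mirror the example distances above and would in practice
# depend on the touch panel hardware (assumed values).

PROXIMITY_CM = 10.0   # approach of the touch input means can be detected
DIRECTION_CM = 5.0    # movement and direction can be detected
HOVER_CM = 3.0        # position (hovering coordinate) can be detected


def detection_state(distance_cm: float) -> str:
    """Classify the distance between the touch input means and the screen."""
    if distance_cm <= 0.0:
        return "touch"
    if distance_cm < HOVER_CM:
        return "hovering"      # position (x, y) is available
    if distance_cm < DIRECTION_CM:
        return "direction"     # movement and direction only
    if distance_cm < PROXIMITY_CM:
        return "proximity"     # approach only
    return "out_of_range"


if __name__ == "__main__":
    for d in (12.0, 7.0, 4.0, 1.5, 0.0):
        print(d, detection_state(d))
```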
The terminal according to the present disclosure may provide a visual feedback in response to the hovering. In addition to the visual feedback, an auditory feedback (e.g., a voice) and a tactile feedback (e.g., a vibration of the terminal) may be provided by the terminal. The visual feedback may be referred to as visual information, a visual content, a visual effect, a visual hint, and the like. The auditory feedback may be referred to as auditory information, an auditory content, an auditory effect, an auditory hint, and the like. The tactile feedback may be referred to as tactile information, a tactile content, a tactile effect, haptics, and the like.
Referring to
The display unit 110 may display data on a screen under the control of the controller 160. That is, when the controller 160 processes (e.g., decodes) the data and stores the data in a buffer, the display unit 110 converts the data stored in the buffer into an analog signal and displays the data on the screen. When power is supplied to the display unit 110, the display unit 110 may display a lock image on the screen. When unlock information is detected in a state in which the lock image is displayed, the controller 160 may release the lock. The display unit 110 may display an indicator instead of the lock image on an upper part of the screen, and may display a home image on the bottom of the screen under the control of the controller 160. The indicator is an image that informs the user of a use state of the portable terminal 100, for example, a 3G connection status, a battery level, a current time, and a Wi-Fi connection status. When the user touches the indicator and drags the indicator down in this state, the controller 160 may extend an area of the indicator in response to the drag. Additional information such as a received message, the weather, stock information, and a temperature may be displayed on the extended area. The home image may include a background image (e.g., a photo set by the user) and a plurality of icons displayed on the background image. Here, each icon may indicate an application or a content (e.g., a photo file, a video file, a recording file, a document, a message, and the like). When one of the icons, for example, an application icon, is selected by the user, the controller 160 may execute the corresponding application. The display unit 110 may receive an execution image of the application from the controller 160, convert it into an analog signal, and output it.
The display unit 110 may display images on the screen in a multi-layer structure under the control of the controller 160. For example, the display unit 110 may display a photo over the indicator and the home image.
The display unit 110 may be configured with a Liquid Crystal Display (LCD), an Active Matrix Organic Light Emitting Diode (AMOLED) display, a flexible display, or a transparent display.
A touch panel 111 is installed on a screen of the display unit 110. In detail, the touch panel 111 may be implemented as an add-on type positioned on the screen of the display unit 110, or as an on-cell type or an in-cell type that is inserted within the display unit 110.
The touch panel 111 may generate an event in response to a user gesture on the screen, and deliver the event to the controller 160 after Analog to Digital (A/D) conversion. Here, the event may be an access event, a hovering event, or a touch event.
When the touch input means approaches the screen, the touch panel 111 may generate the access event in response to the approach of the touch input means. The access event may include information indicating a movement and a direction of the touch input means.
When the touch input means is hovering in proximity to the screen, the touch panel 111 may generate the hovering event in response to the hovering. The hovering event may include one or more hovering coordinates (x, y). For example, a touch Integrated Circuit (IC) of the touch panel 111 may detect the hovering, determine a hovering area on the screen in response to the hovering, and deliver a coordinate (x, y) that is included in the hovering area to the controller 160. Here, the hovering coordinate may be expressed in pixel units. For example, when a resolution of the screen is 640 (the number of horizontal pixels)*480 (the number of vertical pixels), an X-axis coordinate lies in the range (0, 640), and a Y-axis coordinate lies in the range (0, 480). When the hovering coordinate is received from the touch panel 111, the controller 160 may determine that the touch input means is hovering in proximity to the touch panel 111, and when the hovering coordinate is no longer received from the touch panel 111, the controller 160 may determine that the hovering of the touch input means has been released. In addition, the hovering event may include sensing information for calculating a depth value. For example, the hovering coordinate may be (x, y, z), where the z value refers to the depth value.
When the touch input means touches the screen, the touch panel 111 may generate the touch event in response to the touch. Here, the touch event may include one or more touch coordinates (x, y). For example, the touch Integrated Circuit (IC) of the touch panel 111 may detect a user touch, determine a touch area in response to the touch, and deliver a touch coordinate (x, y) included in the touch area to the controller 160. Here, the touch coordinate may be expressed in pixel units. When the touch coordinate is received from the touch panel 111, the controller 160 may determine that the touch input means is touching the touch panel 111. When the touch coordinate is no longer received from the touch panel 111, the controller 160 may determine that the touch of the touch input means has been released. In addition, when the touch coordinate changes and its variation exceeds a preset movement threshold, the controller 160 may determine that the touch input means has moved. The controller 160 may calculate a position variation (dx, dy) of the touch input means and a movement speed of the touch input means in response to the movement of the touch input means. The controller 160 may determine a user gesture related to the screen as any one of a touch, a multi touch, a tap, a double tap, a long tap, a tap & touch, a drag, a flick, a press, a pinch in, or a pinch out, based on the touch coordinate, whether the touch has been released, whether the touch input means has moved, the position variation of the touch input means, and the movement speed of the touch input means.
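As an illustration of how a controller might distinguish gestures such as a tap, a drag, and a flick from the touch coordinate, its variation, and the movement speed described above, a minimal sketch follows. The TouchSample structure, threshold values, and gesture names are assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of gesture classification from touch-down and
# touch-release samples, using a movement threshold and a speed criterion.
from dataclasses import dataclass

MOVE_THRESHOLD_PX = 10        # preset movement threshold (assumed value)
FLICK_SPEED_PX_PER_S = 1000   # speed above which a movement counts as a flick
LONG_TAP_S = 0.5              # duration above which a stationary touch is a long tap


@dataclass
class TouchSample:
    x: float
    y: float
    t: float  # time in seconds


def classify_gesture(down: TouchSample, up: TouchSample) -> str:
    """Classify a single-finger gesture from its first and last touch samples."""
    dx, dy = up.x - down.x, up.y - down.y
    distance = (dx * dx + dy * dy) ** 0.5
    duration = max(up.t - down.t, 1e-6)
    if distance < MOVE_THRESHOLD_PX:
        return "long tap" if duration > LONG_TAP_S else "tap"
    speed = distance / duration
    return "flick" if speed > FLICK_SPEED_PX_PER_S else "drag"


if __name__ == "__main__":
    print(classify_gesture(TouchSample(100, 100, 0.0), TouchSample(102, 101, 0.1)))  # tap
    print(classify_gesture(TouchSample(100, 100, 0.0), TouchSample(400, 100, 0.2)))  # flick
```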
The touch panel 111 may be a complex touch panel configured with a finger touch panel detecting a hand gesture and a pen touch panel detecting a pen gesture. Here, the finger touch panel may be implemented as a capacitive type. Of course, the finger touch panel may also be implemented as a resistive type, an infrared type, or an ultrasonic type. In addition, the finger touch panel may generate the event not only in response to a hand gesture but also in response to another physical object (e.g., a conductive object that may cause a change in capacitance). The pen touch panel may be configured as an electromagnetic induction type. Accordingly, the pen touch panel may generate the event in response to a specially designed touch pen that forms a magnetic field. The pen touch panel may also generate a key event. For example, when a key installed in the pen is pressed, the magnetic field generated from a coil of the pen may change. The pen touch panel may generate the key event in response to the change of the magnetic field, and may deliver the key event to the controller 160.
The key input unit 120 may include at least one touch key. The touch key may be implemented as a capacitive type or a resistive type to detect a user touch. The touch key may generate the event in response to the user touch, and may deliver the event to the controller 160. In addition, the touch key may be installed adjacent to the screen (e.g., at the bottom of the screen). Furthermore, the key input unit 120 may include another type of key in addition to the touch type. For example, the key input unit 120 may include a home key of a dome key type. When the user presses the home key, the home key is deformed so as to contact a printed circuit board, and thus, a key event is generated from the printed circuit board and delivered to the controller 160.
The storing unit (secondary memory unit) 130 may be a disc, a RAM, a ROM, a flash memory, or the like. The storing unit 130 may store data generated by the portable terminal 100, or received from an external device (e.g., a server, a desktop PC, or a tablet PC) through the wireless communication unit 140, under the control of the controller 160. In addition, the storing unit 130 may temporarily store data copied by the user from a message, a photo, a webpage, or a document for copy & paste. Furthermore, the storing unit 130 may store various setting values (e.g., a brightness of the screen, whether to vibrate when a touch occurs, whether to rotate the screen automatically, and the like) for an operation of the portable terminal 100.
The storing unit 130 may store a booting program, one or more operating systems, and applications. The operating system serves as an interface between hardware and applications, and between applications, and may manage computer resources such as a Central Processing Unit (CPU), a Graphic Processing Unit (GPU), a main memory, and the storing unit 130. The applications may be categorized into embedded applications and 3rd party applications. For example, the embedded applications may include a Web browser, an email program, an instant messenger, and the like.
The wireless communication unit 140 may perform a voice call, a video call, or data communication with an external device through a network under the control of the controller 160. The wireless communication unit 140 may include a radio frequency transmission unit that performs up conversion and amplification of a frequency of a transmitted signal, and a radio frequency reception unit that performs low noise amplification and down conversion of a frequency of a received signal. In addition, the wireless communication unit 140 may include a mobile communication module (e.g., a 3rd-Generation mobile communication module, a 3.5-Generation mobile communication module, a 4th-Generation mobile communication module, and the like), a Digital Broadcasting Module (e.g., a DMB module), and a local communication module (e.g., a Wi-Fi module, a Bluetooth module, or a Near Field Communication module).
The audio processing unit 150 may perform input and output of an audio signal (e.g., voice data) for voice recognition, voice recording, digital recording, and a phone call, in combination with a speaker (SPK) and a microphone (MIC). The audio processing unit 150 may receive the audio signal from the controller 160, and may output the received audio signal to the SPK after performing Digital to Analog (D/A) conversion and amplification. The SPK may convert the signal received from the audio processing unit 150 into a sound wave and output the sound wave. Meanwhile, the portable terminal 100 may be equipped with a plurality of speakers. For example, a first speaker is used for calls and may be referred to as a receiver. That is, the first speaker is used when the user makes a call while holding the portable terminal 100 near an ear. A second speaker is used for playback of data such as music and video as well as for calls, and may be referred to as a loud speaker. The MIC converts a sound wave delivered from a person or another sound source into an audio signal. The audio processing unit 150 may deliver the audio signal received from the MIC to the controller 160 after performing Analog to Digital (A/D) conversion.
The controller 160 may control an overall operation of the portable terminal 100 and a signal flow between the internal components of the portable terminal 100, perform data processing functions, and control the supply of power from a battery to the components.
The controller 160 may include one or more Central Processing Units (CPUs). As is well known, the CPU is an essential control unit of a computer system that performs calculation and comparison of data, and interpretation and execution of instructions. The CPU may include various registers that temporarily store data and instructions. The controller 160 may include one or more Graphic Processing Units (GPUs). The GPU is a graphic control unit that performs calculation and comparison of graphic-related data, and interpretation and execution of instructions, on behalf of the CPU. The CPU and the GPU may each be integrated into one package in which two or more independent cores (e.g., a quad-core) form a single integrated circuit. That is, the CPUs may be integrated as one multi-core processor. In addition, a plurality of GPUs may also be integrated as one multi-core processor. Further, the CPU and the GPU may be integrated as one chip (System on Chip: SoC). Furthermore, the CPU and the GPU may be packaged as a multi-layer. Meanwhile, an Application Processor (AP) may include the CPU and the GPU. Furthermore, the AP may further include an ISP.
The controller 160 may include a main memory unit, for example, a RAM. The main memory may store various programs, for example, a booting program, an operating system, and applications loaded from the storing unit 130. When power of the battery is supplied to the controller 160, the booting program is first loaded to the main memory unit of the controller 160. The booting program may load the operating system to the main memory unit. The operating system may load the applications to the main memory. The controller 160 (e.g., the AP) may decode a program instruction in response to the touch input means approaching an object associated with the program, and execute a function (e.g., providing a feedback in response to a hovering) according to the decoding result. In addition, the controller 160 may include a cache memory which temporarily stores data to be written to the storing unit 130 and data read from the storing unit 130.
A pen 170 is an accessory of the portable terminal 100 that is detachable from the portable terminal 100, and may include a pen point placed at the end of a penholder, a coil that is placed inside the penholder adjacent to the pen point and generates a magnetic field, and a button to change the magnetic field. The coil of the pen 170 may form the magnetic field around the pen point. The touch panel 111 may detect the magnetic field, and may generate the event corresponding to the magnetic field.
Meanwhile, the portable terminal 100 may further include components which were not described above, such as an ear jack, a vibration motor, a camera, an acceleration sensor, a proximity sensor, an illuminance sensor, a GPS receiving unit, and an accessory. Here, the accessory is an accessory of the portable terminal 100 which is detachable from the portable terminal 100, and may be, for example, the pen 170.
Referring to
Referring to
A hand of a user may approach in proximity to the screen. In response to this approach, the touch panel 111 may generate an access event and deliver the access event to the controller 160. Accordingly, the controller 160 may recognize that something has approached in proximity to the screen.
Referring to
Referring to
Referring to
Referring to
When the user releases a touch on the first item 311, the controller 160 may control the display unit 110 to display a detailed content of the first item 311 in response to the touch release. As the index finger 320 gets farther away from the first item 311, the background color of the first item 311 becomes lighter. When the hovering is released from the first item 311, the background color of the first item 311 becomes identical to that of the other items.
Meanwhile, the “luminosity” described above is just one of the properties of the corresponding object, and the “luminosity” does not limit the technological concept of the present disclosure. That is, besides the luminosity, a color, a brightness, a size (e.g., a size of a letter included in the object), and a shape (e.g., a shape of a letter included in the object) may be changed.
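The distance-dependent visual feedback described above can be illustrated with a short sketch that maps a hover depth to a background luminosity; the same mapping could equally drive a color, a size, or a shape. The hover range and luminosity bounds are assumed values, not taken from the disclosure.

```python
# Hypothetical sketch of distance-dependent visual feedback: the closer the
# touch input means hovers over an object, the darker the object is rendered.

HOVER_RANGE_CM = 3.0  # depth at which hovering starts to be detected (assumed)


def feedback_luminosity(depth_cm: float,
                        min_luminosity: float = 0.4,
                        max_luminosity: float = 1.0) -> float:
    """Map the hover depth to a background luminosity in [min, max].

    depth_cm == HOVER_RANGE_CM -> lightest (max_luminosity)
    depth_cm == 0 (touch)      -> darkest (min_luminosity)
    """
    clamped = max(0.0, min(depth_cm, HOVER_RANGE_CM))
    ratio = clamped / HOVER_RANGE_CM
    return min_luminosity + (max_luminosity - min_luminosity) * ratio


if __name__ == "__main__":
    for depth in (3.0, 2.0, 1.0, 0.0):
        print(f"depth={depth} cm -> luminosity={feedback_luminosity(depth):.2f}")
```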
In addition, an “item” in the above is one example of the object, and it does not limit the technological concept of the present disclosure. That is, the object is displayed on the screen and provides information to the user, and may be a message, a text, an image, an icon, a thumbnail, a photo, a tool bar, a check box, a widget, a webpage, a page of an electronic book, a key or a menu, and the like, in addition to the item.
In addition, an effect in the above example may include an auditory effect or a tactile effect besides the visual effect. For example, when the hovering is detected, the controller 160 may control the audio processing unit 150 so that a first sound effect is output. In addition, when a touch is detected, the controller 160 may control the audio processing unit 150 so that a second sound effect is output. Further, when the touch is detected, the controller 160 may drive a vibration motor.
In addition, the touch input means in the above is exemplified as a user's index finger; however, another physical object, for example, a pen (e.g., a capacitive touch pen or an electromagnetic touch pen), may also be the touch input means.
Further, the visual effect may differ depending on whether the detected hovering is a single hovering (e.g., hovering over an object with only one finger) or a multi hovering (e.g., hovering over one object with two or more fingers, or hovering over different objects with two or more fingers, respectively). For example, when the same object is hovered over, the multi hovering case may be displayed more boldly than the single hovering case. In addition, when the hovering technology is able to detect a finger type, an object hovered over with a thumb and an object hovered over with an index finger may be displayed differently.
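A minimal sketch of how the visual effect might vary with the number of hovering pointers follows; the pointer-count rule and the rendering parameters are hypothetical and serve only to illustrate a bolder effect for multi hovering than for single hovering.

```python
# Hypothetical sketch: rendering parameters for an object depending on how
# many pointers are hovering over it (single vs. multi hovering).

def hover_effect(pointer_count: int) -> dict:
    """Return illustrative rendering parameters for a hovered object."""
    if pointer_count <= 0:
        return {"highlight": False, "border_weight": 0}
    if pointer_count == 1:                              # single hovering
        return {"highlight": True, "border_weight": 1}
    return {"highlight": True, "border_weight": 2}      # multi hovering: bolder


if __name__ == "__main__":
    for n in (0, 1, 2):
        print(n, hover_effect(n))
```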
Here, the interactive object refers to an object which is able to perform an interaction with a user. For example, when an application icon is selected, the controller 160 may execute the corresponding application in response to the selection. Therefore, the application icon corresponds to an interactive object. In addition, when a photo is selected, the controller 160 may control the display unit 110 to display the tag information tagged to the photo, for example, a photograph date, a photograph place, other photograph information, a contact number, and a person's name, in response to the photo selection. Therefore, the photo to which the tag information is tagged may correspond to an interactive object. Besides these, various interactive objects may include a keypad, an indicator, a webpage, and the like.
Referring to
Referring to
Referring to
Referring to
When the user touches the boundary line 530 with the index finger 540 and moves it to the left or right, the controller 160 adjusts the widths of the two areas in response to the movement.
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
The user may touch the second key 712 with the index finger 720. When the touch is maintained for a certain time (e.g., 1 sec), the controller 160 may control the wireless communication unit 140 to attempt a phone call to Sammy's phone number.
Referring to
After the second pen writing 820 is deleted, the user's pen may approach in proximity to a repeat button. The controller 160 may provide a visual hint related to the repeat button. For example, the display unit 110 may re-display the second pen writing 820 lighter than the first pen writing 810 under the control of the controller 160. When the user taps the repeat button with the pen, the controller 160 may re-display the second pen writing 820 with the same luminosity as the first pen writing 810 on the screen, in response to the tap.
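As an illustration of the hover hint on the repeat button described above, the following sketch previews an undone pen writing at reduced opacity while the pen hovers, and restores it at full opacity on a tap. The class, method names, and opacity values are assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of a repeat (redo) button that previews the stroke to
# be restored while a pen hovers over it, and restores it on a tap.

class RedoPreview:
    def __init__(self) -> None:
        self.undone_strokes: list[str] = []                 # strokes removed by an undo
        self.visible_strokes: list[tuple[str, float]] = []  # (stroke, opacity)

    def on_undo(self, stroke: str) -> None:
        """Record a stroke removed by an undo so it can be restored later."""
        self.undone_strokes.append(stroke)

    def on_hover_repeat_button(self) -> None:
        """Show the most recently undone stroke at reduced opacity as a hint."""
        if self.undone_strokes:
            self.visible_strokes.append((self.undone_strokes[-1], 0.3))

    def on_hover_released(self) -> None:
        """Remove the low-opacity preview when the pen moves away."""
        self.visible_strokes = [s for s in self.visible_strokes if s[1] >= 1.0]

    def on_tap_repeat_button(self) -> None:
        """Restore the stroke at full opacity."""
        self.on_hover_released()
        if self.undone_strokes:
            self.visible_strokes.append((self.undone_strokes.pop(), 1.0))


if __name__ == "__main__":
    r = RedoPreview()
    r.on_undo("second pen writing")
    r.on_hover_repeat_button()   # faint preview of the stroke
    r.on_tap_repeat_button()     # stroke restored at full opacity
    print(r.visible_strokes)     # [('second pen writing', 1.0)]
```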
Meanwhile, in the above, the hint may include an auditory hint, or a tactile hint besides the visual hint.
The ‘hidden interactive object’ refers to an object that is not visible to a user because it is obscured by an image layered over it. When the display of the image layered over the object is terminated, the ‘hidden interactive object’ may be displayed. Here, the layered image may be, for example, a web page, a photo, a message, a menu, a text, or the like.
Referring to
The controller 160 may determine whether a hovering of an index finger 1010 corresponding to the hidden interactive object is detected at operation 920.
When the hovering of the index finger 1030 corresponding to the hidden interactive object is detected, the controller 160 may control the display unit 110 to display the hidden interactive object on the image at operation 930. Referring to
After the hidden interactive object is called (i.e., displayed), the controller 160 may determine whether a touch gesture for the called object is detected at operation 940.
When the touch gesture corresponding to the called object is not detected, the controller 160 may determine whether the hovering corresponding to the called object is released at operation 950.
When the hovering corresponding to the called object is released, the process may return to operation 910. That is, the controller 160 may terminate the display of the called object. Otherwise, that is, when the controller 160 determines that the hovering corresponding to the called object is not released, the process may return to operation 940.
At operation 940, when the touch gesture related to the called object is detected, the controller 160 may perform a function related to the called object in response to the touch gesture at operation 960.
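The flow of operations 910 to 960 might be summarized by the following sketch, in which hovering calls the hidden interactive object, releasing the hovering hides it again, and a touch gesture on the called object performs its function. The class and method names are hypothetical illustrations, not the disclosed implementation.

```python
# Hypothetical sketch of the hidden-object flow (operations 910-960).

class HiddenObjectController:
    def __init__(self, perform_function) -> None:
        self.object_called = False          # is the hidden object currently shown?
        self.perform_function = perform_function

    def on_hover(self, over_hidden_object: bool) -> None:
        # Operations 920-930: display the hidden object while it is hovered over.
        if over_hidden_object:
            self.object_called = True

    def on_hover_released(self) -> None:
        # Operation 950 back to 910: terminate the display of the called object.
        self.object_called = False

    def on_touch_gesture(self) -> None:
        # Operations 940 and 960: perform the function of the called object.
        if self.object_called:
            self.perform_function()


if __name__ == "__main__":
    ctrl = HiddenObjectController(lambda: print("function of called object executed"))
    ctrl.on_hover(over_hidden_object=True)
    ctrl.on_touch_gesture()   # prints the message
    ctrl.on_hover_released()
    ctrl.on_touch_gesture()   # does nothing
```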
Referring to
The foregoing method of the present disclosure may be implemented in an executable program command form by various computer means and be recorded in a computer readable recording medium. In this case, the computer readable recording medium may include a program command, a data file, and a data structure individually or a combination thereof. In the meantime, the program command recorded in the recording medium may be specially designed or configured for the present disclosure, or may be known to and usable by a person having ordinary skill in the computer software field. The computer readable recording medium includes Magnetic Media such as a hard disk, a floppy disk, or a magnetic tape, Optical Media such as a Compact Disc Read Only Memory (CD-ROM) or a Digital Versatile Disc (DVD), Magneto-Optical Media such as a floptical disk, and hardware devices such as a ROM, a RAM, and a flash memory that store and execute program commands. Further, the program command includes a machine language code created by a compiler and a high-level language code executable by a computer using an interpreter. The foregoing hardware device may be configured to operate as at least one software module to perform an operation of the present disclosure.
Accordingly, the present disclosure can provide a visual feedback in response to a hovering of the touch input means related to an object.
While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.
This application is a divisional application of prior application Ser. No. 14/185,203, filed on Feb. 20, 2014, which claimed the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Feb. 23, 2013 in the Korean Intellectual Property Office and assigned Serial No. 10-2013-0019527, the entire disclosure of which is hereby incorporated by reference.