The above and other objects, features and advantages of the present invention will become more apparent from the following detailed description in conjunction with the accompanying drawings, in which:
Hereinafter, the present invention is described in detail with reference to the accompanying drawings. The same reference numbers are used for the same or like components in the drawings. Additionally, detailed explanations of well-known functions and compositions may be omitted to avoid obscuring the subject matter of the present invention.
Referring to
The keypad 110 is a portion of a key input unit formed on a specific area of a mobile terminal body, and alphanumeric keys are disposed on the keypad 110 in a format of 3 columns×4 rows or 5 columns×4 rows. The keypad 110 is used to input characters or numbers through a user's normal pressing operation, or to input shortcut commands for performing special functions.
The touch sensor 120 is installed under the keypad 110, and preferably occupies the same area as the keypad 110. The touch sensor 120 is a type of pressure sensor, such as a gliding sensor, and various types of touch sensor may be used. If a user performs a touch operation on the keypad 110, the touch sensor 120 detects generation of the touch by sensing a change in physical properties such as resistance or capacitance. The detected change of the physical property is converted to an electric signal (hereinafter, a touch signal). The touch signal generated by the touch and detected by the touch sensor 120 is transmitted to the touch identifier 132 of the control unit 130.
The touch sensor 120 is partitioned into a plurality of physical or virtual areas. Therefore, if a touch is generated, the corresponding position of the touch may be identified. Position information is transmitted to the control unit 130 together with the touch signal. The touch signal controls the operation of the pointer 142 in the display unit 140, and is used as an input signal for displaying the option window 144. The touch signal generated by touching the keypad 110 is completely different from a normal input signal generated by pressing the keypad 110. Apart from a function allocated to a normal keypad input signal, a function for a pointer control and option window display may be allocated to the touch signal.
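Purely for illustration, the touch signal and the position information transmitted with it could be modeled as a small data structure; the name TouchSignal and its fields are hypothetical and do not appear in the disclosed embodiment.

```kotlin
// Hypothetical model of a touch signal together with the position information
// that the touch sensor 120 transmits to the control unit 130.
data class TouchSignal(
    val x: Int,  // horizontal index of the partitioned sensor area that was touched
    val y: Int   // vertical index of the partitioned sensor area that was touched
)

fun main() {
    // Example: a touch detected at the third column, second row of the sensor grid.
    val signal = TouchSignal(x = 2, y = 1)
    println("Touch signal generated at position (${signal.x}, ${signal.y})")
}
```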
The control unit 130 controls general operation of the mobile terminal 100. The touch identifier 132 receives a touch signal transmitted by the touch sensor 120 and identifies the type of the touch signal (i.e. first touch signal or second touch signal). If the touch signal is identified as a first touch signal, the pointer controller 134 links the position of the first touch generated on the keypad 110 with the position of the pointer in the display unit 140 and controls the display of the pointer 142 by using the position information transmitted with the touch signal. If the touch signal is identified as a second touch signal, the option controller 136 identifies whether an executable option is available, and displays the option window 144 on the display unit 140.
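The routing just described can be pictured with a minimal sketch. The names ControlUnit, PointerController, OptionController and onTouchReport are hypothetical stand-ins for blocks 130, 134 and 136, and the assumption that the sensor reports every currently touched point at once is made only to keep the example short.

```kotlin
// Minimal sketch of the routing performed by the control unit 130.
// The sensor is assumed to report every currently touched point of the keypad.
data class Touch(val x: Int, val y: Int)

interface PointerController {               // stand-in for the pointer controller 134
    fun moveTo(x: Int, y: Int)
}

interface OptionController {                // stand-in for the option controller 136
    fun hasExecutableOption(): Boolean
    fun showOptionWindow()
}

class ControlUnit(
    private val pointer: PointerController,
    private val options: OptionController
) {
    fun onTouchReport(activeTouches: List<Touch>) {
        when {
            activeTouches.size == 1 -> {
                // Only one point is touched: treat it as a first touch signal and
                // let the pointer 142 follow the finger.
                val first = activeTouches[0]
                pointer.moveTo(first.x, first.y)
            }
            activeTouches.size >= 2 -> {
                // A second point is touched while the first is held: treat it as a
                // second touch signal and display the option window 144 if possible.
                if (options.hasExecutableOption()) {
                    options.showOptionWindow()
                }
            }
        }
    }
}
```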
The display unit 140 displays various menus for the mobile terminal 100, information input by a user, and information to be provided for the user. The display unit 140 may be formed with an LCD. A position of the pointer 142 is linked with a position of the first touch by the pointer controller 134, and the display of the pointer 142 is determined by a position change of the first touch (i.e. a change of position information corresponding to a touch signal). The option window 144 is a type of pop-up window listing various selectable menus, and it appears on the display unit 140 according to the control of the option controller 136.
A user interface method according to the present invention is described in detail as follows.
Referring to
If a user touches a point 92a (first point) of the keypad 110 with a finger 90a (first finger), the touch detector 122 of the touch sensor 120 located under the keypad 110 detects a physical property change corresponding to the touch operation. The signal converter 124 converts the value of the physical property change to a touch signal, and transmits the touch signal to the control unit 130. At this moment, position information on the generated touch is also transmitted with the touch signal.
The touch identifier 132 of the control unit 130 receives the touch signal and position information, and identifies the type of the touch from the touch signal. When only one touch signal is received (i.e., a touch is generated at only one position on the keypad 110), the touch identifier 132 identifies the touch as a first touch.
In step S12, the first point 92a of the first touch is linked with the position of the pointer 142.
The pointer controller 134 of the control unit 130 obtains a touch position (X, Y) of the first point 92a from the position information transmitted with the touch signal. The touch position (X, Y) corresponds to the position of the first finger 90a. Subsequently, the pointer controller 134 links the touch position (X, Y) of the first point 92a with a pointer position (x, y) of the pointer 142 displayed on the display unit 140.
As a method for linking a touch position (X, Y) with a pointer position (x, y), two methods using 2-dimensional coordinates are as follows. An absolute coordinate system links a touch position on the keypad 110 with a pointer position on the display unit 140 having the same coordinates. Alternatively, a relative coordinate system links an initial position of the first touch with a pointer position regardless of a touch position.
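The two linking schemes can be sketched as simple coordinate mappings; all parameter names below, and the scaling used in the absolute case, are assumptions made for illustration rather than the method actually implemented in the terminal.

```kotlin
// Absolute coordinates: a keypad touch position (X, Y) always maps to the same
// pointer position, here by scaling the keypad grid onto the display resolution.
fun absolutePointer(
    touchX: Int, touchY: Int,
    keypadCols: Int, keypadRows: Int,
    displayWidth: Int, displayHeight: Int
): Pair<Int, Int> {
    val x = touchX * displayWidth / keypadCols
    val y = touchY * displayHeight / keypadRows
    return x to y
}

// Relative coordinates: the initial first-touch position is linked to the current
// pointer position, and only subsequent finger movement displaces the pointer.
fun relativePointer(
    initialTouchX: Int, initialTouchY: Int,
    currentTouchX: Int, currentTouchY: Int,
    pointerAtFirstTouchX: Int, pointerAtFirstTouchY: Int
): Pair<Int, Int> {
    val x = pointerAtFirstTouchX + (currentTouchX - initialTouchX)
    val y = pointerAtFirstTouchY + (currentTouchY - initialTouchY)
    return x to y
}
```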
Step S13 detects generation of a second touch. In the state of touching the first point 92a with the first finger 90a, if the user touches another point 92b (hereinafter, second point) on the keypad 110 with another finger 90b (hereinafter, second finger), the touch detector 122 detects a physical property change according to the touch operation of the second finger 90b in the same manner as for the first touch, converts the physical property change to a touch signal, and transmits the touch signal to the control unit 130.
As described above, if another touch signal is received while the first touch signal is still being received, the touch identifier 132 identifies the new touch signal as a second touch signal. That is, if two touch signals generated at two different points are received simultaneously, the touch identifier 132 identifies the touch signal received with new position information as a second touch signal. However, if a touch signal having new position information is received in a state in which the first touch signal is no longer received, the touch signal is identified as a first touch signal.
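The identification rule can be summarized in a short sketch; the names TouchSignal, TouchType and classify are hypothetical and stand only for the rule stated above.

```kotlin
// Sketch of the identification rule: a signal with new position information that
// arrives while a first touch is still held at a different point is a second touch.
data class TouchSignal(val x: Int, val y: Int)

enum class TouchType { FIRST, SECOND }

// 'heldFirstTouch' is the first touch signal still being received,
// or null if the first touch has already been released.
fun classify(incoming: TouchSignal, heldFirstTouch: TouchSignal?): TouchType =
    if (heldFirstTouch != null && heldFirstTouch != incoming) {
        TouchType.SECOND  // two different points are touched at the same time
    } else {
        TouchType.FIRST   // no first touch is held, so the new signal starts a first touch
    }
```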
Step S14 identifies whether an executable option is available. If a second touch is generated, the option controller 136 of the control unit 130 identifies whether an option executable at the current position of the pointer 142 is available. Various types of screens can be output to the display unit 140 according to the operation status of the mobile terminal 100, and the pointer 142 can be located at any position on the screen. Menu items for executing a specific function of the mobile terminal 100 can be displayed on the screen of the display unit 140 in formats such as an icon or text, and the status of executing the specific function can also be displayed. The pointer 142 can be located at a menu item or at any other position.
According to the position of the pointer 142, an executable option may be available or unavailable. Accordingly, the option controller 136 identifies in advance, if a second touch is generated, whether an option is available. If an executable option is always available regardless of the position of the pointer 142, the step S14 can be omitted.
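One way to picture the check in step S14 is a lookup of what lies under the pointer 142. The UiElement type and the rule that only certain element types have executable options are assumptions for illustration; the actual rule would depend on the terminal's menu structure.

```kotlin
// Hypothetical kinds of screen elements that may lie under the pointer 142.
sealed class UiElement {
    object EmptyArea : UiElement()
    data class MenuItem(val label: String) : UiElement()
    data class StatusIndicator(val description: String) : UiElement()
}

// Sketch of step S14: an executable option is considered available only for
// certain element types under the current pointer position.
fun hasExecutableOption(underPointer: UiElement): Boolean = when (underPointer) {
    is UiElement.MenuItem -> true          // e.g. open, delete, properties
    is UiElement.StatusIndicator -> true   // e.g. options for a running function
    UiElement.EmptyArea -> false           // nothing to act on at this position
}
```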
Step S15 displays the option window 144. If a second touch is generated and an executable option is available, the option controller 136 displays the option window 144 on the display unit 140. This corresponds to a right-click of a mouse in a typical personal computing environment, and the option window 144 is displayed at the position where the pointer 142 is currently located.
Referring to
Referring to
Selection of a menu item can be set to occur if the pointer 142 stops at the menu item for longer than a predetermined time duration (i.e. if a first touch signal is continuously received without a change of position information). Alternatively, selection of a menu item can be set to occur if a first touch signal having the same position information is received repeatedly (i.e. if the first finger 90a presses the same position once more). Further, selection of a menu item can be set to occur if a second touch signal is received, regardless of the above two conditions. In any case, the menu item 146 before selection shown in
Referring to
If the second touch is detected, the option controller 136 of the control unit 130 identifies whether an option executable at the current position of the pointer 142 is available. In
An additional function can be set to scroll the menu items in the option window 144 when the second finger 90b moves upwards or downwards while the option window 144 is displayed. Alternatively, an additional function may be set to execute a selected menu in the option window 144 by pressing once or twice with the second finger 90b. Similarly, an additional function can be set to execute the selected menu item 146 by pressing once or twice with the first finger 90a before the option window 144 is displayed, as sketched below.
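These additional behaviours could be wired up roughly as follows. The gesture names and the OptionWindow interface are hypothetical, and which of the three behaviours a terminal actually enables is a design choice rather than something fixed by the embodiment.

```kotlin
// Hypothetical gestures corresponding to the additional functions described above.
enum class Gesture {
    SECOND_FINGER_MOVE_UP,      // scroll the option window upwards
    SECOND_FINGER_MOVE_DOWN,    // scroll the option window downwards
    SECOND_FINGER_PRESS,        // execute the menu currently selected in the option window
    FIRST_FINGER_PRESS          // execute the selected menu item 146 before the window opens
}

interface OptionWindow {        // stand-in for the option window 144
    fun scroll(lines: Int)
    fun executeSelectedMenu()
}

// 'window' is null while no option window is displayed; 'executeItem' executes the
// currently selected menu item 146 outside the option window.
fun handleGesture(gesture: Gesture, window: OptionWindow?, executeItem: () -> Unit) {
    when (gesture) {
        Gesture.SECOND_FINGER_MOVE_UP -> window?.scroll(-1)
        Gesture.SECOND_FINGER_MOVE_DOWN -> window?.scroll(+1)
        Gesture.SECOND_FINGER_PRESS -> window?.executeSelectedMenu()
        Gesture.FIRST_FINGER_PRESS -> if (window == null) executeItem()
    }
}
```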
The present invention provides a user interface that executes a predetermined function by detecting a touch and identifying the type of the touch when a user touches, with a finger, a keypad under which a touch sensor is installed. The user interface utilizing the keypad touch method is suitable for executing various applications in a mobile terminal, because it enables both the normal function of a keypad press operation and additional functions.
In particular, the user interface according to the present invention enables operation control of a pointer and display of an option window on a display unit by using a keypad touch. Accordingly, the present invention provides an operation environment of a mobile terminal close to a personal computing environment, simplicity of use even on a screen having a complicated option structure, and excellent user accessibility and convenience. The user interface according to the present invention can provide an optimized operation method for environments such as a browser or GPS navigation.
In the user interface according to the present invention, operation on both a keypad area and a display area is not required, because operation of the mobile terminal is performed only in the keypad area, unlike the conventional touch screen method. Accordingly, the user interface according to the present invention provides much simpler operation than a conventional method, and operation with one hand is possible because use of a stylus pen is unnecessary. Further, the user interface according to the present invention provides a cost saving compared to a conventional touch screen method.
Although preferred embodiments of the present invention have been described in detail herein-above, it should be understood that many variations and/or modifications of the basic inventive concept herein described, which may appear to those skilled in the art, will still fall within the spirit and scope of the present invention as defined in the appended claims.
Number | Date | Country | Kind |
---|---|---|---
2006-0057388 | Jun 2006 | KR | national |
This application claims priority under 35 U.S.C. §119 to an application entitled “KEYPAD TOUCH USER INTERFACE METHOD AND MOBILE TERMINAL USING THE SAME” filed in the Korean Intellectual Property Office on Jun. 26, 2006 and assigned Serial No. 2006-0057388, the contents of which are incorporated herein by reference.