This application claims the benefit under 35 U.S.C. §119(a) of a Korean Patent Application No. 10-2009-0049304, filed on Jun. 4, 2009, in the Korean Intellectual Property Office, the entire disclosure of which is incorporated herein by reference for all purposes.
1. Field
The following description relates to a device including a touch interface, and more particularly, to an apparatus and method for providing a selection area on a touch interface that may be applicable to a mobile terminal and the like.
2. Description of Related Art
Recently, touch interfaces have become widely used as touch screens for mobile terminals, for example, smart phones. With the rise of the smart phone, often promoted as a "PC in my hand," users may accomplish many tasks in a mobile environment, and a touch interface may allow them to perform functions more easily and efficiently.
The touch interface may nevertheless be inconvenient or ineffective in some respects. For example, when creating a document, it may be difficult to input characters and select an accurate area using the touch interface, in comparison to an existing keypad-type interface.
In one general aspect, there is provided an apparatus for providing a selection area for a touch interface, the apparatus comprising a touch interface to display content, a sensor to sense a touch event and a drag operation of the touch event via the touch interface, wherein the touch event includes an initial touch point where the initial touch occurs, a change direction point where a drag direction is changed, and a finish point where the touch event is terminated, and a touch interface controller to control the touch interface to provide a selection area for the content based on the point where the drag direction is changed.
The selection area for the content may be set to an area from the point where the drag direction is changed to the finish point where the touch event is terminated.
The touch interface controller may control the touch interface to display an auxiliary image corresponding to a point where the touch event occurs.
The touch interface controller may control the touch interface to display an auxiliary image corresponding to a current touch point of a user.
The sensor may sense a touch event that occurs on different sides of the initial touch point, and the touch interface controller may select content from both of the different sides of the initial touch point.
The drag operation may include a first drag direction from the initial touch point to the change direction point, and a second drag direction from the change direction point to the finish point.
In another aspect, there is provided an apparatus for providing a selection area for a touch interface, the apparatus comprising a touch interface to display content, a sensor to sense a touch event and a drag operation of the touch event via the touch interface, wherein the touch event includes a starting point where the initial touch occurs, a change direction point where a drag direction is changed, and a finish point where the touch event is terminated, and a touch interface controller to control the touch interface to change a display attribute of the content based on the point where the drag direction is changed.
The touch interface controller may control the touch interface to change the display attribute of the content in an area from the point where the drag direction is changed to the finish point where the touch event is terminated.
The display attribute of the content may include at least one of a shadow, a font of a text, a color of the text, and a background color.
The touch interface controller may control the touch interface to display an auxiliary image corresponding to a point where the touch event occurs.
The touch interface controller may control the touch interface to display an auxiliary image corresponding to a current touch point of a user.
The sensor may sense a touch event that occurs on different sides of the initial touch point, and the touch interface controller may select content from both of the different sides of the initial touch point.
The drag operation may include a first drag direction from the initial touch point to the change direction point, and a second drag direction from the change direction point to the finish point.
In another aspect, there is provided a method of providing a selection area for a touch interface, the method comprising displaying content on the touch interface, sensing a touch event and a drag operation via the touch interface, wherein the touch event includes a starting point where the initial touch occurs, a change direction point where a drag direction is changed, and a finish point where the touch event is terminated, and providing a selection area for the content based on the point where the drag direction is changed.
The selection area for the content may be set to an area from the point where the drag direction is changed to the finish point where the touch event is terminated.
The touch interface may display an auxiliary image corresponding to a point where the touch event occurs.
The touch interface may display an auxiliary image corresponding to a current touch point of a user.
The method may further comprise changing a display attribute of the selection area for the content.
The sensing may include sensing a touch event that occurs on different sides of the initial touch point, and the providing may include selecting content from both of the different sides of the initial touch point.
The drag operation may include a first drag direction from the initial touch point to the change direction point, and a second drag direction from the change direction point to the finish point.
Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and description of these elements may be exaggerated for clarity, illustration, and convenience.
The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Accordingly, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will be suggested to those of ordinary skill in the art. Also, description of well-known functions and constructions may be omitted for increased clarity and conciseness.
The touch interface 110 displays content on the screen. The touch interface 110 provides a user interface that enables a user to input information by touch, for example, via the user's finger, a stylus, and the like. Various applications may also be included in the selection area providing apparatus 100. For example, the apparatus may include an application providing a copy and paste function for content such as a webpage, a text file, and the like.
The sensor 120 senses a touch event on the touch interface 110, and may sense a drag direction of the touch event. A touch event may indicate a state or an action in which the user's finger, a stylus, and the like touches the touch interface 110. The term "drag" used herein is similar to a drag of a mouse in a PC environment. For example, a touch event may include a starting point, where the touch initially occurs, a change direction point, where the drag direction is changed, and a finish point, where the touch ends and contact with the touch interface terminates. The drag operation may include dragging the touch from the starting point to the change direction point, and on to the finish point. A drag direction of the touch event indicates the movement direction of the user's finger or the stylus while the touch event is maintained, and may be any desired direction, for example, up, down, left, right, a diagonal direction, or a combination thereof.
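The touch-event structure described above can be sketched as a small data model. The following is a minimal, hypothetical sketch (all names are illustrative, not from the disclosure); for simplicity it detects a direction change along the horizontal axis only.

```python
from dataclasses import dataclass, field

# Hypothetical model of the touch event described above: a starting
# point, sampled drag positions, and the first point where the drag
# direction reverses. Coordinates are (x, y) screen positions.
Point = tuple


@dataclass
class TouchEvent:
    start: Point                                 # where the initial touch occurs
    points: list = field(default_factory=list)   # sampled drag positions

    def record(self, p: Point):
        """Append one sampled drag position."""
        self.points.append(p)

    def change_direction_point(self):
        """Return the first point where the drag direction reverses
        along the x-axis, or None if the direction never changes."""
        path = [self.start] + self.points
        prev_dx = 0
        for a, b in zip(path, path[1:]):
            dx = b[0] - a[0]
            if dx and prev_dx and (dx > 0) != (prev_dx > 0):
                return a        # the extreme point where the reversal began
            if dx:
                prev_dx = dx
        return None
```

For example, a drag that moves right to (8, 0) and then back left would report (8, 0) as its change direction point.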
The touch interface controller 130 performs various types of operations to provide the selection area and to control the selection area. The touch interface controller 130 may control the touch interface 110 to display the selection area for the content separately from other areas of the display.
Based on the drag direction of the touch event, the touch interface controller 130 may control the touch interface 110 to provide the selection area for the content. The selection area for the content may be set to an area from the point where the drag operation begins to a point where the touch event is terminated. The termination of the touch event denotes a state where the touch on the touch interface 110 is no longer sensed.
The drag direction of the touch event may be changed by the user. The touch interface controller 130 may control the touch interface 110 to change a display attribute of the content based on the change in the drag direction of the touch event. The touch interface controller 130 may control the touch interface 110 to change the display attribute of the content from the point where the drag direction of the touch event is changed to the point where the touch event is terminated. The display attribute of the content may include, for example, at least one of a shadow, a font of a text, a color of the text, a background color, and the like.
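As a rough illustration of changing a display attribute over the area between the change-direction point and the termination point, the sketch below splits a run of text into styled spans. The function name and the attribute string are assumptions for illustration; indices are taken to lie within the text.

```python
# Hypothetical sketch: once the drag direction changes, mark the content
# between the change-direction index and the termination index with a
# changed display attribute (here, a background color). Returns a list of
# (start, end, attribute) spans covering the whole text.
def style_spans(text_len, change_idx, finish_idx, attr="background:yellow"):
    lo, hi = sorted((change_idx, finish_idx))
    return [
        (0, lo, None),             # unchanged prefix
        (lo, hi + 1, attr),        # selection area with the new attribute
        (hi + 1, text_len, None),  # unchanged suffix
    ]
```

A renderer could then draw each span with its attribute, leaving the prefix and suffix unchanged.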
In 220, the selection area providing apparatus 100 determines whether a touch event is sensed on the touch interface and, if so, where on the interface the touch event is sensed. The sensing in 220 may be repeated until a touch event occurs.
When a touch event is sensed, the selection area providing apparatus 100 senses a drag direction of the touch event, in 230. For example, the drag direction may be up, down, left, right, a diagonal direction, or a combination thereof.
In 240, the selection area providing apparatus 100 senses whether the drag direction is changed. The sensing in 240 may be repeated to repeatedly sense whether a drag direction has changed.
When the drag direction is changed, in 250 the selection area providing apparatus 100 provides a selection area for the content based on the point where the drag direction is changed. The selection area for the content may be set to an area from the point where the drag direction is changed to a point where the touch event is terminated. The touch interface included in the selection area providing apparatus 100 may display an auxiliary image for a selection area designated by a user.
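Steps 220 through 250 above can be sketched as a single event loop. The sketch below is a hypothetical simplification: `events` stands in for the sensor's stream of (kind, position) samples, and the direction change is detected along the horizontal axis only.

```python
# Hypothetical event loop following steps 220-250: wait for a touch,
# track the drag, and once the drag direction changes, set the selection
# from the change-direction point to the point where the touch ends.
def provide_selection(events):
    start = change = last = None
    prev_dx = 0
    for kind, pos in events:
        if kind == "down":                      # step 220: touch sensed
            start = last = pos
        elif kind == "move" and start is not None:   # step 230: drag sensed
            dx = pos[0] - last[0]
            # step 240: detect the first reversal of the drag direction
            if dx and prev_dx and (dx > 0) != (prev_dx > 0) and change is None:
                change = last
            if dx:
                prev_dx = dx
            last = pos
        elif kind == "up":
            # step 250: selection spans from the change-direction point
            # to the point where the touch event is terminated
            return (change, pos) if change is not None else None
    return None
```

A drag that never changes direction yields no selection under this rule.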
When the selection area providing apparatus 100 senses a first drag direction of a touch event and senses a second drag direction different from the first drag direction, the selection area providing apparatus 100 may change a display attribute of the content based on a starting point of the second drag direction. The selection area providing method may further include changing a display attribute of the designated selection area.
For example, a user may touch an initial start point 810 of the content displayed on a touch interface and move the user's finger from the initial start point 810 to a desired point 820 in front of "Telecommunications." In doing so, the user performs an example of a drag operation. The user may then designate a selection area 830 while dragging the user's finger from the point 820 towards the point 810. As described above, the selection area 830 may start from the point 820 where the drag direction is changed. Thus, a user may select content on multiple sides of an initial starting point.
Hereinafter, a conventional selection area designation method is described for comparison. In a conventional method, the selection area is set from the initial touch point to the point where the touch event is terminated. Accordingly, the user must accurately touch the desired starting point of the content at the outset, which may be difficult on a narrow touch interface.
The apparatus and method described herein may allow a user to more accurately designate selected text in an environment with a narrow touch interface. In such an environment, for example on a mobile device, it may be difficult for the user to accurately designate a desired initial touch point, because a user's finger is often larger than the text displayed on a mobile terminal. However, using the selection area providing apparatus described herein, the user may easily move to the user's desired point using a drag function and thus may more accurately designate the selection area. An auxiliary image, as described above, may further assist the user in designating the selection area.
For example, a user may touch an arbitrary point 1010 of the content displayed on a touch interface and drag the user's finger from the point 1010 to a desired point 1020 in front of "Telecommunications." In this example, the user desires to highlight the phrase "Telecommunications is one of five business." The user may designate the selection area 1050 while dragging the user's finger from the point 1020 towards the point 1010. Next, the user may drag the user's finger from the point 1020 to a point 1030, beyond the desired content area that the user desires to select. The user may adjust the selection area 1050 by dragging the user's finger back to a point 1040. The user may confirm the selection area 1050 by separating the user's finger from the touch interface. Specifically, the selection area 1050 may be set to an area from the point 1020 where the drag direction is changed to the point 1040 where the touch event is terminated.
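The walk-through above can be simulated on a line of text by treating character indices as touch coordinates. The sketch below is a hypothetical illustration: it returns the substring from the first direction-change index to the release index, mirroring how the selection area 1050 runs from the point 1020 to the point 1040 regardless of where the initial touch occurred.

```python
def select_text(text, touch_xs):
    """Given a sequence of touched character indices (first sample is the
    initial touch, last sample is the release), return the substring from
    the first direction-change index to the release index, inclusive."""
    change = None
    prev_d = 0
    for a, b in zip(touch_xs, touch_xs[1:]):
        d = b - a
        # remember only the FIRST direction change; later reversals
        # (such as correcting an overshoot) merely move the endpoint
        if d and prev_d and (d > 0) != (prev_d > 0) and change is None:
            change = a
        if d:
            prev_d = d
    if change is None:
        return ""                      # no direction change, no selection
    lo, hi = sorted((change, touch_xs[-1]))
    return text[lo:hi + 1]
```

Touching mid-phrase, dragging left to the front of "Telecommunications," then dragging right and releasing at the end selects the whole phrase, even though the initial touch was in the middle.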
The selection area providing apparatus allows a user to more easily designate an accurate selection area using a touch interface. Also, it is possible to more easily and more accurately provide a user with a selection area in an environment where the user's controllable space is narrow, for example, on a mobile terminal. Further, if a user has trouble viewing the text on the terminal, the touch interface apparatus may provide an auxiliary image that magnifies the selection area.
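As a minimal sketch of such a magnifying auxiliary image (the function name and the window radius are assumptions, not from the disclosure), the helper below crops a small window of text around the current touch index; a real interface would render this window enlarged near the user's finger.

```python
# Hypothetical auxiliary-image helper: return the characters within
# `radius` positions of the current touch index, clamped to the text
# bounds. A UI layer would display this cropped window magnified.
def auxiliary_window(text, touch_idx, radius=5):
    lo = max(0, touch_idx - radius)
    hi = min(len(text), touch_idx + radius + 1)
    return text[lo:hi]
```

Because the window follows the current touch point, it can help the user place the change-direction and finish points precisely even when the displayed text is smaller than a fingertip.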
As a non-exhaustive illustration only, the terminal device described herein may refer to mobile devices such as a cellular phone, a personal digital assistant (PDA), a digital camera, a portable game console, an MP3 player, a portable/personal multimedia player (PMP), a handheld e-book reader, a portable laptop PC, and a global positioning system (GPS) navigation device, as well as devices such as a desktop PC, a high-definition television (HDTV), an optical disc player, a set-top box, and the like, capable of wireless communication or network communication consistent with that disclosed herein.
The processes, functions, methods, and software described above, including methods according to the above-described examples, may be recorded in computer-readable storage media including program instructions to implement various operations embodied by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks and DVDs; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described example embodiments, or vice versa. In addition, a computer-readable storage medium may be distributed among computer systems connected through a network, and computer-readable codes or program instructions may be stored and executed in a decentralized manner.
A number of examples have been described above. Nevertheless, it will be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.
Number | Date | Country | Kind
---|---|---|---
10-2009-0049304 | Jun. 4, 2009 | KR | national