The present application relates to a user interface, an apparatus and a method for improved control, and in particular to a user interface, an apparatus and a method for improved control of a graphical user interface having a small display.
Contemporary apparatuses having small touch-controlled displays have fewer user input controls than traditional Windows Icon Menu Pointer (WIMP) interfaces, but they still need to offer a similar set of responses to user actions, e.g. command and control possibilities. For example, most web pages are designed for a large display but are often viewed on a small one. The user of an apparatus with a small display should be offered the same level of control as the user of an apparatus with a large display.
A traditional WIMP device may offer a mouse pointer, left and right mouse buttons, a scroll wheel, keyboard scroll keys, and keyboard modifiers for mouse clicks (e.g. control-left-mouse). A touch device relies entirely on touches on the screen with one or two fingers to send commands to the system, even where the underlying system is similar to the WIMP system and requires similar control information.
For most portable apparatuses there is simply not enough space to offer all these control options.
An apparatus that allows easy and precise control of objects displayed on a small display would thus be useful in modern day society.
On this background, it would be advantageous to provide a user interface, an apparatus and a method that overcome or at least reduce the drawbacks indicated above by providing an apparatus, a method, a computer readable medium and a user interface according to the claims.
Further features, advantages and properties of the device, method and computer readable medium according to the present application will become apparent from the detailed description.
In the following detailed portion of the present description, the teachings of the present application will be explained in more detail with reference to the example embodiments shown in the drawings, in which:
a, b, c, d and e are views of an apparatus according to an example embodiment,
a and b are views of an apparatus according to an example embodiment,
a and b are views of an apparatus according to an example embodiment, and
In the following detailed description, the user interface, the apparatus, the method and the software product according to the teachings of this application will be described by way of embodiments in the form of a cellular/mobile phone. It should be noted that although only a mobile phone is described, the teachings of this application can also be used in any electronic device, such as portable electronic devices like laptops, PDAs, mobile communication terminals, electronic books and notepads, and other electronic devices offering access to information.
The mobile terminals 100, 106 are connected to a mobile telecommunications network 110 through Radio Frequency (RF) links 102, 108 via base stations 104, 109. The mobile telecommunications network 110 may be in compliance with any commercially available mobile telecommunications standard, such as Global System for Mobile communications (GSM), Universal Mobile Telecommunications System (UMTS), Digital Advanced Mobile Phone System (D-AMPS), the code division multiple access standards (CDMA and CDMA2000), Freedom of Mobile Multimedia Access (FOMA), and Time Division-Synchronous Code Division Multiple Access (TD-SCDMA).
The mobile telecommunications network 110 is operatively connected to a wide area network 120, which may be the Internet or a part thereof. An Internet server 122 has a data storage 124 and is connected to the wide area network 120, as is an Internet client computer 126. The server 122 may host a www/wap server capable of serving www/wap content to the mobile terminal 100.
A public switched telephone network (PSTN) 130 is connected to the mobile telecommunications network 110 as is commonly known by a skilled person. Various telephone terminals, including the stationary telephone 132, are connected to the PSTN 130.
The mobile terminal 100 is also capable of communicating locally via a local link 101 to one or more local devices 103. The local link can be any type of link with a limited range, such as a Bluetooth link, a Universal Serial Bus (USB) link, a Wireless Universal Serial Bus (WUSB) link, an IEEE 802.11 wireless local area network link, a serial link such as an RS-232 link, etc. The local devices 103 can for example be various sensors that can communicate measurement values to the mobile terminal 100 over the local link 101.
A computer such as a palmtop can also be connected to the network via a radio link such as a WiFi link, WiFi being the popular term for a radio frequency connection using the WLAN (Wireless Local Area Network) standard IEEE 802.11.
It should be noted that the teachings of this application are also capable of being utilized in an internet network of which the telecommunications network described above may be a part.
As is commonly known, the Internet is a global system of interconnected computer networks that interchange data by packet switching using the standardized Internet Protocol Suite (TCP/IP). It is a “network of networks” consisting of millions of private and public academic, business and government networks of local to global scope, linked by copper wires, fiber-optic cables, wireless connections and other technologies.
The Internet carries various information resources and services, such as electronic mail, online chat, online gaming, file transfer and file sharing, and the inter-linked hypertext documents and other resources of the World Wide Web (WWW).
It should be noted that even though the teachings herein are described solely with reference to wireless networks, they are in no respect limited to wireless networks as such, but are to be understood to be usable in the Internet or similar networks. The teachings herein find use in any device having a touch input user interface where other input means, such as keyboards and joysticks, are limited. Examples of such devices are mobile phones, Personal Digital Assistants (PDAs), game consoles, media players, personal organizers, electronic dictionaries and digital image viewers.
An embodiment 200 of the mobile terminal 100 is illustrated in more detail in
The internal component, software and protocol structure of the mobile terminal 200 will now be described with reference to
The MMI 334 also includes one or more hardware controllers, which together with the MMI drivers cooperate with the touch display 336/203, and the keypad 338/204 as well as various other Input/Output devices such as microphone, speaker, vibrator, ringtone generator, LED indicator, etc.
The software also includes various modules, protocol stacks, drivers, etc., which are commonly designated as 330 and which provide communication services (such as transport, network and connectivity) for an RF interface 306, and optionally a Bluetooth interface 308 and/or an IrDA interface 310 for local connectivity. The RF interface 306 comprises an internal or external antenna as well as appropriate radio circuitry for establishing and maintaining a wireless link to a base station (e.g. the link 102 and base station 104 in
The mobile terminal also has a Subscriber Identity Module (SIM) card 304 and an associated reader. As is commonly known, the SIM card 304 comprises a processor as well as local work and data memory.
Examples of such apparatuses are media players, mobile phones, personal digital assistants, digital cameras, navigation devices such as GPS (Global Positioning System) devices, game consoles and electronic dictionaries.
The apparatus 400 comprises a touch display 403 on which two objects 410a and 410b are displayed. Also indicated in
To provide improved control to a user a controller (not shown) is configured to display a cursor which can be controlled by touch input on the touch display 403. A user is thus able to use the cursor 412 as a virtual mouse.
This enables a user to accurately point at and control objects that are small compared to the contact area of a finger or a stylus.
b shows a view of an apparatus where a cursor 412 is displayed slightly offset from the touching zone 411. In an example embodiment the cursor 412 is displayed adjacent the touching zone 411. In an example embodiment the cursor 412 is displayed adjacent the touch zone at a distance of 1, 2, 3, 4, 5, 6, 7, 8, 9 or 10 pixels from the touch zone. Other distances are also possible, for example in the ranges 10 to 15, 15 to 20 and 20 to 25 pixels. The distance in pixels depends on design and usability issues such as display size, pixel size and stylus size. For example, if the controller detects that a broad stylus is used, the controller displays the cursor at a greater distance from the touch zone than it would for a thin stylus. In one example embodiment the controller is configured to perform this dynamically. In the following, the combination of the touch zone 411, the cursor 412 and the selection zone will be referred to as a virtual mouse 412.
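By way of illustration, the dynamic offset described above may be sketched as follows; the function name, the half-width rule and the clamping values are illustrative assumptions, not the actual implementation:

```python
# Minimal sketch of the dynamic cursor-offset logic described above.
# All names and values are illustrative assumptions.

def cursor_offset_px(contact_width_px: float,
                     min_offset: int = 5,
                     max_offset: int = 25) -> int:
    """Return the distance (in pixels) between the touch zone and the
    displayed cursor, growing with the width of the detected stylus or
    finger contact so the cursor stays visible beside a broad stylus."""
    # One illustrative rule: offset half the contact width, clamped
    # to the pixel ranges mentioned in the text (here 5 to 25).
    offset = round(contact_width_px / 2)
    return max(min_offset, min(max_offset, offset))

if __name__ == "__main__":
    print(cursor_offset_px(8.0))   # thin stylus  -> small offset (5)
    print(cursor_offset_px(40.0))  # broad finger -> larger offset (20)
```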
As is readily understood, the selection point of the cursor may be around its tip. By moving the selection zone away from the touching zone, the user is able to see better where he is pointing and also to point with higher precision, as the tip of the cursor is in most cases smaller than the tip of the stylus.
As the virtual mouse 412 finds best use with small objects, it would be preferable if the apparatus could offer a user the option of controlling the apparatus both by direct touch and by the virtual mouse 412.
A controller is configured to activate the virtual mouse 412 upon detection of a slide-in gesture, i.e. a touch input that originates outside the display 403. In one example embodiment a slide-in gesture can be determined as a gesture that originates at, or in the immediate vicinity of, an edge of the display and immediately has a speed above a certain level. This allows a controller to differentiate a gesture starting outside the display and continuing over it from a gesture deliberately starting close to an edge of the display and continuing inside it, such as a gesture for selecting an object located close to the edge and dragging it inside the display area. The latter gesture would have an initial speed close or equal to zero.
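A minimal sketch of one way to implement the slide-in test described above; the names, the edge margin and the speed threshold are design assumptions, not values from this application:

```python
# A touch that begins at (or within a few pixels of) a display edge AND
# already has a speed above a threshold is treated as originating
# off-screen; a drag merely *starting near* the edge has near-zero
# initial speed and is rejected.

from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float  # pixels from left edge
    y: float  # pixels from top edge
    t: float  # seconds

def is_slide_in(first: TouchSample, second: TouchSample,
                width: int, height: int,
                edge_margin: float = 3.0,
                min_speed: float = 200.0) -> bool:
    """True if the gesture starts at an edge with non-zero initial speed."""
    at_edge = (first.x <= edge_margin or first.x >= width - edge_margin or
               first.y <= edge_margin or first.y >= height - edge_margin)
    dt = second.t - first.t
    if not at_edge or dt <= 0:
        return False
    speed = ((second.x - first.x) ** 2 + (second.y - first.y) ** 2) ** 0.5 / dt
    return speed >= min_speed  # pixels per second

if __name__ == "__main__":
    fast = (TouchSample(0, 120, 0.00), TouchSample(18, 120, 0.02))
    slow = (TouchSample(2, 120, 0.00), TouchSample(3, 120, 0.02))
    print(is_slide_in(*fast, width=320, height=240))  # True: slide-in
    print(is_slide_in(*slow, width=320, height=240))  # False: edge drag
```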
In one example embodiment the determination of the slide-in gesture depends on whether an object is covered by the path within a very short time interval.
In one example embodiment the slide-in gesture is assumed to have been performed if a user initiates it outside an active area or an application area of said display 403.
In this example embodiment a user may thus activate a virtual mouse 412 for a window by sliding in over the window.
In one example embodiment, which is shown in
The controller is further configured to receive touch input relating to a movement control of the virtual mouse 412 and move the virtual mouse 412 accordingly.
In order to provide a user with increased control, one or more virtual mouse buttons 413 are displayed as the virtual mouse 412 is activated. In one example embodiment the virtual mouse button is associated with one or more commands or functions. The controller is further configured to receive touch input relating to said virtual mouse button 413 and to execute a command or function accordingly.
It should be noted that the virtual mouse button 413 may be arranged differently in different embodiments and the placement shown in
In one example embodiment the controller is configured to determine whether the received touch input relating to the virtual mouse button 413 is a single-point or multi-point touch input. The controller is configured to execute different commands or functions accordingly. For example, the function OPEN is in one example embodiment associated with a single touch on the virtual mouse button 413, and a function of displaying an options menu is associated with a double touch on the virtual mouse button 413.
This provides a user with the option of controlling the virtual cursor 412 with one finger and triggering the selected action with one or more other fingers.
In one example embodiment the controller is configured to detect one or more gestures relating to the virtual mouse button 413. Each gesture is associated with an action (command or function) and the controller is configured to execute the associated command or function in response to detecting the gesture. In one example the function OPEN is associated with a touch or tap on the virtual mouse button 413 and the function of displaying an options list is associated with a sliding gesture on the virtual mouse button 413.
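A minimal sketch of the dispatch described in the two preceding paragraphs, covering both the single/multi-point variant and the tap/slide variant; the command names OPEN and the options function follow the text, everything else is an illustrative assumption:

```python
# Map input on the virtual mouse button to a command. SHOW_OPTIONS is
# an assumed name for the "display an options menu/list" function.

def dispatch_button_input(touch_points: int, gesture: str) -> str:
    if gesture == "slide":
        return "SHOW_OPTIONS"          # sliding gesture -> options list
    if gesture == "tap":
        if touch_points == 1:
            return "OPEN"              # single touch -> OPEN
        return "SHOW_OPTIONS"          # double/multi touch -> options menu
    return "NONE"

if __name__ == "__main__":
    print(dispatch_button_input(1, "tap"))    # OPEN
    print(dispatch_button_input(2, "tap"))    # SHOW_OPTIONS
    print(dispatch_button_input(1, "slide"))  # SHOW_OPTIONS
```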
In one example embodiment the associated function is further determined by the object 410. For example, the function associated with an object representing a music file can be to play the music file and the function associated with an object representing an image file can be to display the image.
It should be noted that when a virtual mouse 412 is active the controller is configured to display one or several virtual mouse buttons 413 inside an application view. The user can use these virtual mouse buttons 413 with another/second finger to trigger mouse button down and up events for an object identified with the virtual mouse 412, resulting in the same outcome as would be produced with a physical mouse interaction on a standard computer. If the user instead interacts with a document with his/her second finger, rather than with the virtual mouse buttons 413, the controller executes a default function/feature provided by the application. In an exemplary embodiment of an Internet or Hypermedia application, the controller would display at least a left and a right virtual mouse button. It should also be noted that a virtual mouse 412 can comprise as many virtual buttons 413 as needed for different purposes, whatever is relevant in the current context.
In one example embodiment the controller is configured to execute a command or function when the controller detects that the touch input is released.
In one example embodiment the controller is configured to deactivate the virtual mouse without executing a command or function when the controller detects that the touch input is released.
d shows an apparatus 400 where a user has activated a virtual mouse 412 and positioned it so that it identifies one object 410a.
As a user taps on the virtual mouse button 413 an associated action is executed by the controller. See
In one example embodiment the controller is configured to receive touch input and to detect a pressure level of the received input. The controller is further configured to associate various commands or actions with specified pressure levels. This provides for a feature of moving the virtual mouse 412 using low-pressure touch input and selecting commands using touch input with higher pressure. In one example embodiment a user could thus move the virtual mouse by sliding his finger and then “click” on an item by pushing harder on the touch display. A move operation in such an embodiment would be achieved by moving the virtual mouse to an object and pressing down on the object. The object is then moved to another position, where the pressure is lowered again, i.e. the user stops pushing as hard.
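A minimal sketch of the pressure-dependent interpretation described above, assuming a normalised pressure scale and an arbitrary threshold:

```python
# Light pressure moves the virtual mouse; heavier pressure presses, and
# holding heavy pressure while moving drags an object. The threshold
# and the 0.0..1.0 pressure scale are illustrative assumptions.

PRESS_THRESHOLD = 0.6  # normalised pressure (assumed scale)

def interpret_touch(pressure: float, dragging: bool) -> str:
    if pressure >= PRESS_THRESHOLD:
        return "drag" if dragging else "press"   # "click" / move object
    return "drop" if dragging else "move"        # release / move cursor

if __name__ == "__main__":
    print(interpret_touch(0.3, dragging=False))  # move the virtual mouse
    print(interpret_touch(0.8, dragging=False))  # press ("click") an item
    print(interpret_touch(0.8, dragging=True))   # keep dragging the object
    print(interpret_touch(0.3, dragging=True))   # lower pressure -> drop
```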
In one example embodiment the controller is configured to receive multiple touch inputs from the touch display. The controller is further configured to receive a first touch input, to associate this first input with the virtual mouse, and to interpret the first input as a continuous stream of movement and control information for the virtual mouse. The controller is also configured to receive at least a second input and to associate the second input with commands that are not related to the virtual mouse 412. In one example embodiment the second input is related to a panning action for content that is displayed on the display. In such an embodiment a user can activate the virtual mouse by sliding in a finger on the display and then, by placing a second finger on the display and moving it, pan the displayed content. This would obviate the need for scrollbars. In one example embodiment the controller is configured to determine that all input is related to a second action, not related to the control of the virtual mouse, as long as a multiple touch input is detected. In such an embodiment the user is able to start a virtual mouse by sliding in one finger; then, while the multi-touch is detected, all actions taken with a second finger touching the screen are determined to be related to the second action and not to control of the virtual mouse. To return to controlling the virtual mouse the user simply releases the second touch, e.g. lifts the second finger.
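A minimal sketch of the input routing described above; the event model and the names are assumptions:

```python
# While a second touch is down, motion is routed to a secondary action
# such as panning; otherwise it drives the virtual mouse.

def route_motion(dx: float, dy: float,
                 active_fingers: set[int]) -> tuple[str, float, float]:
    """Return (target, dx, dy) for one motion event."""
    if len(active_fingers) >= 2:
        # Multi-touch detected: all motion goes to the second action
        # (e.g. panning the content) until a finger is released.
        return ("pan_content", dx, dy)
    return ("move_virtual_mouse", dx, dy)

if __name__ == "__main__":
    fingers = {1}
    print(route_motion(5, 0, fingers))    # moves the virtual mouse
    fingers.add(2)                        # second finger placed
    print(route_motion(0, -10, fingers))  # pans the displayed content
    fingers.discard(2)                    # second finger released
    print(route_motion(5, 0, fingers))    # back to cursor control
```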
In one example embodiment the second input is associated with a zoom function. In such an embodiment the user is able to zoom in/out with a pinch or release gesture. For example, when a user has moved the virtual mouse cursor with one finger to an application view, the user is able to zoom the view in/out with a two-finger pinch gesture by bringing a second finger to the screen. After zooming in/out the user releases the second finger and continues the normal cursor movement with the first finger.
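A minimal sketch of a pinch-zoom factor computed from the two finger positions; the plain distance-ratio rule is one obvious choice, not necessarily the one intended here:

```python
import math

def finger_distance(p1: tuple[float, float], p2: tuple[float, float]) -> float:
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def zoom_factor(initial: float, current: float) -> float:
    """>1.0 zooms in (fingers spread), <1.0 zooms out (pinch)."""
    return current / initial if initial > 0 else 1.0

if __name__ == "__main__":
    d0 = finger_distance((100, 100), (140, 100))  # 40 px apart at touch-down
    d1 = finger_distance((90, 100), (150, 100))   # spread to 60 px
    print(zoom_factor(d0, d1))                    # 1.5 -> zoom in
```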
In one example embodiment the virtual mouse is de-activated as the touch input is released. In one example embodiment the controller is configured to execute an action if the mouse is de-activated while identifying, i.e. pointing at, an object. This enables a user to activate a virtual mouse 412, slide it out to an object 410 and execute an action on the object with one single and simple sliding gesture.
In one example embodiment the controller is configured to deactivate the virtual mouse without executing a command or function when the controller detects that the touch input is released.
In one example embodiment the virtual mouse is de-activated if it is brought outside the window or application area for which it was activated.
It should be noted that in
It should also be noted that the virtual mouse of this application will also find use in enabling a user to handle and/or control objects which have functions associated with them, which functions are dependent on the interaction. This makes a touch-based user interface more compatible with other user interfaces having additional input means such as a physical mouse, and content designed for one system can easily be controlled in a different system. For example, an object is displayed on a display. The object is associated with a number of functions, one being that as a cursor hovers over the object (a mouse-over event) a pop-up menu is displayed. The virtual mouse of this application enables a user interface not having a navigational input device to implement such functions and allows designers and users to differentiate between mouse down, mouse up and mouse over events in a manner that is intuitive both to implement and for a user to use.
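A minimal sketch of how a virtual mouse position can be used to synthesize the mouse-over (hover) events mentioned above on a touch-only device; the event names and the rectangular hit test are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class DisplayObject:
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px < self.x + self.w and
                self.y <= py < self.y + self.h)

def hover_events(obj: DisplayObject, old_pos, new_pos) -> list[str]:
    """Emit mouse-over / mouse-out as the cursor moves without any press."""
    was_in, is_in = obj.contains(*old_pos), obj.contains(*new_pos)
    if not was_in and is_in:
        return [f"mouse_over:{obj.name}"]   # e.g. display a pop-up menu
    if was_in and not is_in:
        return [f"mouse_out:{obj.name}"]
    return []

if __name__ == "__main__":
    link = DisplayObject("link", 50, 50, 40, 12)
    print(hover_events(link, (10, 10), (60, 55)))  # ['mouse_over:link']
    print(hover_events(link, (60, 55), (10, 10)))  # ['mouse_out:link']
```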
The apparatus 500 comprises a touch display 503 on which two objects 510a and 510b are displayed. Also displayed is an icon 514 for activating a virtual mouse.
In this embodiment a controller is configured to detect a gesture originating in the virtual mouse icon 514 and in response thereto activate a virtual mouse 512. The controller is configured to receive control input for the virtual mouse as has been described with reference to
This provides a user with an intuitive starting point for activating the virtual mouse 512.
In one example embodiment a controller is configured to activate a virtual mouse 512 upon detection of any touch input on the icon 514, and in one example embodiment upon detection of a tap on the icon 514. In one example embodiment the virtual mouse 512 is displayed adjacent the icon 514 when activated. In one example embodiment the virtual mouse 512 is displayed in the middle of the display 503 when activated.
In one example embodiment the controller is configured to de-activate the virtual mouse 512 if it is brought back to the icon 514, that is by bringing it back to point A.
In one example embodiment the controller is configured to de-activate the virtual mouse 512 upon detection of further input on the icon 514.
In one example embodiment the controller is configured to de-activate the virtual mouse 512 in one of the ways described with reference to
In
It should be noted that the placement of the icon 514 is only for illustrative purposes and the icon 514 may be placed in other positions on the display 503 in different embodiments. In one example embodiment it is part of a toolbar 520. In one example embodiment the placement of the icon 514 is dependent on an application being executed. In one example embodiment the icon is displayed according to a context of an application. For example, the icon 514 is only displayed if any actions can be undertaken with a virtual mouse 512.
The apparatus 600 comprises a touch display 603 on which content 620 is displayed. In this example the content 620 has a graphical extent exceeding the resolution of the touch display 603, which in this example means that the full content 620 cannot be displayed at once. This is indicated in the figure by the content 620 extending outside the touch display 603. This is only for illustrative purposes, as a skilled reader would realize, and in an implementation the portion of the content 620 extending outside the touch display 603 would not be visible.
In
A controller is configured to arrange at least one scroll command portion 615 along at least one side of the touch display 603.
In this example only one scroll command portion 615 is shown along the top edge of the display 603.
In one example embodiment the scroll command portions 615 are not visible.
In one example embodiment the scroll command portions 615 are marked. In one example embodiment they are marked by being shaded.
In one example embodiment a scroll command portion 615 is marked when a controller determines that a virtual mouse is located in that scroll command portion 615.
A controller is configured to determine whether a virtual mouse 612 is located within a scroll command portion 615 or not. If it is determined that a virtual mouse 612 is located within a scroll command portion 615 the controller is configured to scroll the content 620 in response thereto and in a direction corresponding to the location of the scroll command portion 615.
In
In one example embodiment the controller is configured to determine that the virtual mouse 612 is located within a scroll command portion 615 if the virtual mouse 612 at least partially overlaps the scroll command portion 615.
c shows an apparatus as in
A controller is configured to determine whether a virtual mouse 612 is located in any of the scroll command portions 615a, b, c and d and, if so, to scroll the content accordingly.
In this example the controller is configured to scroll the content 620 downwards if the virtual mouse 612 is located within the upper scroll command portion 615a.
In this example the controller is configured to scroll the content 620 leftwards if the virtual mouse 612 is located within the right scroll command portion 615b.
In this example the controller is configured to scroll the content 620 upwards if the virtual mouse 612 is located within the lower scroll command portion 615c.
In this example the controller is configured to scroll the content 620 rightwards if the virtual mouse 612 is located within the left scroll command portion 615d.
The directions used are those perceived when the apparatus is viewed from the front as displayed in
In one example embodiment the width of the scroll command portion 615 is set in accordance with the width of the stylus used.
In one example embodiment the width of the scroll command portion 615 is fixed to a preset value independent of the width of the stylus used.
In one example embodiment the width of the scroll command portion 615 is set in proportion to the available display size and in one example embodiment to the width of an application window being displayed (not shown).
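A minimal sketch of the scroll command portions described above, with a configurable band width along the lines of the width embodiments just mentioned; the sign convention and default width are assumptions:

```python
# Edge bands of configurable width; the content scrolls opposite to the
# edge (cursor in the upper portion scrolls the content downwards, etc.),
# matching the four directions described above.

def scroll_direction(cx: float, cy: float,
                     width: int, height: int,
                     band: int = 20) -> tuple[int, int]:
    """Return (dx, dy) content scroll per tick; (0, 0) if outside bands."""
    dx = dy = 0
    if cy < band:                 # upper portion -> content downwards
        dy = +1
    elif cy >= height - band:     # lower portion -> content upwards
        dy = -1
    if cx >= width - band:        # right portion -> content leftwards
        dx = -1
    elif cx < band:               # left portion  -> content rightwards
        dx = +1
    return (dx, dy)

if __name__ == "__main__":
    print(scroll_direction(160, 5, 320, 240))    # (0, 1): scroll down
    print(scroll_direction(315, 120, 320, 240))  # (-1, 0): scroll left
    print(scroll_direction(160, 120, 320, 240))  # (0, 0): no scrolling
```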
In one example embodiment an apparatus has a foldable display or alternatively two displays arranged on opposite sides of the apparatus. Such an apparatus will have a first touch display area on a front face of the apparatus and a second touch display area on a back face of the apparatus. In such an embodiment a controller may be configured to receive touch input on the second touch display area and in response thereto display a virtual mouse on the first touch area. The controller may further be configured to receive control input for the virtual mouse through the second touch area to control the virtual mouse on the first touch area. In one example embodiment the first touch area is not touch sensitive but merely a display.
Such embodiments enable a user to control a virtual mouse on a front display (portion) by making touch input on a back side of an apparatus.
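A minimal sketch of mapping touch input on the back face to cursor coordinates on the front face; the horizontal mirroring rule is an assumption about how the two coordinate frames relate, not something specified by this application:

```python
# Touch coordinates reported by the back display area (in its own frame,
# as seen from the back) are mirrored horizontally so that the cursor on
# the front display moves the way the finger does when seen from the front.

def back_to_front(x_back: float, y_back: float,
                  width: int) -> tuple[float, float]:
    """Map a touch on the back face to cursor coordinates on the front."""
    return (width - x_back, y_back)  # mirror left/right, keep up/down

if __name__ == "__main__":
    # A touch near the user's right on the back face should place the
    # cursor near the right of the front display.
    print(back_to_front(20.0, 100.0, width=320))  # (300.0, 100.0)
```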
The various aspects of what is described above can be used alone or in various combinations. The teaching of this application may be implemented by a combination of hardware and software, but can also be implemented in hardware or software alone. The teaching of this application can also be embodied as computer readable code on a computer readable medium. It should be noted that the teaching of this application is not limited to use in mobile communication terminals such as mobile phones, but can equally well be applied in Personal Digital Assistants (PDAs), game consoles, media players, personal organizers, electronic dictionaries, computers or any other device designed for displaying content on a small touch display.
The teaching of the present application has numerous advantages. Different embodiments or implementations may yield one or more of the following advantages. It should be noted that this is not an exhaustive list and there may be other advantages which are not described herein. For example, one advantage of the teaching of this application is that a user is offered improved control of small objects being displayed on a touch display.
Although the teaching of the present application has been described in detail for purpose of illustration, it is understood that such detail is solely for that purpose, and variations can be made therein by those skilled in the art without departing from the scope of the teaching of this application.
For example, while the teaching of the present application has been described in terms of a mobile phone, it should be appreciated that the teachings of the present application may also be applied to other types of electronic devices, such as media players, palmtops, game consoles, digital cameras, electronic dictionaries and so on. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the teachings of the present application.
Features described in the preceding description may be used in combinations other than the combinations explicitly described.
Whilst endeavouring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.
The term “comprising” as used in the claims does not exclude other elements or steps. The term “a” or “an” as used in the claims does not exclude a plurality. A unit or other means may fulfill the functions of several units or means recited in the claims.