Mobile communication terminal and method

Information

  • Publication Number
    20060262146
  • Date Filed
    June 22, 2005
  • Date Published
    November 23, 2006
Abstract
A method is shown for moving focus from a first user interface element to a second user interface element shown on a display of a mobile communication terminal, said mobile communication terminal further comprising an input device. The method comprises the steps of: detecting a directional input via said input device, said input indicating a desired direction to move; determining a set of candidate user interface elements being eligible to receive focus; for each candidate user interface element in said set of candidate user interface elements, determining a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.
Description
FIELD OF THE INVENTION

The present invention generally relates to user interfaces of mobile communication terminals, and more particularly to changing focus of user interface elements shown on displays of mobile communication terminals.


BACKGROUND OF THE INVENTION

Mobile communication terminals have changed dramatically in the last decade. With the first 2G terminals, the only real purpose was to make normal phone calls. Now with 2.5G (GPRS), CDMA2000 and UMTS technology, mobile communication terminals not only facilitate voice communication, but also digital communication such as text and multimedia messaging, as well as browsing content provided by Internet servers.


Browser applications, also known as user agents, are responsible for rendering documents, such as HTML, SVG or SMIL documents, containing focusable user interface elements, for example links or form controls. They need to provide a method that takes input from the user, e.g. an event generated by the hardware input controller, and translates that input into an action that changes the state of the document by removing the focus from one user interface element and setting it on another.


If the user agent is running on a device that allows a pen or a mouse as an input device, the user can directly indicate the user interface element that will receive focus by simply tapping the screen, or by moving the mouse and clicking, at a position where a desired user interface element is rendered. However, if the device only has a four-way or five-way navigation key or a joystick, the user agent has the much more difficult task of determining the user interface element that will receive focus solely based on:


the currently focused user interface element,


the desired direction (given by the joystick, etc.), and


the position of the other focusable user interface elements relative to the currently focused user interface element.


Two important requirements that should be fulfilled by a user agent that provides such a method are:


the newly focused user interface element should usually be the same as the one intended by the user, and


the method should provide a certain degree of reversibility (e.g. a “left” press on the joystick, etc. followed by a “right” press should transfer the focus between the same two user interface elements).


The European patent EP 0 671 682 B1 presents a method, apparatus and computer readable storage device for positioning a cursor on one of a plurality of controls being displayed on a screen. However, the presented method is unsuitable for use in modern mobile communication terminals capable of displaying complex documents with focusable user interface elements, as the method provides a low degree of reversibility.


Consequently, there is a problem of how to manage navigation in documents displayed on a mobile communication terminal having only a limited input device.


SUMMARY OF THE INVENTION

In view of the above, an objective of the invention is to solve or at least reduce the above-identified and other problems and shortcomings with the prior art, and to provide improvements to a mobile communication terminal.


Generally, the above objectives and purposes are achieved by methods, mobile communication terminals and computer program products according to the attached independent patent claims.


A first aspect of the present invention is a method to move focus from a first user interface element to a second user interface element shown on a display of a mobile communication terminal, said mobile communication terminal further comprising an input device. The method comprises the steps of:


detecting a directional input via said input device, said input indicating a desired direction to move;


determining a set of candidate user interface elements being eligible to receive focus;


for each candidate user interface element in said set of candidate user interface elements, determining a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and


determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.


This provides a method with improved predictability of the shift of focus from a first user interface element to a second user interface element in a complex user interface environment.


In one embodiment, said mobile communication terminal further comprises a current focus position related to said first user interface element, said method comprising the further step of determining a new focus position related to said second user interface element, such that a component distance, along an axis orthogonal to said desired direction, of an absolute distance between said current focus position and said new focus position, is minimized. The focus point furthermore increases the reversibility and provides a way for the user to more predictably move focus from the first user interface element to the second user interface element.


In one embodiment, in said step of determining a new focus position, said new focus position is determined such that it is placed inside said second user interface element, at least a margin distance from any border thereof. This prevents the focus point from being placed right on the border of a user interface element, as the visible part of the user interface element is actually often placed a distance from the selectable border of the user interface element.


In one embodiment, said step of detecting an input involves detecting an input from a directional input device. It is especially important to improve predictability for users when a directional input device is employed.


In one embodiment, said step of detecting an input involves detecting an input from a device selected from the group consisting of a four way input device, a five way input device, an eight way input device, a nine way input device, a joystick, a joypad and a navigation key.


In one embodiment, said overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction is restricted to an overlap being visible on said display. This prevents longer user interface elements from always winning focus over shorter user interface elements when all user interface elements completely overlap with the currently focused user interface element.


In one embodiment, the method further comprises a step of providing a visual representation of said focus position on said display. This gives the user an indication of the position of the focus point, which allows the user to better predict the movement of the focus point. The visual representation may be a graphical symbol. This allows the user to determine the position of the focus point by means of familiar user interface symbols or icons.


In one embodiment, the method further comprises a step, after said step of detecting an input, of determining a search area in a currently displayed document in which said first and second user interface elements are included, wherein said step of determining a set of candidate user interface elements is confined to user interface elements included in said search area. The introduction of a search area decreases the required processing.


In one embodiment, said search area is a combination of a part of said document currently visible on said display and a part of said document that would be visible if said document is scrolled in said desired direction. This search area should contain all user interface items the user would expect to be able to navigate to, given the desired direction.


In one embodiment, said applied function is:

distbasic + distparallel + 2*distorthogonal − √overlap,


wherein distbasic is a Euclidean distance between a current focus position related to said first user interface element and a candidate focus position related to said candidate user interface element;


distparallel is a component along an axis parallel to said desired direction of a distance between a first point of said first user interface element utmost in said desired direction and a second point of said candidate user interface element utmost in a direction opposite said desired direction;


distorthogonal is a component along an axis orthogonal to said desired direction of a distance between a third point, which may be the same as said first point, of said first user interface element closest to said candidate user interface element and a fourth point, which may be the same as said second point, of said candidate user interface element closest to said first user interface element; and


overlap is a distance of overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction.


The formula provided gives a good level of reversibility while being reasonably simple to calculate.


In one embodiment, distparallel is determined to be 0 if said first user interface element and said candidate user interface element overlap in a direction parallel to said desired direction.


In one embodiment, distorthogonal is determined to be 0 if said first user interface element and said candidate user interface element overlap in a direction orthogonal to said desired direction.


In one embodiment, if a component of said current focus position on an axis orthogonal to said desired direction equals a component of said candidate focus position on an axis orthogonal to said desired direction, distbasic is determined to be 0.


A second aspect of the present invention is a mobile communication terminal comprising a display and an input device, said terminal being configured to allow movement of focus from a first user interface element to a second user interface element shown on said display, said terminal furthermore comprising:


means for detecting a directional input via said input device, said input indicating a desired direction to move;


means for determining a set of candidate user interface elements being eligible to receive focus;


means for determining, for each candidate user interface element in said set of candidate user interface elements, a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and


means for determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.


This provides a mobile communication terminal with improved predictability of the shift of focus from a first user interface element to a second user interface element in a complex user interface environment.


A third aspect of the present invention is a computer program product, directly loadable into a memory of a digital computer, comprising software code portions for performing a method according to the first aspect. This provides a computer program product, when executed, providing improved predictability of the shift of focus from a first user interface element to a second user interface element in a complex user interface environment.


Other objectives, features and advantages of the present invention will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings.




BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the invention will now be described in more detail, with reference to the enclosed drawings.



FIG. 1 is a perspective view of a mobile communication terminal in the form of a pocket computer according to one embodiment of the present invention.



FIG. 2 illustrates a computer network environment in which the pocket computer of FIG. 1 advantageously may be used for providing wireless access for the user to network resources and remote services.



FIG. 3 is a schematic block diagram of the pocket computer according to the previous drawings.



FIG. 4 illustrates a web browser showing content with hyperlinks.



FIG. 5 illustrates a user agent behavior as it would appear to a user navigating in a web page rendered on a display of a mobile terminal according to an embodiment of the present invention.



FIG. 6 illustrates a search area in an embodiment of the present invention.



FIG. 7 illustrates an exemplary behavior of movement of the focus position.



FIG. 8 illustrates a movement of a focus position when content is scrolled in an embodiment of the present invention.



FIG. 9 illustrates a movement of a focus position to a focusable user interface element in an image in an embodiment of the present invention.



FIGS. 10A and 10B illustrate the use of a distance function in an embodiment of the present invention.



FIG. 11 shows a flow chart illustrating an embodiment of the present invention.




DETAILED DESCRIPTION OF THE INVENTION


FIG. 1 is a perspective view of a mobile communication terminal in the form of a pocket computer according to one embodiment of the present invention.


The pocket computer 1 of the illustrated embodiment comprises an apparatus housing 2 and a display 3 provided at the surface of a front side 2f of the apparatus housing 2. Next to the display 3 a plurality of hardware keys 5a-d are provided, as well as a speaker 6.


More particularly, key 5a is a five-way navigation key, i.e. a key which is depressible at four different peripheral positions to command navigation in respective orthogonal directions (“up”, “down”, “left”, “right”) among information shown on the display 3, as well as depressible at a center position to command selection among information shown on the display 3. Key 5b is a cancel key, key 5c is a menu or options key, and key 5d is a home key.


At the surface of a short side 21 of the apparatus housing 2, there is provided an earphone audio terminal 7a, a mains power terminal 7b and a wire-based data interface 7c in the form of a serial USB port.



FIG. 2 illustrates a computer network environment in which the pocket computer 1 of FIG. 1 advantageously may be used for providing wireless access for the user to network resources and remote services. To allow portable use, the pocket computer 1 has a rechargeable battery. The pocket computer according to an embodiment of the invention also has at least one interface 55 (FIG. 3) for wireless access to network resources on at least one digital network. The pocket computer 1 may connect to a data communications network 32 by establishing a wireless link via a network access point 30, such as a WLAN (Wireless Local Area Network) router. The data communications network 32 may be a wide area network (WAN), such as the Internet or some part thereof, a local area network (LAN), etc. A plurality of network resources 40-44 may be connected to the data communications network 32 and are thus made available to the user 9 through the pocket computer 1. For instance, the network resources may include servers 40 with associated content 42 such as www data, wap data, ftp data, email data, audio data, video data, etc. The network resources may also include other end-user devices 44, such as personal computers.


A second digital network 26 is shown in FIG. 2 in the form of a mobile telecommunications network, compliant with any available mobile telecommunications standard such as GSM, UMTS, D-AMPS or CDMA2000. In the illustrated exemplifying embodiment, the user 9 may access network resources 28 on the mobile telecommunications network 26 through the pocket computer 1 by establishing a wireless link 10b to a mobile terminal 20, which in turn has operative access to the mobile telecommunications network 26 over a wireless link 22 to a base station 24, as is well known per se. The wireless links 10a, 10b may for instance be in compliance with Bluetooth™, WLAN (Wireless Local Area Network, e.g. as specified in IEEE 802.11), HomeRF or HIPERLAN. Thus, the interface(s) 55 will contain all the necessary hardware and software required for establishing such links, as is readily realized by a man skilled in the art.



FIG. 3 is a schematic block diagram of the pocket computer according to the previous drawings. As seen in FIG. 3, the pocket computer 1 has a controller 50 with associated memory 54. The controller is responsible for the overall operation of the pocket computer 1 and may be implemented by any commercially available CPU (Central Processing Unit), DSP (Digital Signal Processor) or any other electronic programmable logic device. The associated memory may be internal and/or external to the controller 50 and may be RAM memory, ROM memory, EEPROM memory, flash memory, hard disk, or any combination thereof.


The memory 54 is used for various purposes by the controller 50, one of them being for storing data and program instructions for various pieces of software in the pocket computer 1. The software may include a real-time operating system, drivers e.g. for a user interface 51, as well as various applications 57.


Many if not all of these applications will interact with the user 9 both by receiving data input from him, such as text or navigational input through the input device(s) 53, and by providing data output to him, such as visual output in the form of e.g. text and graphical information presented on the display 52. Non-limiting examples of applications are a www/wap browser application, a contacts application, a messaging application (email, SMS, MMS), a calendar application, an organizer application, a video game application, a calculator application, a voice memo application, an alarm clock application, a word processing application, a spreadsheet application, a code memory application, a music player application, a media streaming application, and a control panel application. GUI (graphical user interface) functionality 56 in the user interface 51 controls the interaction between the applications 57, the user 9 and the user interface elements 52, 53 of the user interface.



FIG. 4 illustrates a browser application showing content with hyperlinks. In this example, the browser application executing in the pocket computer 1 renders a text on the display 52, including a number of hyperlinks 310-313, where the hyperlink 311 is currently focused. As is known in the art, if the user activates the focused hyperlink, the browser application will instead display a new page, referred to by the activated hyperlink.



FIG. 5 illustrates a user agent behavior as it would appear to a user navigating in a web page 304 rendered on the display 52 of a mobile communication terminal according to the present invention, such as the aforesaid pocket computer. With a currently focused user interface element being a first link 302, the user first navigates to the right 305 to a new link 303 by pressing on a right side of the five-way input device 5a. Secondly, the user navigates left 306 by pressing on a left side of the five-way input device 5a, resulting in focus shifting back to the first link 302 again, i.e. the navigation is reversible as the user ends up with focus on the first link 302. A graphical symbol in the form of a hand cursor 301 is used to indicate the currently focused user interface element and can be positioned at coordinates given by a focus position. Note that user interface elements may have an arbitrary shape; in particular, they are not limited to being rectangular.



FIG. 6 illustrates a search area in an embodiment of the present invention. In this example, the user has pressed down on a directional device, such as the five-way input device 5a, resulting in the desired direction 320 being downwards. A search area 323, in which user interface elements eligible to receive focus could exist, is then determined, given a part 321 of the current document visible on the display 52 and the desired direction 320.


The search area 323 comprises the part 321 of the document currently visible on the display 52 plus a search range 322, which is the document area that would become visible as a result of scrolling the document in the desired direction 320.
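By way of a non-limiting illustration, a minimal sketch of this construction in Python could look as follows, assuming axis-aligned rectangles in document coordinates and a caller-supplied search range (e.g. one display height or width per key press). The names Rect, Direction and search_area are illustrative only and do not appear in the patent.

```python
from dataclasses import dataclass
from enum import Enum


class Direction(Enum):
    UP = 0
    DOWN = 1
    LEFT = 2
    RIGHT = 3


@dataclass
class Rect:
    x: float       # left edge, document coordinates
    y: float       # top edge, document coordinates
    width: float
    height: float


def search_area(visible: Rect, direction: Direction, search_range: float) -> Rect:
    """Combine the visible part of the document with the area that would
    become visible if the document were scrolled in the desired direction."""
    if direction is Direction.DOWN:
        return Rect(visible.x, visible.y, visible.width, visible.height + search_range)
    if direction is Direction.UP:
        return Rect(visible.x, visible.y - search_range, visible.width, visible.height + search_range)
    if direction is Direction.RIGHT:
        return Rect(visible.x, visible.y, visible.width + search_range, visible.height)
    return Rect(visible.x - search_range, visible.y, visible.width + search_range, visible.height)
```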


A set of candidate user interface elements is populated by recursively traversing a rendering tree of the browser and testing boxes in nodes of the tree for overlap with the search area 323. Those boxes that are found to be overlapping are used to get pointers to corresponding user interface elements in the rendering tree. The rendering tree nodes that correspond to focusable user interface elements are added to the set of candidate user interface elements.
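Continuing the sketch above (and reusing its Rect type), the traversal could look roughly as follows. RenderNode, its element pointer and collect_candidates are hypothetical names, since the patent does not prescribe a particular rendering-tree layout.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class RenderNode:
    box: Rect                          # bounding box in document coordinates
    element: Optional[object] = None   # focusable user interface element, if any
    children: List["RenderNode"] = field(default_factory=list)


def overlaps(a: Rect, b: Rect) -> bool:
    """True if the two rectangles overlap."""
    return (a.x < b.x + b.width and b.x < a.x + a.width and
            a.y < b.y + b.height and b.y < a.y + a.height)


def collect_candidates(node: RenderNode, area: Rect, out: list) -> None:
    """Recursively test the boxes in the rendering tree against the search
    area and collect the focusable elements whose boxes overlap it."""
    if node.element is not None and overlaps(node.box, area):
        out.append(node.element)
    for child in node.children:
        collect_candidates(child, area, out)
```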



FIG. 7 illustrates an exemplary behavior of movement of the focus position. The focus position is a point inside the area of the currently focused user interface element, displayed on the display 52. Alternatively, the focus point is not actually inside the currently focused user interface element but is positioned in close proximity outside the focused user interface element. This point is moved along the desired direction 328 (given by several user input events), and the user interface elements traversed this way are favored for receiving the focus. This adds reversibility to the navigation, as can be seen in the following example.


The focus position movement follows the navigational input given by the user: given that a user interface element 320 has focus, and the focus position is in the position 325, navigating downward 328 will cause the focus position to move in the same direction. Initially, the focus position encounters a user interface element 322, resulting in the focus position moving to a new position 326. An additional user input event indicating a desire to move further downwards analogously moves the focus position to a new position 327 in a user interface element 323. When moving upward from user interface element 323, the focus position will be determined such that the same user interface elements will receive focus (322 followed by 320). Here there is a difference from prior art solutions to the same problem: when traveling upwards from user interface element 323, the focus arrives at 322. At this point, an algorithm that relies solely on a mathematical function to compute the distance between user interface elements would most likely choose user interface element 321 as the next focus target. One reason for this can be that a geometrical center 328 of the user interface element 322 is closer to a geometrical center 329 of user interface element 321 than to the geometrical center 330 of user interface element 320. However, this would lead to a rather bad user experience, since the expectation is that 320, and not 321, would receive focus.



FIG. 8 illustrates a movement of a focus position when content is scrolled in an embodiment of the present invention. On mobile communication terminals it is quite often the case that only a small part of a document is visible through the display 52. Consequently it frequently happens that only one user interface element is currently visible, and the user has to scroll the document in order to see other user interface elements.


For example, a user triggers a scroll in the browser application along a direction 336 to the left from an original display view 335a such that the display view changes to 335b, and in response to an additional user input event, to a display view 335c. Content, such as user interface elements 331, 332 and 333, is moved correspondingly as an effect of the scrolling. Additionally, synchronized with the scrolling, the focus position moves from an original position 334a to an intermediate position 334b and finally to a position 334c. The focus position keeps, if possible, the same relative position inside the display 52.
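As a small illustration (again reusing the Rect type from the earlier sketches), the focus position can simply be translated by the scroll delta in document coordinates, which keeps its position on the display unchanged; clamping to the document bounds is an assumption for the "if possible" case, not something the patent specifies.

```python
from typing import Tuple


def scrolled_focus_position(focus_pos: Tuple[float, float],
                            scroll_dx: float, scroll_dy: float,
                            document: Rect) -> Tuple[float, float]:
    """Move the focus position together with the scrolled content so that it
    keeps, if possible, the same relative position inside the display.
    scroll_dx, scroll_dy: change of the visible area's origin in document
    coordinates."""
    x = min(max(focus_pos[0] + scroll_dx, document.x), document.x + document.width)
    y = min(max(focus_pos[1] + scroll_dy, document.y), document.y + document.height)
    return (x, y)
```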



FIG. 9 illustrates a movement of a focus position to a focusable user interface element in an image 341. This is to illustrate the situation when a focusable user interface element 341 encloses other focusable user interface elements 342-345. An example of such a situation is an HTML document where the author may place several links 342-345 inside a large image 341, which is, in itself, focusable.


The focus position is originally in a position 339 in a user interface element 340, displayed on the display 52. The user indicates a desired direction 337 of movement, in this case being downwards, resulting in the focus position moving to a new position 338. This new position can then be the starting point for applying a distance function to the user interface elements 342-345 inside the image 341 as is described in detail with reference to FIGS. 10A and 10B below. Once a user interface element is chosen, the focus position can be used again as described above.



FIGS. 10A and 10B illustrate the use of a distance function in an embodiment of the present invention.


In FIG. 10A, the currently focused user interface element 347 is indicated as being in focus by a box outline 346. The focus position is in a position 350. The user indicates a desired direction 351 to the right, resulting in the candidate user interface elements being the user interface elements 348 and 349.


In FIG. 10B, the currently focused user interface element 353 is indicated as being in focus by a box outline 352. The current focus position is in a position 356. The user indicates a desired direction 359 downwards, resulting in the candidate user interface elements being user interface elements 354 and 355. A candidate focus position 357 is determined for user interface element 354 by having a co-ordinate with a component on the y-axis corresponding to the geometrical center of the user interface element 354, and having a component on the x-axis being the same as the current focus position 356. A candidate focus position 358 is determined for user interface element 355 as having a co-ordinate with a component on the y-axis corresponding to the geometrical center of the user interface element 355, and having a component on the x-axis as close as possible to the current focus position 356, while still remaining within the user interface element 355. In this case, an optional margin is applied such that the candidate focus position 358 is positioned a margin distance from the border of the user interface element 355.
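By way of a non-limiting illustration, this placement could be sketched as follows for axis-aligned rectangular candidates, reusing the Rect and Direction types introduced earlier; the default margin value is illustrative and not one specified in the patent.

```python
from typing import Tuple


def candidate_focus_position(current: Tuple[float, float], candidate: Rect,
                             direction: Direction, margin: float = 2.0) -> Tuple[float, float]:
    """Place the candidate focus position at the candidate's geometrical
    center along the axis of movement, and as close as possible to the
    current focus position on the orthogonal axis, at least `margin`
    away from any border of the candidate element."""
    cx, cy = current
    if direction in (Direction.UP, Direction.DOWN):
        # vertical movement: y at the candidate's center, x clamped towards cx
        y = candidate.y + candidate.height / 2
        x = min(max(cx, candidate.x + margin), candidate.x + candidate.width - margin)
    else:
        # horizontal movement: x at the candidate's center, y clamped towards cy
        x = candidate.x + candidate.width / 2
        y = min(max(cy, candidate.y + margin), candidate.y + candidate.height - margin)
    return (x, y)
```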


The distance function is used to determine a weighted distance between a currently focused user interface element position and a candidate user interface element. This distance function is applied to calculate a weighted distance between each candidate user interface element and the currently focused user interface element. The new user interface element to receive focus is then determined to be the candidate user interface element with the smallest weighted distance to the currently focused user interface element. The weighted distance is calculated by means of the following formula:

distweighted = distbasic + distparallel + 2*distorthogonal − √overlap


The parameter distbasic is a Euclidean distance between the current focus position of the currently focused user interface element and the candidate focus position of the candidate user interface element. If the two positions have the same coordinate on the axis orthogonal to the desired direction, distbasic is forced to be 0. For example, in FIG. 10B, when calculating the weighted distance between the user interface elements 353 and 355, distbasic is the Euclidean distance between 356 and 358. When calculating distweighted between user interface elements 353 and 354, distbasic is forced to be 0. However, due to the presence of the other terms in the calculation of distweighted, user interface element 355 will yield a smaller distance value than 354, so the focus will correctly move to 355.


The parameter distparallel is a component along an axis parallel to the desired direction of a distance between a first point 360 of the first user interface element utmost in the desired direction, and a second point 361 of the candidate user interface element utmost in a direction opposite the desired direction. The parameter distparallel is in other words a component of the distance between the points 360 and 361, projected along an axis parallel to the desired direction. Preferably, if the first point is further along the desired direction than the second point, distparallel is set to be 0. Note that in FIG. 10B, as the currently focused user interface element 353 is rectangular, the point 360 may be chosen arbitrarily along the lower edge of the currently focused user interface element 353. Analogously, the point 361 may be chosen arbitrarily along the upper edge of the candidate user interface element 355.


The parameter distorthogonal is set to 0 if there is an overlap between the first user interface element and the candidate user interface element in a direction orthogonal to the desired direction; otherwise distorthogonal is a component along an axis orthogonal to the desired direction of a distance between a third point, which may be the same as the first point, of the first user interface element closest to the candidate user interface element and a fourth point, which may be the same as the second point, of the candidate user interface element closest to the first user interface element. For example, in FIG. 10A, distorthogonal is the vertical distance, as the desired direction is horizontal, between the point 362 of the currently focused user interface element 347 and the point 363 of the candidate user interface element 348. The parameter distorthogonal is in other words a component of the distance between the points 362 and 363, projected along an axis orthogonal to the desired direction. Note that in this case, as the currently focused user interface element is rectangular, the point 362 may be chosen arbitrarily along the upper edge of the currently focused user interface element 347. Analogously, the point 363 may be chosen arbitrarily along the lower edge of the candidate user interface element 348. The term distorthogonal is used in the calculation to compensate for the situations where a link is close along the desired direction, but very far on the orthogonal axis. In such a case, it is more natural to navigate to another link, which may be further away along the desired direction, but approximately on the same level on the other axis.


The parameter overlap is a distance of overlap between the currently focused user interface element and the candidate user interface element in a direction orthogonal to the desired direction. For example, in FIG. 10A, the overlap between the currently focused user interface element 347 and the candidate user interface element 349 is a distance 364. Candidate user interface elements are rewarded for having a high overlap with the currently focused user interface element. To prevent longer user interface elements from always winning focus over shorter user interface elements when all user interface elements completely overlap with the currently focused user interface element, a visible width may optionally be set as an upper limit for the overlap.
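Putting the four terms together, a minimal sketch of the weighted distance for axis-aligned rectangular elements (reusing the Rect and Direction types from the earlier sketches) could look as follows. The patent allows arbitrarily shaped elements, so the edge-gap computations below are a simplification, and the optional visible-width cap on the overlap is omitted.

```python
import math


def interval_overlap(a0: float, a1: float, b0: float, b1: float) -> float:
    """Length of the overlap of the intervals [a0, a1] and [b0, b1], or 0."""
    return max(0.0, min(a1, b1) - max(a0, b0))


def weighted_distance(focused: Rect, candidate: Rect,
                      focus_pos: tuple, cand_pos: tuple,
                      direction: Direction) -> float:
    vertical = direction in (Direction.UP, Direction.DOWN)

    # dist_basic: Euclidean distance between focus positions, forced to 0 when
    # both positions share the same coordinate on the orthogonal axis.
    same_orth = (focus_pos[0] == cand_pos[0]) if vertical else (focus_pos[1] == cand_pos[1])
    dist_basic = 0.0 if same_orth else math.dist(focus_pos, cand_pos)

    if vertical:
        # gap between the facing edges along the movement axis (0 if they overlap)
        if direction is Direction.DOWN:
            dist_parallel = max(0.0, candidate.y - (focused.y + focused.height))
        else:
            dist_parallel = max(0.0, focused.y - (candidate.y + candidate.height))
        # gap and overlap on the orthogonal (here: horizontal) axis
        dist_orthogonal = max(0.0, max(candidate.x - (focused.x + focused.width),
                                       focused.x - (candidate.x + candidate.width)))
        overlap = interval_overlap(focused.x, focused.x + focused.width,
                                   candidate.x, candidate.x + candidate.width)
    else:
        if direction is Direction.RIGHT:
            dist_parallel = max(0.0, candidate.x - (focused.x + focused.width))
        else:
            dist_parallel = max(0.0, focused.x - (candidate.x + candidate.width))
        dist_orthogonal = max(0.0, max(candidate.y - (focused.y + focused.height),
                                       focused.y - (candidate.y + candidate.height)))
        overlap = interval_overlap(focused.y, focused.y + focused.height,
                                   candidate.y, candidate.y + candidate.height)

    return dist_basic + dist_parallel + 2 * dist_orthogonal - math.sqrt(overlap)
```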



FIG. 11 shows a flow chart illustrating an embodiment of the present invention. As a man skilled in the art will realize, not all steps in this embodiment are required to implement the invention.


In a detect directional input step 410, a directional input signal is detected via an input device such as the five-way navigation key 5a. This input gives information about the desired direction in which the user wishes to move focus from the currently focused user interface element to a target user interface element.


In a determine search area step 411, a search area is determined, as described in detail with reference to FIG. 6 above.


In a determine a set of candidate user interface elements step 412, user interface elements of the search area are all considered candidate user interface elements and references to these are collected in a set.


In a determine a test candidate user interface element step 413, a test candidate user interface element, for which a weighted distance has not been calculated yet, is selected from the set of candidate user interface elements.


In a calculate weighted distance step 414, a weighted distance is calculated between the test candidate user interface element and the currently focused user interface element, as described in detail with reference to FIGS. 10A and 10B above.


In a conditional uncalculated user interface elements step 415, it is tested whether there are any more uncalculated user interface elements. If there are more uncalculated user interface elements, the method proceeds to the determine a test candidate user interface element step 413, otherwise the method proceeds to a determine target user interface element step 416.


In the determine target user interface element step 416, the target user interface element is determined as an element in the candidate set of user interface elements having a minimum weighted distance to the currently focused user interface element.
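As a rough end-to-end sketch, steps 411-416 could be combined as below, building on the functions from the earlier sketches; the directional input of step 410 is assumed to have already been translated into a Direction value, and each focusable element is assumed, hypothetically, to expose its bounding box as a `box` attribute.

```python
import math


def move_focus(focused_element, focus_pos, direction: Direction,
               rendering_tree_root: RenderNode, visible: Rect, search_range: float):
    """Build the search area, collect candidates, score each candidate with
    the weighted distance and pick the one with the minimum distance."""
    area = search_area(visible, direction, search_range)            # step 411
    candidates: list = []
    collect_candidates(rendering_tree_root, area, candidates)       # step 412
    best, best_dist, best_pos = None, math.inf, focus_pos
    for cand in candidates:                                         # steps 413-415
        if cand is focused_element:
            continue
        cand_pos = candidate_focus_position(focus_pos, cand.box, direction)
        dist = weighted_distance(focused_element.box, cand.box,
                                 focus_pos, cand_pos, direction)
        if dist < best_dist:
            best, best_dist, best_pos = cand, dist, cand_pos
    return best, best_pos                                           # step 416
```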


The invention has mainly been described above with reference to a number of embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims. It is to be noted that the invention may be exercised in other kinds of mobile communication terminals than the pocket computer of FIGS. 1-3, including but not limited to mobile (cellular) telephones and personal digital assistants (PDAs).

Claims
  • 1. A method to move focus from a first user interface element to a second user interface element shown on a display of a mobile communication terminal, said mobile communication terminal further comprising an input device, said method comprising the steps of: detecting a directional input via said input device, said input indicating a desired direction to move; determining a set of candidate user interface elements being eligible to receive focus; for each candidate user interface element in said set of candidate user interface elements, determining a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.
  • 2. The method of claim 1, wherein said mobile communication terminal further comprises a current focus position related to said first user interface element, said method comprising the further step of: determining a new focus position related to said second user interface element, such that a component distance, along an axis orthogonal to said desired direction, of an absolute distance between said current focus position and said new focus position, is minimized.
  • 3. The method according to claim 2, wherein in said step of determining a new focus position, said new focus position is determined such that it is placed inside said second user interface element, at least a margin distance from any border thereof.
  • 4. The method of claim 1, wherein said step of detecting an input involves detecting an input from a directional input device.
  • 5. The method of claim 4, wherein said step of detecting an input involves detecting an input from a device selected from the group consisting of a four way input device, a five way input device, an eight way input device, a nine way input device, a joystick, a joypad and a navigation key.
  • 6. The method of claim 1, wherein said overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction is restricted to an overlap being visible on said display.
  • 7. The method of claim 1, further comprising the step of: providing a visual representation of said focus position on said display.
  • 8. The method of claim 7, wherein said visual representation is a graphical symbol.
  • 9. The method of claim 1, further comprising the step, after said step of detecting an input, of: determining a search area in a currently displayed document in which said first and second user interface elements are included, wherein said step of determining a set of candidate user interface elements is confined to user interface elements included in said search area.
  • 10. The method of claim 9, wherein said search area is a combination of a part of said document currently visible on said display and a part of said document that would be visible if said document is scrolled in said desired direction.
  • 11. The method of claim 1, wherein said applied function is: distbasic + distparallel + 2*distorthogonal − √overlap.
  • 12. The method of claim 11, wherein distparallel is set to 0 if said first user interface element and said candidate user interface element overlap in a direction parallel to said desired direction.
  • 13. The method of claim 11, wherein distorthogonal is set to 0 if said first user interface element and said candidate user interface element overlap in a direction orthogonal to said desired direction.
  • 14. The method of claim 11, wherein if a component of said current focus position on an axis orthogonal to said desired direction equals a component of said candidate focus position on an axis orthogonal to said desired direction, distbasic is determined to be 0.
  • 15. A mobile communication terminal comprising a display and an input device, said terminal being configured to allow movement of focus from a first user interface element to a second user interface element shown on said display, said terminal furthermore comprising: means for detecting a directional input via said input device, said input indicating a desired direction to move; means for determining a set of candidate user interface elements being eligible to receive focus; means for determining, for each candidate user interface element in said set of candidate user interface elements, a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and means for determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.
  • 16. A computer program product, directly loadable into a memory of a digital computer, comprising software code portions for performing a method according to claim 1.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation-in-part of U.S. patent application Ser. No. 11/135,624 filed on May 23, 2005.

Continuation in Parts (1)
Number Date Country
Parent 11135624 May 2005 US
Child 11158921 Jun 2005 US