The disclosed embodiments generally relate to user interfaces and more particularly to haptic user interfaces.
User interfaces for controlling electronic devices have developed continuously since the first electronic devices appeared. Typically, displays are used for output and keypads for input, particularly in portable electronic devices.
There is, however, a problem with portable electronic devices: a user may wish to interact with the device even when it is not feasible to look at the display.
One known way to alleviate this problem is to use voice synthesis and voice recognition. With voice synthesis, the device outputs data to the user via a speaker or headphones. With voice recognition, the device interprets voice commands from the user in order to receive user input. However, there are situations in which the user wishes to remain quiet and still interact with the device.
Consequently, there is a need for an improved user interface.
In view of the above, it would be advantageous to solve, or at least reduce, these problems.
According to a first aspect of the disclosed embodiments there has been provided a method comprising: generating at least one haptic user interface component using an array of haptic elements; detecting user input applied to at least one haptic element associated with one of the at least one haptic user interface component; and executing software code associated with activation of the one of the at least one haptic user interface component.
Each of the at least one haptic user interface component may be generated with a geometrical configuration to represent the haptic user interface component in question.
The generating may involve generating a plurality of user interface components using the array of haptic elements, wherein each of the plurality of user interface components may be associated with respective software code for controlling a media controller application.
The plurality of user interface components may be associated with the actions of: pausing media, playing media, increasing volume, decreasing volume, skipping forward and skipping back.
The generating may involve generating a user interface component associated with an alert.
The generating may involve generating user interface components associated with online activity monitoring.
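By way of a non-limiting illustration, the association between haptic user interface components and the media controller actions listed above may be sketched as follows. The HapticUIComponent class, the DemoMediaPlayer stand-in and the shape names are hypothetical examples introduced here for illustration only; they are not defined by the disclosure.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class HapticUIComponent:
        name: str                        # media action this component triggers
        shape: str                       # geometrical configuration (cf. above)
        on_activate: Callable[[], None]  # software code run upon activation

    class DemoMediaPlayer:
        """Hypothetical stand-in for a media controller application."""
        def play(self): print("playing media")
        def pause(self): print("pausing media")
        def volume_up(self): print("increasing volume")
        def volume_down(self): print("decreasing volume")
        def skip_forward(self): print("skipping forward")
        def skip_back(self): print("skipping back")

    def make_media_components(player: DemoMediaPlayer) -> list[HapticUIComponent]:
        # One haptic component per media action listed above.
        return [
            HapticUIComponent("play", "triangle", player.play),
            HapticUIComponent("pause", "two vertical bars", player.pause),
            HapticUIComponent("volume_up", "plus sign", player.volume_up),
            HapticUIComponent("volume_down", "minus sign", player.volume_down),
            HapticUIComponent("skip_forward", "double triangle", player.skip_forward),
            HapticUIComponent("skip_back", "mirrored double triangle", player.skip_back),
        ]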
A second aspect of the disclosed embodiments is an apparatus comprising: a controller; an array of haptic elements; wherein the controller is arranged to generate at least one haptic user interface component using the array of haptic elements; the controller is arranged to detect user input applied to at least one haptic element associated with the user interface component; and the controller is arranged to, in response to the detection, execute software code associated with activation of the user interface component.
The apparatus may be comprised in a mobile communication terminal.
The controller may further be configured to generate each of the at least one haptic user interface component with a geometrical configuration to represent the haptic user interface component in question.
The controller may be arranged to generate a plurality of user interface components, each of which may be associated with respective software code for controlling a media controller application.
The plurality of user interface components may be associated with the actions of: pausing media, playing media, increasing volume, decreasing volume, skipping forward and skipping back.
A third aspect of the disclosed embodiments is an apparatus comprising: means for generating at least one haptic user interface component using an array of haptic elements; means for detecting user input applied to at least one haptic element associated with one of the at least one haptic user interface component; and means for executing software code associated with activation of the one of the at least one haptic user interface component.
A fourth aspect of the disclosed embodiments is a computer program product comprising software instructions that, when executed in a controller capable of executing software instructions, perform the method according to the first aspect.
A fifth aspect of the disclosed embodiments is a user interface comprising: an array of haptic elements; wherein the user interface is arranged to generate at least one haptic user interface component using the array of haptic elements; the user interface is arranged to detect user input applied to at least one haptic element associated with the user interface component; and the user interface is arranged to, in response to the detection, execute software code associated with activation of the user interface component.
Any feature of the first aspect may be applied to the second, third, fourth and fifth aspects.
Other features and advantages of the disclosed embodiments will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to “a/an/the [element, device, component, means, step, etc]” are to be interpreted openly as referring to at least one instance of the element, device, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
The aspects of the disclosed embodiments will now be described in more detail, reference being made to the enclosed drawings, in which:
FIGS. a-c are views illustrating a mobile terminal according to an embodiment.
FIGS. a-b illustrate the use of a haptic user interface for media control that can be embodied in the mobile terminal of the preceding figures.
The disclosed embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
The mobile terminals 100, 106 are connected to a mobile telecommunications network 110 through RF links 102, 108 via base stations 104, 109. The mobile telecommunications network 110 may be in compliance with any commercially available mobile telecommunications standard, such as GSM, UMTS, D-AMPS, CDMA2000, FOMA and TD-SCDMA.
The mobile telecommunications network 110 is operatively connected to a wide area network 112, which may be the Internet or a part thereof. A server 115 has a data storage 114 and is connected to the wide area network 112, as is an Internet client computer 116.
A public switched telephone network (PSTN) 118 is connected to the mobile telecommunications network 110 in a familiar manner. Various telephone terminals, including the stationary telephone 119, are connected to the PSTN 118.
A front view of an embodiment 200 of the mobile terminal 100 is illustrated in more detail in FIG. a.
FIG. b is a side view of the mobile terminal 200, where the keypad 224 can be seen again. Furthermore, parts of a haptic array 226 can be seen on the back of the mobile terminal 200. It is to be noted that the haptic array 226 does not need to be located on the back of the mobile terminal 200; the haptic array 226 can equally be located on the front face, next to the display 223, or on any of the side faces. Optionally, several haptic arrays 226 can be provided on one or more faces.
FIG. c is a back view of the mobile terminal 200. Here the haptic array 226 can be seen in more detail. This haptic array comprises a number of haptic elements 227, 228 arranged in a matrix. The state of each haptic element 227, 228 can be controlled by the controller (331, described below).
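By way of a non-limiting illustration, the haptic array 226 may be modelled as a matrix of individually controllable elements, on which a component's geometrical configuration is rendered by raising selected elements. The HapticArrayModel class and the raised/lowered state encoding below are hypothetical and serve only as an example of how a controller might drive such a matrix.

    RAISED, LOWERED = 1, 0

    class HapticArrayModel:
        """Hypothetical model of a haptic array of rows x cols elements."""
        def __init__(self, rows: int, cols: int):
            self.rows, self.cols = rows, cols
            self.state = [[LOWERED] * cols for _ in range(rows)]

        def set_element(self, row: int, col: int, state: int) -> None:
            # Corresponds to the controller driving one haptic element.
            self.state[row][col] = state

        def render_shape(self, cells) -> None:
            # Raise the elements forming a component's geometrical configuration.
            for row, col in cells:
                self.set_element(row, col, RAISED)

    # Example: a rough right-pointing "play" triangle on a 5 x 5 region.
    array = HapticArrayModel(5, 5)
    triangle = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0),
                (1, 1), (2, 1), (3, 1),
                (2, 2)]
    array.render_shape(triangle)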
The internal components, software and protocol structure of the mobile terminal 200 will now be described.
The MMI 339 also includes one or more hardware controllers, which together with the MMI drivers cooperate with the haptic array 326, the display 323/223, the keypad 324/224, as well as various other I/O devices 329 such as a microphone, a speaker, a vibrator, a ringtone generator, an LED indicator, etc. As is commonly known, the user may operate the mobile terminal through the man-machine interface thus formed. The haptic array 326 includes, or is connected to, electro-mechanical means for translating electrical control signals from the MMI 339 into mechanical control of the individual haptic elements of the haptic array 326.
The software also includes various modules, protocol stacks, drivers, etc., which are commonly designated as 337 and which provide communication services (such as transport, network and connectivity) for an RF interface 333, and optionally a Bluetooth™ interface 334 and/or an IrDA interface 335 for local connectivity. The RF interface 333 comprises an internal or external antenna as well as appropriate radio circuitry for establishing and maintaining a wireless link to a base station (e.g., the link 102 and the base station 104 described above).
The mobile terminal also has a SIM card 330 and an associated reader. As is commonly known, the SIM card 330 comprises a processor as well as local work and data memory.
Now follows a scenario presenting a user interface according to an embodiment.
FIGS. a-b illustrate the use of a haptic user interface for media control that can be embodied in the mobile terminal 200 described above.
In an initial generate haptic UI (user interface) components step 780, haptic user interface components are generated on the haptic array 226 of the mobile terminal 200, for example as described above.
In a detect user input on haptic UI component step 782, user input is detected using the haptic array. The details of this are described above.
In an execute associated code step 784, the controller executes code associated with the user input of the previous step. For example, if the user input is associated with playing music in the media player, the controller executes code for playing the music.
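By way of a non-limiting illustration, steps 780, 782 and 784 may be sketched as a simple control loop. The render_shape and poll_touched driver calls are hypothetical stand-ins for the output and input facilities of the haptic array; they are not defined by the disclosure.

    from collections import namedtuple

    # A component couples its geometry (element coordinates) with the
    # software code executed upon activation.
    Component = namedtuple("Component", ["cells", "on_activate"])

    def run_haptic_ui(array, components):
        # Step 780: generate the haptic UI components on the haptic array.
        for component in components:
            array.render_shape(component.cells)
        while True:
            # Step 782: detect user input applied to a haptic element.
            touched = array.poll_touched()  # hypothetical driver call
            for component in components:
                if touched in component.cells:
                    # Step 784: execute the associated software code.
                    component.on_activate()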
Although the invention has been described above using an embodiment in a mobile terminal, the invention is applicable to any type of apparatus that could benefit from a haptic user interface, including pocket computers, portable MP3 players, portable gaming devices, laptop computers, desktop computers, etc.
The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.