The present invention relates to a user interface, a method, and a computer program for controlling an apparatus, and to such an apparatus.
In the field of user operation of apparatuses, e.g. small handheld apparatuses such as mobile phones or portable media players, and headsets for these, there is the problem that the apparatus does not have room for input means for all the functions it provides. This can be solved by navigating menus in which parameters of the functions can be set, if the apparatus is equipped with a graphical user interface. However, this implies other problems: control of functions on which a user puts timing constraints, or operation when the user does not have the ability to look at the apparatus. Such a function is volume control. Different approaches have been provided to control volume by small dedicated keys or a sliding key (jog/shuttle knob). A problem with this is that it might either be hard for the user to use very small keys, or that the keys require too much space on the small handheld apparatus. Another problem is that the mechanical fitting of such keys can give rise to secondary problems, such as in manufacturing the apparatus, maintaining apparatus quality, or designing the apparatus. Therefore, there is a demand for an approach that overcomes at least some of these problems.
Therefore, the inventor has found an approach that is both intuitive for the user and efficient also for small apparatuses. The basic understanding behind the invention is that this is possible if the user is enabled to control functions directly, independently of menu status, by means not requiring outer user interface space. The inventor realized that a user is able to move the portable apparatus, and that this movement can be registered by the apparatus. Thus, the user can control one or more functions independently of menus and without dedicated keys.
According to a first aspect of the present invention, there is provided a user interface comprising a sensor arranged to determine a spatial change, said user interface being arranged to control at least one function, wherein the function is controlled by said determined spatial change.
The spatial change may comprise a linear movement. The spatial change may comprise a change in orientation. The function may be volume control of audio output.
The user interface may further comprise an enablement controller arranged to provide a control signal enabling control of the function. The enablement controller may be arranged to receive an enablement user input for providing the control signal. The enablement user input may be a predetermined spatial change to be determined prior to the determined spatial change used to control the function. The user interface may further comprise a further user actuatable element. The enablement user input may be a determined actuation of the further user actuatable element.
According to a second aspect of the present invention, there is provided an apparatus comprising a processor and a user interface controlled by the processor, the user interface comprising features according to the first aspect of the present invention.
The apparatus comprises a processor and a user interface connected to the processor. The user interface comprises a sensor arranged to determine a spatial change. The processor is arranged to control a function based on said determined spatial change.
The spatial change may comprise a linear movement. The spatial change may comprise a change in orientation. The function may be volume control of audio output.
The apparatus may further comprise an enablement controller arranged to provide a control signal enabling control of the function. The enablement controller may be arranged to receive an enablement user input for providing the control signal. The enablement user input may be a predetermined spatial change to be determined prior to the determined spatial change used to control the function. The apparatus may further comprise a further user actuatable element. The enablement user input may be a determined actuation of the further user actuatable element.
According to a third aspect of the present invention, there is provided a user interface method comprising determining a spatial change; and controlling a function based on the determined spatial change.
The determining of the spatial change may comprise determining a linear movement. The determining of the spatial change may comprise determining a change in orientation. The controlling of the function may comprise adjusting audio output volume.
The method may further comprise, prior to determining the spatial change, receiving an enablement user input; and providing a control signal enabling the controlling of the function. The receiving of the enablement user input may comprise detecting a predetermined spatial change prior to the determined spatial change used to control the function. The receiving of the enablement user input may comprise detecting a determined actuation of a further user actuatable element.
According to a fourth aspect of the present invention, there is provided a computer program comprising instructions, which when executed by a processor are arranged to cause the processor to perform the method according to the third aspect of the invention.
According to a fifth aspect of the present invention, there is provided a computer readable medium comprising program code, which when executed by a processor is arranged to cause the processor to perform the method according to the third aspect of the invention.
The computer readable medium comprises program code comprising instructions which when executed by a processor is arranged to cause the processor to perform determination of a spatial change; and control of a function based on the determined spatial change.
The program code instructions for determination of a spatial change may further be arranged to cause the processor to perform determination of a linear movement. The program code instructions for determination of a spatial change may further be arranged to cause the processor to perform determination of a change in orientation. The program code instructions for control of a function may further be arranged to cause the processor to perform adjustment of audio output volume.
The program code instructions may further be arranged to cause the processor to perform, prior to determination of the spatial change, reception of an enablement user input; and provision of a control signal enabling the controlling of the function. The program code instructions for reception of the enablement user input may further be arranged to cause the processor to perform detection of a predetermined spatial change prior to the determined spatial change used to control the function. The program code instructions for reception of the enablement user input may further be arranged to cause the processor to perform detection of an actuation of a further user actuatable element.
FIGS. 1a to 1c illustrate a user interface according to embodiments of the present invention.
FIG. 1a illustrates a user interface 100 according to an embodiment of the present invention. The user interface 100 is illustrated in the context of an apparatus 102, drawn with dotted lines, holding an orientation sensor 104 of the user interface 100. The user interface 100 co-operates with a processor 106, which can be a separate processor of the user interface 100, or a general processor of the apparatus 102. The orientation sensor 104 can be a force sensor arranged to determine force applied to a seismic mass 108, e.g. integrated with the sensor 104, as schematically depicted magnified in
It should be noted that an accelerometer based on gyroscopic effects, or an equivalently functioning sensor, e.g. using optics and light interference such as a ring laser gyroscope or fibre optic gyroscope, can be used, as well as a force sensor and seismic mass, to detect changes in orientation in the embodiment illustrated in
The user interfaces 100, 200 may also comprise other elements, such as keys 110, 210, means for audio input and output 112, 114, 212, 214, image acquiring means (not shown), a display 116, 216, etc., respectively. The apparatuses 102, 202 may be a mobile telephone, a personal digital assistant, a navigator, a media player, a digital camera, or any other apparatus benefiting from a user interface according to any of the embodiments of the present invention.
Examples will be demonstrated below, but in general, the directions and/or movements can either be pre-set, or be user defined. In the latter case, a training mode can be provided where the user defines the directions and/or movements.
FIGS. 3a to 3c illustrate an operation example of an apparatus 300 according to an embodiment of the present invention. The apparatus 300 can for example be a mobile phone or a headset. The example is based on using the user interface demonstrated with reference to any of
The angles of orientation will be given as a deviation (D) from a determined average orientation 302 of the present use of the apparatus, as illustrated in
Another applicable principle is to determine movements of the apparatus. This relies on the fact that the force F on the seismic mass m depends on the acceleration of the mass as F=m·a. Upon movements, the seismic mass is subject to acceleration (and deceleration) in different directions, which can be registered by the force sensor and the processor. It should be noted that an accelerometer based on gyroscopic effects, or an equivalently functioning sensor using optics and light interference to detect changes in orientation, e.g. a ring laser gyroscope or fibre optic gyroscope, can also be used. To illustrate this,
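As a non-limiting illustration of the two sensing principles above, the following Python sketch shows how static force-sensor readings can yield an orientation angle (at rest, the sensed force is gravity on the seismic mass), and how a deviation of the sensed magnitude from g indicates movement per F=m·a. The function names, axis convention, and tolerance value are illustrative assumptions, not part of the specification.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2 (illustrative constant)

def tilt_angle(ax, ay, az):
    """Estimate tilt in degrees from static force-sensor readings:
    the angle between the sensed gravity vector and the device z axis."""
    magnitude = math.sqrt(ax**2 + ay**2 + az**2)
    return math.degrees(math.acos(az / magnitude))

def is_moving(ax, ay, az, tolerance=1.0):
    """Detect movement: since F = m*a, a sensed-force magnitude that
    deviates from g indicates the apparatus is being accelerated."""
    magnitude = math.sqrt(ax**2 + ay**2 + az**2)
    return abs(magnitude - G) > tolerance
```

For example, a reading of (0, 0, 9.81) gives a tilt of 0 degrees and no movement, while a reading along a horizontal axis gives 90 degrees.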
In summary, four main ways of operation can be employed. One is where the parameter to be controlled, e.g. sound volume, is derived from an angle deviation from a reference angle. Another is where an angle deviation above a threshold angle deviation causes a stepwise increase or decrease of the parameter to be controlled, depending on whether the angle deviation is positive or negative. Yet another is where the parameter to be controlled is derived from movement, i.e. determined acceleration, e.g. by stepwise increase or decrease of the controlled parameter depending on the direction of movement. Still another is where the parameter to be controlled is derived in two steps: first, a movement indicates that a change is desired, and second, the amount of increase or decrease, depending on the direction of movement, is determined by the time the apparatus is kept in an orientation having an angle deviation above a threshold angle deviation. Different combinations of these main ways of operation can readily be employed to design the user interface.
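The four ways of operation above can be sketched in Python as follows, using sound volume on a 0-100 scale as the controlled parameter. All function names, gains, thresholds, and step sizes are illustrative assumptions chosen for the sketch, not values from the specification.

```python
def volume_from_angle(angle_dev, gain=0.5, base=50):
    # Way 1: the parameter is derived directly from the angle
    # deviation relative to a reference angle.
    return max(0, min(100, base + gain * angle_dev))

def step_from_angle(volume, angle_dev, threshold=15, step=5):
    # Way 2: a deviation beyond a threshold steps the parameter up
    # or down, depending on the sign of the deviation.
    if angle_dev > threshold:
        return min(100, volume + step)
    if angle_dev < -threshold:
        return max(0, volume - step)
    return volume

def step_from_movement(volume, accel, threshold=2.0, step=5):
    # Way 3: a determined acceleration steps the parameter up or
    # down, depending on the direction of movement.
    if accel > threshold:
        return min(100, volume + step)
    if accel < -threshold:
        return max(0, volume - step)
    return volume

def ramp_while_held(volume, angle_dev, held_seconds, threshold=15, rate=2):
    # Way 4: after a movement indicates a change is desired, the
    # parameter ramps for as long as the apparatus is held beyond
    # the threshold angle deviation.
    if abs(angle_dev) > threshold:
        direction = 1 if angle_dev > 0 else -1
        return max(0, min(100, volume + direction * rate * held_seconds))
    return volume
```

A combined design could, for instance, use way 4's enable-then-ramp behaviour with way 2's threshold test, as the closing sentence of the paragraph suggests.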
The user interface 604 comprises at least a sensor 610 arranged to determine movements and/or orientations of the apparatus 600. Output of the sensor can be handled by an optional movement/orientation processor 612, or directly by the processor 602 of the apparatus 600. Based on the output from the sensor 610, the apparatus 600 can be operated according to what has been demonstrated with reference to any of
The apparatus 600 can be a mobile phone, a portable media player, or another portable device benefiting from the user interface features described above. The apparatus 600 can also be a portable handsfree device or a headset intended to be used together with any of the mobile phone, portable media player, or other portable device mentioned above, for example being in communication with these devices via short-range radio technology, such as Bluetooth wireless technology. For headsets or portable handsfree devices, the user interface described above is particularly useful, since these devices normally are even smaller and normally are operated without any support from a graphical user interface.
To avoid unintentional control of the function due to unintentional movements of an apparatus having a user interface performing the method, enablement control of controlling the function can be performed. This can be done, e.g. prior to determining the spatial change, by receiving 704 an enablement user input, and providing 706 a control signal enabling the controlling of the function. Where no enablement user input, e.g. detection of a predetermined spatial change or an actuation of a further user actuatable element such as a key or proximity sensor, is received, the method can wait until such enablement user input is received, e.g. by conditional return 708 to the reception phase 704 of enablement user input.
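The enablement gating described above can be sketched in Python as a simple event loop: spatial changes are ignored until an enablement input is received, mirroring the conditional return 708 to the reception phase 704. The event representation and function names are illustrative assumptions for the sketch.

```python
def control_with_enablement(events, adjust):
    """Gate spatial-change control behind an enablement user input,
    so unintentional movements do not adjust the function."""
    enabled = False
    for kind, value in events:
        if kind == "enable":
            # e.g. a key press, proximity detection, or a
            # predetermined spatial change; provides the control
            # signal enabling control of the function.
            enabled = True
        elif kind == "spatial" and enabled:
            adjust(value)
        # Spatial changes arriving before enablement fall through,
        # i.e. the method waits for the enablement user input.
```

For example, feeding a spatial change before the enablement event leaves the function unadjusted, while spatial changes after it are applied.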
Upon performing the method, operation according to any of the examples given with reference to