This application claims priority under 35 USC §119 to Finnish Patent Application No. 20045149 filed on Apr. 4, 2004.
The invention relates to a device having a touch screen and to detecting the status of the touching means used to give input on the touch screen. The invention also relates to a corresponding user interface and a corresponding system. It also relates to a touch screen module and to a method for adapting a user interface to be suitable for two or more different touching states. It further relates to a computer program product and a software product for controlling the functions of a device having a touch screen and the status indication thereof.
In various electronic devices, it is known to use a touch panel or a corresponding device to detect a touch or another effect and to determine the touching point or effective point. This type of touch panel is typically placed on top of a display, in which case the arrangement is also referred to as a touch screen. The user of the electronic device can thus perform selection operations and the like by touching the surface of the touch panel at an appropriate point. The information shown on the display can thus be used in selecting the touch point. For example, selection areas are formed on the display, and information associated with each selection area is displayed in connection with it. This information can be, for example, a text indicating which function is activated in the electronic device by touching the selection area in question. The information can also be image information, such as a symbol, that indicates a function.
For touching a touch screen, it is possible to use, for example, a finger or a particular touching tool, such as a marking tool or a stylus. In many portable electronic devices, the touch screen is relatively small in size, for which reason a separate touching tool is primarily used in such applications. The use of a touching tool makes it possible to select the desired selection data from the information displayed in small size.
One problem in known arrangements is that the user cannot make a selection (i.e. touch) on the touch screen accurately without a touching tool, for example with a finger only. However, in many use situations it would be preferable to use a means other than the touching tool. One such situation is, for example, answering a call.
A solution is now presented for adapting the touch screen to being touched, for example, with a touching tool, with a finger, or in another way, which makes the touch screen easier for the user to use.
The basic idea of the invention is to detect whether or not a touching tool is used for touching the touch screen, and to change the information to be displayed on the touch screen and the user interface elements to be suitable for the touching means used.
Various touching means have different properties, including, for example, the surface area touching the screen and the touch surface. To utilize these properties, it is advantageous to tune, i.e., to calibrate, the control settings for the different touching means to be as individual as possible; these settings can be used, for example, to adapt the user interface. In one embodiment of the invention, it is possible to calibrate the user interface to be suitable for the touching means to be used. In one embodiment, the detection of the touching means and the determination of the control settings are performed during the calibration step of the touch screen, wherein the user touches the touching points determined by the device with a touching means. On the basis of the touches, the device derives information about the surface area and the touch surface of the touching means. The touching points can be freely located on the touch screen, for example in the corners and in the centre of the screen. The calibration of the touching means and the screen can be performed at different stages of use, for example when the device is taken into use. In one embodiment, the calibration can be performed at any time when the user so wishes. In one embodiment, the sizes of the user interface elements are changed to correspond to the properties of the touching means in connection with the calibration.
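Purely as an illustrative, non-limiting sketch of such a calibration step, the procedure could be expressed as follows. All names (TouchProfile, calibrate, read_touch_sample, CALIBRATION_TARGETS), the example coordinates and the scaling factor are assumptions made for this example and are not taken from the description.

```python
# Illustrative calibration sketch; the names and values are hypothetical.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TouchProfile:
    """Control settings derived for one touching means during calibration."""
    mean_contact_area_mm2: float
    min_element_size_mm: float

def calibrate(read_touch_sample, targets):
    """Ask the user to touch each target point and derive a profile.

    read_touch_sample(x, y) is assumed to block until a touch is registered
    near point (x, y) and to return the measured contact area in mm^2.
    """
    areas = [read_touch_sample(x, y) for (x, y) in targets]
    mean_area = mean(areas)
    # A broad contact (finger) calls for larger user interface elements than
    # a narrow one (stylus tip); the factor 1.5 is an arbitrary example value.
    min_size = 1.5 * (mean_area ** 0.5)
    return TouchProfile(mean_contact_area_mm2=mean_area,
                        min_element_size_mm=min_size)

# Example target points: the four corners and the centre of a 46 x 62 mm screen.
CALIBRATION_TARGETS = [(3, 3), (43, 3), (3, 59), (43, 59), (23, 31)]
```

In this sketch, the mean contact area measured at the calibration points directly determines a minimum element size, which corresponds to changing the sizes of the user interface elements in connection with the calibration.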
In one embodiment, it is detected when a particular touching tool, such as, for example, a marking tool, is in use, i.e., in the active mode, and when it is not in use, i.e. when the touching tool is, for example, in a passive or standby mode. On the basis of the mode data, the information to be displayed on the touch screen and the user interface elements are controlled to be suitable for the touching tool used.
One embodiment of the device according to the invention comprises a touch screen which reacts to the touch or a corresponding input of the touching means and on which user interface elements are displayed, as well as status means for detecting the mode of the touching means giving the input to the touch screen. The device is arranged to adapt one or more user interface elements to be displayed on the touch screen to be suitable for the touching means detected by the status means.
The detection of the touching means used can be implemented in a number of ways. One way is to detect when the touching tool is in the active mode, wherein the touching tool is defined as the touching means. The mode detection can be implemented in a variety of ways, for example by a mechanical, optical or electromagnetic sensor.
In one embodiment, in turn, the touching means is identified on the basis of the touch sensed by the touch screen. It is thus possible, for example, to display the information and the user interface elements adapted to such a touching means which was last used for touching the screen. The detection of the touching means can be implemented in a number of ways, depending primarily on the principle of operation of the touch screen.
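For a touch screen that reports the size of the sensed contact area, the identification of the touching means could be sketched, for example, as follows; the threshold value of 20 mm² and the labels are illustrative assumptions only, not values given in the description.

```python
def classify_touching_means(contact_area_mm2: float,
                            threshold_mm2: float = 20.0) -> str:
    """Classify the last sensed touch: a small contact area is taken to come
    from a touching tool (e.g. a stylus tip), a larger one from a finger."""
    return "tool" if contact_area_mm2 < threshold_mm2 else "finger"
```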
It is also possible to form the information on the type of the touching means by a separate control structure operated by the user. However, for the user, it is often more practical if the data on the touching means is produced without requiring any action by the user.
According to the invention, the settings of the touch screen are changed according to the touching means in use. The aim is to optimize the displayed information to be suitable for the touching means. In one embodiment of the invention, the size of the displayed information and of the user interface elements is changed depending on whether or not a touching tool is used. With a touching tool it is often possible to touch and/or select smaller details than, for example, with a finger, wherein it is possible to display small user interface elements when a touching tool is used and large user interface elements when a finger is used as the touching means.
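As a minimal sketch of this size adaptation, assuming arbitrary example pixel values and a hypothetical lookup, the choice of element size could look like this:

```python
# Example sizes only; actual values depend on the display and the touching means.
ELEMENT_SIZE_PX = {
    "tool": 24,    # small elements remain selectable with a stylus tip
    "finger": 48,  # larger elements reduce error touches with a finger
}

def element_size(touching_means: str) -> int:
    """Return the edge length, in pixels, used for user interface elements."""
    return ELEMENT_SIZE_PX.get(touching_means, ELEMENT_SIZE_PX["finger"])
```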
In one embodiment of the invention, different user interface elements can also be prioritized; that is, they can, for example, be arranged in an order of priority, whereby some elements can be left out when larger user interface elements are used. It is thus possible to magnify the user interface elements to be displayed according to the touching means of the user.
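One simple way to realize such prioritization, sketched here with an assumed list of (priority, element) pairs, is to show only as many of the highest-priority elements as fit on the screen at the current element size:

```python
def visible_elements(elements, screen_height_px, element_size_px):
    """elements: list of (priority, element) pairs, a lower number meaning a
    more important element; returns the elements that fit on the screen."""
    max_count = max(1, screen_height_px // element_size_px)
    ranked = sorted(elements, key=lambda item: item[0])
    return [element for _, element in ranked[:max_count]]
```

With large finger-optimized elements, fewer elements fit and the lowest-priority ones are left out; with small tool-optimized elements, all of them may be shown.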
In another embodiment, in turn, the data on the touching means is used to affect the type of applications displayed. For example, when the touching means is a pen-like touching tool, applications are displayed in which it is possible, for example, to write or draw with said touching tool. On the other hand, when a touching means other than the touching tool is used, for example a finger, it is possible to display finger-controllable applications, such as various lists.
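Sketched with purely illustrative application names and groupings, such a mapping could be as simple as:

```python
# The application names and groupings below are examples only.
APPLICATIONS_BY_MEANS = {
    "tool": ["handwritten notes", "sketchpad"],
    "finger": ["call log", "contact list", "media player"],
}

def applications_for(touching_means: str):
    """Return the applications offered for the detected touching means."""
    return APPLICATIONS_BY_MEANS.get(touching_means, [])
```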
By the arrangement according to the invention, many advantages are achieved when compared with the solutions of prior art. One embodiment of the invention makes it possible to use various touching means in an efficient way. Another embodiment, in turn, makes it possible to adapt the quantity and/or quality of the information displayed, for example to optimize the displayed information to be more suitable for the performance of the touching means.
One embodiment of the invention also improves usability, because the size of the user interface elements corresponds better to the touching means used, wherein the occurrence of error touches, and thus of error selections, is reduced. In another embodiment, calibration of the device is used to ensure that the user interface can be manipulated with sufficient accuracy, wherein the probability of error touches, which reduce the usability, is decreased. Furthermore, the calibration can also be used to ensure that the coordinates of the pixels used for drawing the image visible to the user match those of the film detecting the touch.
One embodiment of the invention makes it possible to display a large quantity of information, such as several small icons. The displaying of several small icons is, in many cases, user friendly when a touching tool is used.
The present invention is thus directed to such devices and methods as well as corresponding user interfaces, systems, touch screen modules, computer program products and software products.
The solution presented by the invention can be used in a variety of devices with a touch screen. In many applications, only some of the functions of the device are controlled with the help of the touch screen, but it is also possible to implement a device in which all the functions are controlled via the touch screen. Possible devices in which the arrangement of the invention is advantageous include mobile stations, communication devices, electronic notebooks, personal digital assistants (PDA), various combinations of the above devices, as well as other devices in which touch screens are used. The invention is also suitable for use in systems which comprise a device module with a touch screen. Thus, the device module with the touch screen can be used to control functions of the system via the touch screen. The different functions of the system can be implemented in different device elements, depending on the assembly and use of the system.
In the following, the invention will be described in more detail with reference to the appended principle drawings.
For the sake of clarity, the figures only show the details necessary for understanding the invention. The structures and details which are not necessary for understanding the invention and which are obvious for anyone skilled in the art have been omitted from the figures in order to emphasize the characteristics of the invention.
In this context, it should be mentioned that in this description a touch does not solely refer to a situation in which the touching means (the touching tool 3 or the user's finger 7) touches the surface of the touch screen 2; in some cases, the touch can also be sensed in a situation in which the touching means 3, 7 is sufficiently close to the surface of the touch screen 2 but does not touch it. Furthermore, the surface of the touch screen 2 can be provided with e.g. a protective film, in which case this protective film can be touched, or the touching means 3, 7 is brought sufficiently close to it and the touch screen 2 can sense the touch. This type of touch screen 2, not requiring a physical touch, is normally implemented according to the capacitive and/or optical principle.
The touch screen 2 is typically equipped with a touch screen controller, in which the necessary steps are taken to control the operation of the touch screen and to detect touches (or said corresponding inputs). In one embodiment, the controller of the touch screen 2 forms the coordinates of the touch point and transmits them e.g. to the control block of the electronic device 1. On the other hand, the steps required for controlling the operation of the touch screen 2 and for sensing a touch can, in some applications, also be performed in the control block of the electronic device 1, in which case a separate controller for the touch screen is not required.
In implementing the touch screen 2, it is possible to use a variety of techniques, non-limiting examples of which include touch screens based on optical detection, capacitive touch screens and resistive touch screens. In view of the present invention, the type of the touch screen 2 is not significant, nor is the principle by which the different touch points are sensed.
In one embodiment, the centre of gravity of the touching area of the finger 7 is determined, and this information is used to determine the user interface element 8 which is primarily activated by the touching area of the finger. For determining the centre of gravity of the touching area, different weights can be defined for different points of the user interface element 8. Thus, a touch on the point showing the identification data of the user interface element 8 can be allocated, for example, a high weight value, wherein such a touch is interpreted to relate to said identification data and the respective function, irrespective of other touches detected by the touch screen 2.
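As a sketch of this weighted centre-of-gravity resolution, assuming hypothetical data structures for the touch samples and for the element bounding boxes, the primarily activated element could be determined as follows:

```python
def weighted_centroid(samples):
    """samples: iterable of (x, y, weight) touch readings; readings falling on
    highly weighted regions, such as the identification text of an element,
    pull the centre of gravity towards that element."""
    samples = list(samples)
    total = sum(w for _, _, w in samples)
    cx = sum(x * w for x, _, w in samples) / total
    cy = sum(y * w for _, y, w in samples) / total
    return cx, cy

def hit_element(samples, elements):
    """elements: list of (name, x0, y0, x1, y1) bounding boxes; returns the
    name of the element containing the weighted centre of gravity, or None."""
    cx, cy = weighted_centroid(samples)
    for name, x0, y0, x1, y1 in elements:
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return name
    return None
```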
In another embodiment, in turn, the user interface is calibrated to be suitable for the touching means 3, 7 to be used. One possible way of identifying the touching means 3, 7 and determining various parameters for the control setting data is a calibration step, in which the user goes through, for example, the touching points determined by the device 1. On the basis of the touches by the touching means 3, 7, the device 1 derives information about the surface area and the touch surface of the touching means. The touching points can be freely located on the screen, for example in each corner and in the centre of the screen. By determining individual parameters for the touching means 3, 7, the user interface can be manipulated with sufficient accuracy. This, in turn, reduces or eliminates error touches, which reduce the usability. By calibration, it is also possible to ensure that the coordinates of the pixels used for drawing the image visible to the user match the coordinates of the film detecting the touch, and to correct the control values, if necessary.
Furthermore, the device may be used by several users, and it may therefore display different icons, or icons of different sizes, for different users. For example, a person with thin fingers does not need as large icons as a person with thick fingers.
The above-presented calibration of the user interface and the creation of control setting data according to the touching means 3, 7 can be carried out at different stages of use, for example when the device is taken into use. Typically, the calibration is carried out when a touching means 3, 7 is introduced whose properties differ from those of the touching means 3, 7 previously in use. In one embodiment, the calibration and the creation of control setting data can be performed at any time when the user so wishes. In one embodiment, in connection with the calibration, the values affecting the sizes of the user interface elements 8 are changed to correspond better to the properties of the touching means 3, 7.
When the touching tool 3 is set in the active mode, for example by removing the touching tool 3 from the holder 4, the user interface elements 8 are changed from the form optimized for the finger 7 to the form optimized for the touching tool 3.
The active and/or passive mode of the touching tool 3 can be detected in a number of different ways. In one embodiment, the presence of the touching tool 3 in the holder 4 is detected by a mechanical switch structure 5, wherein the status of said switch is changed depending on whether the touching tool is placed in or detached from the holder. In another embodiment, the presence of the touching tool 3 in the holder 4 is detected by an optical switch structure 5, and in a third embodiment, an electromagnetic switch structure is used to detect a change in the electromagnetic field caused by the touching tool 3. In one embodiment, the data about the position of the touching tool 3 is transmitted from the presence sensor 5 to the controller of the touch screen. The controller of the touch screen, in turn, arranges the information and the user interface elements 8 on the touch screen 2 in a form depending on the position of the touching tool (or, more generally, the mode of the touching tool).
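A minimal sketch of this chain, assuming a simplified controller interface (the class and method names are not taken from the description), could look like this:

```python
class TouchScreenController:
    """Arranges the user interface according to the mode of the touching tool."""

    def __init__(self):
        self.active_means = "finger"

    def apply_layout(self):
        # In a real device this would redraw the user interface elements 8.
        print(f"layout optimized for: {self.active_means}")

    def on_tool_presence_changed(self, tool_in_holder: bool):
        """Assumed to be called by the presence sensor 5 when the tool 3 is
        placed in or removed from the holder 4."""
        self.active_means = "finger" if tool_in_holder else "tool"
        self.apply_layout()

controller = TouchScreenController()
controller.on_tool_presence_changed(tool_in_holder=False)  # tool removed from holder
controller.on_tool_presence_changed(tool_in_holder=True)   # tool placed back in holder
```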
The means 5 for detecting the mode of the touching tool 3, such as a switch or a sensor, can be placed in several different locations in the device 1, for example in the touching tool 3, in the holder 4, and/or in the touch screen 2. Depending on the placement location and the requirements it sets, it is advantageous to select the most suitable status means 5. For example, in the touch screen, it is often advantageous to use an optical or electromagnetic sensor 5, and in the touching tool 3 and in the holder 4, it is often advantageous to use a mechanical or electromagnetic sensor. Other types of means and sensors 5 can also be used according to the present invention.
In one embodiment, the detection of the mode of the touching tool 3 is implemented with a switch structure in the touching tool 3. The switch structure may be operated by the user either consciously or without the user noticing it. Such an embodiment makes it possible, for example, that a detached touching tool which is not intended to be used for control (i.e., which is not switched on) will not cause the display to be adapted.
The above-presented active mode and passive mode are only examples of various modes which the touching tools 3 and the finger 7 may have. In addition, some embodiments include other modes, such as, for example, various standby modes and precision modes. By selecting the precision mode, for example, it is possible to affect the touching accuracy of the touching tool 3.
In the above description of some exemplary embodiments of the invention, the device 1 comprising only one touching tool 3 was used as an example. However, in one embodiment of the invention, the device 1 comprises more than one touching tool 3.
In one embodiment of the invention, there is also a function by which the user can select the desired optimization for the touch screen 2, irrespective of the touching means 3, 7 used. Thus, for example when the touching tool 3 is lost or damaged, the user can select the user interface optimized for the finger 7, even though the data from the presence sensor 5 indicates that the user interface optimized for the touching tool should be used. Thus, in one embodiment, a "not-in-use" mode is defined for the touching tool 3, wherein the data on said tool does not affect the control of the touch screen.
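Such an override could be sketched, with hypothetical parameter names, as a simple precedence rule: the user's explicit choice wins, and a tool marked "not in use" is ignored even if the sensor reports it as the touching means.

```python
from typing import Optional

def effective_means(sensor_means: str,
                    user_override: Optional[str] = None,
                    tool_not_in_use: bool = False) -> str:
    """Resolve the touching means used to adapt the user interface."""
    if user_override is not None:
        return user_override
    if tool_not_in_use and sensor_means == "tool":
        return "finger"
    return sensor_means
```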
By combining, in various ways, the modes and structures disclosed in connection with the different embodiments of the invention presented above, it is possible to produce various embodiments of the invention in accordance with the spirit of the invention. Therefore, the above-presented examples must not be interpreted as restrictive to the invention, but the embodiments of the invention may be freely varied within the scope of the inventive features presented in the claims hereinbelow.
Number | Date | Country | Kind
--- | --- | --- | ---
20045149 | Apr 2004 | FI | national