The present invention generally relates to an improved stylus suitable for touch surfaces and configured to provide dynamic controls.
To an increasing extent, touch-sensitive panels are being used for providing input data to computers, electronic measurement and test equipment, gaming devices, etc. The panel may be provided with a graphical user interface (GUI) for a user to interact with using e.g. a pointer, stylus or one or more fingers. The GUI may be fixed or dynamic. A fixed GUI may e.g. be in the form of printed matter placed over, under or inside the panel. A dynamic GUI can be provided by a display screen integrated with, or placed underneath, the panel or by an image being projected onto the panel by a projector.
For most touch systems, a user may place a finger onto the surface of a touch panel in order to register a touch. Alternatively, a stylus may be used. A stylus is typically a pen shaped object with one end configured to be pressed against the surface of the touch panel. An example of a stylus according to the prior art is shown in
Two types of stylus exist for touch systems: passive and active. An active stylus is a stylus typically comprising some form of power source and electronics to transmit a signal to the host touch system. The type of signal transmitted can vary but may include position information, pressure information, tilt information, stylus ID, stylus type, ink colour, etc. The source of power for an active stylus may include a battery, a capacitor, or an electrical field providing power via inductive coupling. Without power, an active stylus may lose some or all of its functionality.
An active stylus may be readily identified by a host system, which receives an electronic stylus ID from the active stylus and associates the stylus ID with position information relating to the contact position between the stylus and the touch surface of the host system.
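By way of illustration only, the following minimal sketch shows one way a host system might maintain this association; the class, method names, and data layout are assumptions, not part of the specification.

```python
class HostTouchSystem:
    """Hypothetical host-side bookkeeping for active styluses."""

    def __init__(self):
        # Maps electronic stylus ID -> most recent contact position (x, y).
        self.stylus_positions = {}

    def on_stylus_report(self, stylus_id, x, y):
        # Called when the host decodes a stylus transmission carrying the
        # stylus ID together with the contact position on the touch surface.
        self.stylus_positions[stylus_id] = (x, y)

    def position_of(self, stylus_id):
        return self.stylus_positions.get(stylus_id)
```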
However, styluses do not lend themselves to enhanced control functionality featuring a large number of controls, such as buttons, due to the limited surface area of a stylus on which to place the controls. Therefore, what is needed is a way of improving the control functionality of a stylus using a limited number of controls.
It is an objective of the invention to at least partly overcome one or more of the above-identified limitations of the prior art.
One or more of these objectives, as well as further objectives that may appear from the description below, are at least partly achieved by means of a method for data processing, a computer readable medium, devices for data processing, and a touch-sensing apparatus according to the independent claims, embodiments thereof being defined by the dependent claims.
Embodiments of the invention will now be described in more detail with reference to the accompanying schematic drawings.
The present invention relates to styluses and touch panels and the use of techniques for providing control of a computer device using a stylus and touch panel. Throughout the description the same reference numerals are used to identify corresponding elements.
In a preferred embodiment, tip 20 is configured to detect contact with a touch surface and generate a corresponding signal. Tip 20 comprises a contact sensor, which may be a pressure detector, a projected-capacitance sensor, or another sensor suitable for detecting the application of the stylus tip to a surface. In this embodiment, when the stylus is applied to a touch-sensitive surface, tip 20 detects the contact and signals control system 60. Similarly, the depression of either of the buttons 30 or 40 is signalled to control system 60. Control system 60 is configured to generate and transmit a signal via antenna coil 70 to a receiver in a touch-sensing system, wherein the signal is generated in dependence on at least the signal from tip 20, button 30, and/or button 40.
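A minimal sketch of this signalling, assuming a simple polled firmware model; the `ControlSystem` class, the `transmitter` abstraction, and the dictionary encoding are illustrative assumptions rather than the claimed design.

```python
class ControlSystem:
    """Illustrative stand-in for control system 60."""

    def __init__(self, transmitter):
        # The transmitter stands in for the radio stage driving antenna coil 70.
        self.transmitter = transmitter

    def update(self, tip_contact, button_30_pressed, button_40_pressed):
        # The transmitted signal is generated in dependence on the tip
        # contact sensor and the states of buttons 30 and 40.
        self.transmitter.send({
            "tip_contact": tip_contact,
            "button_30": button_30_pressed,
            "button_40": button_40_pressed,
        })
```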
In one embodiment, a computer device (not shown) may comprise a touch-sensitive surface, such as a touch pad or touch display, as is well known in the art. The computer device may be connected to a display configured to display UI components controlled by input to the touch-sensitive surface. In one embodiment, the touch-sensitive surface and display are combined to form a touch-sensitive display configured to receive a finger or stylus touch and display a corresponding user interface control. Examples of a user interface control include a paint brush or pen tip for applying digital ink to a digital canvas, an eraser for removing digital ink from a digital canvas, a select tool for selecting portions of a digital canvas, etc.
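As a sketch of how a computer device might match an input source to such a user interface control, consider the hypothetical registry below; the source identifiers and tool names are assumptions.

```python
# Hypothetical registry matching an input source to a user interface control.
UI_CONTROLS = {
    "finger": "select",         # a plain touch selects canvas content
    "stylus-1": "paint_brush",  # one stylus applies digital ink
    "stylus-2": "eraser",       # another stylus removes digital ink
}

def control_for(source_id):
    # Fall back to the select tool for unknown input sources.
    return UI_CONTROLS.get(source_id, "select")
```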
In a preferred embodiment of the sequence shown in
In one embodiment, the release of button 40 whilst tip 20 is still applied to the touch surface results in no change to the control signal. In an alternative embodiment, a third control signal is transmitted from the stylus to the computer device when button 40 is released whilst tip 20 is still applied to the touch surface. In response to receiving the third control signal, the computer device is configured to apply a second function to the user interface control matched to the stylus from the point at which the third control signal is received. In one embodiment, the second function is to select an area specified by the user interface control matched to the stylus for further manipulation.
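A minimal sketch of the device-side handling, assuming the numbered control signals above and a `ui_control` object exposing an area-selection method (both assumptions):

```python
def handle_control_signal(signal_number, ui_control):
    # Third control signal: button 40 was released while tip 20 remained on
    # the touch surface. From this point, apply the second function, here
    # assumed to be selecting the area specified by the UI control.
    if signal_number == 3:
        ui_control.begin_area_selection()
```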
In a preferred embodiment, the stylus is configured to generate a control signal corresponding to that described in the preferred embodiment of
In a preferred embodiment of the sequence shown in
In a preferred embodiment of the sequence shown in
In a preferred embodiment in which stylus 100 has two buttons, button 40 and button 30, stylus 100 is configured to transmit a fourth, fifth or sixth control signal in response to a usage sequence of button 30 corresponding to the embodiments described above in relation to
In an embodiment comprising button 40 and button 30, the computer device is configured to carry out a second action in response to receiving a sixth control signal. In one embodiment, the second action is to simulate a keypress. In one embodiment, the second action is to move to the previous page of a document displayed by the computer device.
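The second action might be bound as in the sketch below, with one hypothetical binding per embodiment; the `simulate_keypress` and `go_to_page` helpers are assumptions.

```python
# One hypothetical binding per embodiment; a real device would pick one.
def second_action_keypress(device):
    device.simulate_keypress("Page Up")  # assumed helper

def second_action_previous_page(document):
    document.go_to_page(max(0, document.current_page - 1))  # assumed API
```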
In an alternative embodiment, the stylus simply transmits the status of the contact sensor, button 30, and button 40 to the computer device. The logic needed to produce the same behaviour at the user interface control level as described above is instead handled by the computer device. In this embodiment, the computer device is configured to determine an activation of the user control from a user control signal generated by the user control and transmitted from the controller device to the computer device. Similarly, the computer device is configured to determine whether a contact between the controller device and a touch surface has occurred in dependence on a contact sensor signal generated by the contact sensor and transmitted from the controller device to the computer device. The computer device is then configured to perform at least one of the following (see the sketch after this list):
1) Apply a default function to the user interface control in response to detecting contact between the controller device and a touch surface with no corresponding activation of the user control.
2) Apply a first function to the user interface control in response to detecting an activation of the user control and subsequent contact between the controller device and a touch surface before de-activation of the user control is detected.
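A minimal sketch of this device-side logic, assuming boolean status messages from the controller device; the class and the function-application methods are assumptions.

```python
class ComputerDevice:
    """Hypothetical receiver-side logic for options 1) and 2) above."""

    def __init__(self, ui_control):
        self.ui_control = ui_control
        self.user_control_active = False

    def on_user_control_signal(self, active):
        # Status of the user control as reported by the controller device.
        self.user_control_active = active

    def on_contact_sensor_signal(self, in_contact):
        # Status of the contact sensor as reported by the controller device.
        if not in_contact:
            return
        if self.user_control_active:
            # 2) Activation preceded contact: apply the first function.
            self.ui_control.apply_first_function()
        else:
            # 1) Contact with no activation: apply the default function.
            self.ui_control.apply_default_function()
```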
In one embodiment, several user interface elements on a digital canvas may be grouped together via the following gesture: whilst holding down a button of the stylus, the stylus is applied to the touch surface at the location of each of the user interface elements in order to select them one by one. Preferably, the stylus is lifted away from the touch surface in between applications to the user interface elements. In one example, the user interface elements are post-it notes and the above process allows the selection of multiple post-it notes. When the stylus is applied to the touch surface and the button is released, the selected user interface elements are aligned in a geometric arrangement around the location of the stylus. The geometric arrangement may be a grid of the user interface elements around the stylus location. In one embodiment, the user interface elements are arranged at a default position if the user releases the button whilst the stylus is not applied to the touch surface.
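One way to compute such a grid arrangement centred on the stylus location is sketched below; the spacing, the roughly square grid shape, and the `move_to` method on the elements are assumptions.

```python
import math

def arrange_in_grid(elements, centre_x, centre_y, spacing=120):
    # Lay the selected elements out in a roughly square grid centred on the
    # stylus location.
    if not elements:
        return
    cols = math.ceil(math.sqrt(len(elements)))
    rows = math.ceil(len(elements) / cols)
    for i, element in enumerate(elements):
        row, col = divmod(i, cols)
        element.move_to(
            centre_x + (col - (cols - 1) / 2) * spacing,
            centre_y + (row - (rows - 1) / 2) * spacing,
        )
```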
The above gesture may also be connected to a specific electronic stylus ID. In this embodiment, user interface elements are selected according to the electronic stylus ID of the stylus selecting them. When the stylus having the specific electronic stylus ID is applied to the touch surface and the button is released, the user interface elements selected using that stylus ID are aligned in a grid arrangement around the location of the stylus. This feature allows two or more users to simultaneously select and group different user interface elements using the above gesture.
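Per-user behaviour can be sketched by keying each selection set on the electronic stylus ID; the data structures are assumptions, and `arrange_in_grid` refers to the sketch above.

```python
from collections import defaultdict

# Hypothetical per-stylus selection sets, keyed by electronic stylus ID.
selections = defaultdict(list)

def on_element_selected(stylus_id, element):
    # Stylus applied to a user interface element while its button is held.
    selections[stylus_id].append(element)

def on_button_released_on_surface(stylus_id, stylus_x, stylus_y):
    # Arrange only the elements selected with this stylus ID, so several
    # users can select and group different elements simultaneously.
    arrange_in_grid(selections.pop(stylus_id, []), stylus_x, stylus_y)
```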