1. Field of the Invention
The present disclosure relates to a method for defining an order of priority for handling touch-based gestures, and in particular to a method in which a user defines a set of gesture sequences that controls an order of priority for assigning gesture processing either to an application program itself or to the system of a device hosting the application program.
2. Description of Related Art
In recent years, touch-based gestures (including multi-touch gestures) have become popular. These gestures can be interpreted directly by an application program itself if the application program (or app) handles the function triggered by the corresponding gesture; otherwise, the gestures can be interpreted by the system of the device, provided the system is configured to handle such gesture functions. It is difficult, however, to distinguish whether each touch-based "swipe" operation made by a user belongs to the application program or to the system. One simple method is to let the application program interpret the gesture first and, if the gesture is not recognized by the application program, to pass it over to the system of the device for further interpretation. As can easily be seen, this conventional method has inherent limitations.
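For illustration only, the conventional "application first, then system" fallback described above may be sketched as follows (a minimal Kotlin sketch; the handler interface, the type names, and the dispatch function are assumptions introduced for this sketch and are not part of any actual platform API):

    // Minimal sketch of the conventional fallback: the application program tries to
    // interpret the gesture first, and only unrecognized gestures reach the system.
    // GestureHandler, Gesture, and all names below are illustrative assumptions.
    data class Gesture(val points: List<Pair<Float, Float>>)

    interface GestureHandler {
        // Returns true if this handler recognized and consumed the gesture.
        fun tryHandle(gesture: Gesture): Boolean
    }

    fun dispatchConventionally(
        gesture: Gesture,
        appHandler: GestureHandler,
        systemHandler: GestureHandler
    ) {
        // The application program is given the first chance to interpret the gesture.
        if (!appHandler.tryHandle(gesture)) {
            // Only when the application does not recognize the gesture is it passed
            // over to the system of the device for further interpretation.
            systemHandler.tryHandle(gesture)
        }
    }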
Usually, a touch-based smartphone comprises two major display areas: one area is configured as a main UI display area, and the other is configured as a menu area that displays a plurality of menu buttons for functional operations. In many modern smartphone designs, these two display areas are formed on the same surface, which makes it easy for the user to perform swipe gestures that extend across both display areas.
Usually, for swipe gestures or touch-based gesture swipes, the application program will only recognize the gesture when it occurs in the main UI display area. A swipe performed in the menu area would not be detected as such; instead, a press or tap operation is detected, and such a tap or press invokes an action such as "go back to previous app", "open the menu", "search", or "return to home".
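As a simple illustration of this conventional menu-area behavior, a tapped menu button may be modeled as a lookup that yields one of the actions listed above (a Kotlin sketch; the button identifiers and the mapping itself are assumptions made only to mirror the examples in the preceding paragraph):

    // Illustrative mapping from a tapped menu button to a conventional system action.
    // The button identifiers and action strings are assumptions for this sketch.
    enum class MenuButton { BACK, MENU, SEARCH, HOME }

    fun actionForMenuTap(button: MenuButton): String = when (button) {
        MenuButton.BACK   -> "go back to previous app"
        MenuButton.MENU   -> "open the menu"
        MenuButton.SEARCH -> "search"
        MenuButton.HOME   -> "return to home"
    }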
FIGS. 1a-1b show a system of an electronic device activated to pop up a new window, allowing operation in that new window, upon a user tapping on a system icon.
FIGS. 5a-5b show a left-to-right swipe defining one system gesture for invoking a pop-up window that allows the user to enter various modes according to a third embodiment of the present disclosure.
An objective of the present disclosure is to provide a method for a user to define whether a predetermined touch-based gesture operation or command should be handled by an application program itself or by the system of a device hosting the application program, thereby making the predetermined gesture operation or command provided by the user much more straightforward, intuitive, and easy to understand in actual use.
Another objective of the present disclosure is to provide a method for defining a set of gesture sequences that controls an order of priority for assigning processing either to an operating system of an electronic device hosting an application program or to the application program itself.
To achieve the above objectives, the present invention provides a method for defining a set of gesture sequences that controls an order of priority for assigning processing either to an operating system of an electronic device hosting an application program or to the application program itself, comprising the following steps: determining whether a start position is detected in a menu area, followed by detecting and determining whether the gesture sequence is a swipe; upon determining that the start position is not detected in the menu area, the gesture sequence is sent for application-level processing first; upon determining that the gesture sequence is a swipe, the gesture sequence is sent for system-level processing first; and upon determining that the gesture sequence is not a swipe, the gesture sequence is sent to the application program for application-level processing first.
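Purely for illustration, the decision flow recited above may be sketched in code as follows (a minimal Kotlin sketch following the recited step order; the type names, the menu-area test, and the swipe test are assumptions introduced for this sketch and do not limit the method):

    // Sketch of the recited priority decision. All names below are illustrative
    // assumptions; the area test and the swipe test are supplied as parameters.
    enum class ProcessingTarget { APPLICATION_FIRST, SYSTEM_FIRST }

    data class TouchPoint(val x: Float, val y: Float)
    data class GestureSequence(val points: List<TouchPoint>)  // assumed non-empty

    fun routeGesture(
        sequence: GestureSequence,
        isInMenuArea: (TouchPoint) -> Boolean,
        isSwipe: (GestureSequence) -> Boolean
    ): ProcessingTarget {
        val start = sequence.points.first()
        // Start position not detected in the menu area: application-level processing first.
        if (!isInMenuArea(start)) return ProcessingTarget.APPLICATION_FIRST
        // Start position in the menu area and the sequence is a swipe:
        // system-level processing first.
        return if (isSwipe(sequence)) ProcessingTarget.SYSTEM_FIRST
        // Otherwise (not a swipe): application-level processing first.
        else ProcessingTarget.APPLICATION_FIRST
    }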
To achieve the above objectives, in the method using a set of gesture sequences for controlling an order of priority for processing assignment, the present invention defines a start position as the first touch point(s) when a touch gesture is detected by the electronic device, and defines an end position as the final touch point(s) of the detected touch gesture. If the start position is detected or sensed as being actuated in a main UI display area, the detected gesture is acknowledged as intended to be handled by the application program itself, and such a detected gesture is called an application-level gesture. If the start position is actuated when the corresponding gesture is detected in the menu area, the detected gesture is acknowledged as intended to be handled by the system of the device; this operating mode can also be referred to as a system gesture mode, and the application program should therefore skip handling or processing the detected gesture and pass processing responsibility over to the system of the device.
By adopting the method for defining a set of gesture sequences controlling the order of priority for processing assignment of the instant disclosure, the functionality of gestures performed in the menu area is expanded beyond mere press and tap operations to also include a new "combined menu area plus main UI display area swipe" gesture operation or command. In addition, no false triggers occur, because the algorithm can effectively distinguish between the conventional "press and tap" gesture operations occurring in the menu area and the new "combined menu area plus main UI display area swipe" gesture operation.
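As one possible illustration of how this discrimination might be made, the sketch below separates a conventional press/tap in the menu area from the new combined swipe by checking whether the touch travels from the menu area into the main UI display area beyond a small movement threshold (a Kotlin sketch reusing the TouchPoint and GestureSequence types from the earlier sketch; the threshold value and helper names are assumptions):

    // Illustrative discrimination between a conventional menu-area tap/press and the
    // new "combined menu area plus main UI display area swipe". The threshold and
    // helper parameters are assumptions made for this sketch only.
    const val TAP_MOVEMENT_THRESHOLD = 24f  // assumed maximum travel (in pixels) for a tap

    fun isCombinedMenuToMainSwipe(
        sequence: GestureSequence,
        isInMenuArea: (TouchPoint) -> Boolean,
        isInMainUiArea: (TouchPoint) -> Boolean
    ): Boolean {
        val start = sequence.points.first()
        val end = sequence.points.last()
        val dx = end.x - start.x
        val dy = end.y - start.y
        val travelled = kotlin.math.sqrt(dx * dx + dy * dy)
        // A tap/press stays in the menu area and moves very little; the combined swipe
        // starts in the menu area, moves noticeably, and ends in the main UI display area.
        return isInMenuArea(start) && isInMainUiArea(end) && travelled > TAP_MOVEMENT_THRESHOLD
    }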
According to a first embodiment of the present disclosure, an application is running in an electronic device and is managed by the operating system of the electronic device. In addition, a system icon 20 is found in a system menu area 30 of the electronic device (not shown) as shown in
In an alternative embodiment of the present disclosure, gestures that fit certain patterns can be interpreted as system-level gestures.
According to the first embodiment, in a method for defining the order of priority using a set of gesture sequences for controlling the order of priority for processing assignment (that is, defining decision criteria that set a desired order for prioritizing a processing assignment based on a predefined set of gesture sequences, so as to decide whether to allocate the processing assignment to the application program itself or to the system of the electronic device hosting the application program), a start position is defined as the first touch point(s) when a touch gesture is detected by the device, and an end position is defined as the final touch point(s) of the detected touch gesture.
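As a simple illustration of these definitions, the sketch below accumulates touch points while a gesture is being performed and exposes the start position (the first recorded point) and the end position (the final recorded point); the class and method names are assumptions, and the TouchPoint type from the earlier sketch is reused:

    // Illustrative tracker for a single gesture sequence; reuses the TouchPoint type
    // from the earlier sketch. The names are assumptions, not an actual platform API.
    class GestureTracker {
        private val points = mutableListOf<TouchPoint>()

        // Called for every touch sample while the gesture is in progress.
        fun onTouch(point: TouchPoint) { points.add(point) }

        // Start position: the first touch point recorded when the gesture was detected.
        fun startPosition(): TouchPoint? = points.firstOrNull()

        // End position: the final touch point recorded when the gesture ended.
        fun endPosition(): TouchPoint? = points.lastOrNull()
    }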
A menu area 30 can be configured to operate either with or without a touch screen. When a touch screen is present, a swipe gesture can easily be detected and sensed, so it is easy to define and actuate the start position. When the menu area 30 has no touch screen support, another method is required to actuate a start position; some implementations include the following steps:
According to the above embodiments, one system gesture can also be defined by a set of gesture sequences, so that the set of gesture sequences invokes a window allowing further, more flexible system gestures within that window or reaching across that window. For example, according to a third embodiment of the present disclosure, a left-to-right swipe is a set of gesture sequences defining one system gesture; upon the user performing this set of gesture sequences, a window or new page pops up to allow the user to enter various modes, such as "M", "V", "D", and "E", as shown in
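Purely as an illustration of this third embodiment, the sketch below treats a left-to-right swipe, detected from the start and end positions, as the system gesture that opens a pop-up window offering the modes named above (a Kotlin sketch reusing the TouchPoint and GestureSequence types from the earlier sketch; the minimum travel distance and the window callback are assumptions):

    // Illustrative handling of the third-embodiment system gesture: a left-to-right
    // swipe invokes a pop-up window offering several modes. Reuses TouchPoint and
    // GestureSequence from the earlier sketch; the window callback is an assumption.
    val AVAILABLE_MODES = listOf("M", "V", "D", "E")

    fun isLeftToRightSwipe(sequence: GestureSequence, minTravel: Float = 100f): Boolean {
        val start = sequence.points.first()
        val end = sequence.points.last()
        return end.x - start.x > minTravel  // predominantly rightward motion
    }

    fun handleSystemGesture(sequence: GestureSequence, showModeWindow: (List<String>) -> Unit) {
        if (isLeftToRightSwipe(sequence)) {
            // The set of gesture sequences invokes a pop-up window in which the user
            // can enter one of the available modes.
            showModeWindow(AVAILABLE_MODES)
        }
    }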
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the present invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.