This application claims the priority benefit of Taiwan application serial No. 108130915, filed on Aug. 28, 2019. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
The invention relates to a control method of a user interface and an electronic device.
Touchpads generally support single-finger or multi-finger gestures to activate computer system operations. However, with the rapid development of notebook computers, the touchpads of notebook computers often provide different input functions. For example, in a small notebook computer, the touchpad also provides a numeric interface for entering numbers, to make up for the lack of an independent numeric keypad region on the keyboard of the small notebook. However, when a user interface with other functions is added to the touchpad, the function switching is inconvenient, and input errors are even caused by incorrectly selecting the user interface.
According to the first aspect, a controlling method of a user interface applied to an electronic device is provided. The electronic device includes a touch element and a screen, and the touch element includes a start area, a trigger area, and a track area connecting the start area and the trigger area. The controlling method of the user interface includes the following steps: entering a startup interface display mode according to a touch behavior performed on the touch element; generating continuous touch data in response to the touch behavior, and activating an animation mode according to the continuous touch data, when the touch behavior moves from the start area to the track area and an animation trigger condition is satisfied; and generating the continuous touch data in response to the touch behavior, and opening a user interface according to the continuous touch data, when the touch behavior moves from the start area to the track area and from the track area to the trigger area.
According to the second aspect, an electronic device is provided. The electronic device includes a screen, a touch element, and a processor. The touch element includes a start area, a trigger area, and a track area connecting the start area and the trigger area. The touch element is configured to generate continuous touch data in response to a touch behavior. The processor is electrically connected with the screen and the touch element. The processor enters a startup interface display mode according to the touch behavior. The processor activates an animation mode according to the continuous touch data when the touch behavior moves from the start area to the track area and an animation trigger condition is satisfied, and opens a user interface according to the continuous touch data when the touch behavior moves from the start area to the track area and from the track area to the trigger area.
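The flow described in both aspects can be pictured as a small state machine that follows the touch behavior across the three areas. The following Python sketch is illustrative only and is not part of the claimed method; all class and function names, and the callback signatures, are assumptions:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Area(Enum):
    START = auto()
    TRACK = auto()
    TRIGGER = auto()
    OUTSIDE = auto()


@dataclass
class TouchSample:
    x: float
    y: float


class GestureController:
    """Follows one touch behavior across the start, track, and trigger areas."""

    def __init__(self, classify_area, animation_condition):
        self.classify_area = classify_area              # TouchSample -> Area
        self.animation_condition = animation_condition  # list[TouchSample] -> bool
        self.samples = []
        self.visited = []

    def on_touch(self, sample: TouchSample):
        self.samples.append(sample)
        area = self.classify_area(sample)
        if not self.visited or self.visited[-1] is not area:
            self.visited.append(area)
        # start -> track, with the trigger condition met: animation mode
        if (self.visited[:2] == [Area.START, Area.TRACK]
                and self.animation_condition(self.samples)):
            self.activate_animation_mode()
        # start -> track -> trigger: open the user interface
        if self.visited[:3] == [Area.START, Area.TRACK, Area.TRIGGER]:
            self.open_user_interface()

    def activate_animation_mode(self):
        print("animation mode activated")

    def open_user_interface(self):
        print("user interface opened")
```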
In summary, a control method of a user interface and an electronic device are provided, which matches the user interface of the application with gestures without affecting the original operation, to increase user experience and enhance convenience.
These and other features, aspects and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings.
In one embodiment, the screen 18 is a display, and the user performs various kinds of touch behaviors on the touch element 22. In one embodiment, the touch element 22 is a touch panel or a touch display, which is not limited herein. In one embodiment, the screen 18 is a touch display, and the user performs various kinds of touch behaviors on the screen 18 or the touch element 22.
In an embodiment, the processor 16 is implemented using a system on chip (SoC), a microcontroller (MCU), a central processing unit (CPU), or an application specific integrated circuit (ASIC).
In an embodiment, the screen 18 displays an image, and the touch element 22 includes a start area 221, a trigger area 223, and a track area 222 connecting the start area 221 and the trigger area 223. The processor 16 enters a startup interface display mode according to the touch behavior performed on the touch element 22. In an embodiment, the start area 221, the track area 222, and the trigger area 223 are rectangular, circular, triangular, polygonal, or any other shape, which is not limited herein. In one embodiment, the touch element 22 generates touch data in response to the touch behavior. The touch data at least includes touch location information (such as touch coordinates), so that the processor 16 determines the position touched by the user's finger 26 according to the touch data to perform corresponding operations. When the touch behavior moves from the start area 221 to the track area 222 and meets an animation trigger condition, the processor 16 activates an animation mode according to the touch data.
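For illustration, area membership can be decided by simple hit-testing of the touch coordinates against each area's bounds. The sketch below is a hypothetical example; the rectangular layout and all coordinate values are assumptions, and the areas may take any of the shapes mentioned above:

```python
from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned bounds of one touchpad area (illustrative shape only)."""
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


# Hypothetical layout: start area 221 on the right edge, trigger area 223 on
# the left edge, and track area 222 connecting them. All numbers are made up.
START_AREA = Rect(left=90, top=0, right=100, bottom=60)
TRACK_AREA = Rect(left=10, top=0, right=90, bottom=60)
TRIGGER_AREA = Rect(left=0, top=0, right=10, bottom=60)


def classify(x: float, y: float) -> str:
    """Map touch location information (touch coordinates) to an area name."""
    for name, rect in (("start", START_AREA), ("track", TRACK_AREA),
                       ("trigger", TRIGGER_AREA)):
        if rect.contains(x, y):
            return name
    return "outside"
```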
When entering the second stage of the startup interface display mode, the processor 16 determines whether the touch location information of the second subsequent touch data of the continuous touch data is located in the track area 222 (step S26). The second subsequent touch data refers to the touch data generated by the touch element 22 in response to the touch behavior after entering the second stage of the startup interface display mode. When it is determined that the touch location information of the second subsequent touch data is not within the track area 222, the startup interface display mode is closed and the normal mode is returned to (step S25). When the touch location information of the second subsequent touch data is within the track area 222, the processor 16 determines whether the animation trigger condition is satisfied (step S27). When the touch location information of the second subsequent touch data is in the track area 222 but the animation trigger condition is not satisfied, the second stage of the startup interface display mode is maintained (step S23); when the touch location information of the second subsequent touch data is located in the track area 222 and the animation trigger condition is satisfied, a third stage of the startup interface display mode is entered (Stage=3) (step S28). At the third stage, the animation mode of the user interface 28 is activated.
Next, the processor 16 determines whether the touch location information of the third subsequent touch data in the continuous touch data moves from the track area 222 to the trigger area 223 (step S29). The third subsequent touch data refers to the touch data generated by the touch element 22 in response to the touch behavior after entering the third stage of the startup interface display mode. When the touch location information of the third subsequent touch data does not move from the track area 222 to the trigger area 223, the startup interface display mode is closed (step S25). When the touch location information of the third subsequent touch data moves from the track area 222 to the trigger area 223 (which indicates that the touch behavior moves from the start area 221 to the track area 222 and from the track area 222 to the trigger area 223 in sequence), the fourth stage of the startup interface display mode is entered (Stage=4) (step S30); at the fourth stage, the animation mode is stopped and the user interface 28 is fully started. At this time, a stable screen of the user interface 28 is displayed on the screen 18.
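The staged logic of steps S23 through S30 can be summarized as a single transition function. The Python sketch below is one possible reading of the flow (in particular, it assumes the third stage persists while the touch remains in the track area 222); all names and encodings are illustrative:

```python
def next_stage(stage, area, animation_condition_met):
    """One transition of the staged startup-interface logic.

    `area` is where the latest subsequent touch data lands ("track",
    "trigger", or anything else). Returns the next stage, or None when
    the startup interface display mode is closed (step S25).
    """
    if stage == 2:                      # second stage (step S23)
        if area != "track":
            return None                 # step S25: close, return to normal mode
        if animation_condition_met:     # step S27 satisfied
            return 3                    # step S28: enter third stage, animation mode
        return 2                        # condition not met: stay at second stage
    if stage == 3:                      # third stage: animation mode running
        if area == "trigger":           # step S29: moved from track to trigger
            return 4                    # step S30: fourth stage, UI fully started
        if area == "track":
            return 3                    # assumption: animation continues in the track area
        return None                     # left the track without reaching the trigger: close
    return stage                        # stages 1 and 4 are outside this sketch
```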
In one embodiment, the animation trigger condition is satisfied when both the direction and the distance of the touch behavior meet a condition, such as sliding a sufficient distance in a specific direction.
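As an illustration, such a direction-and-distance condition can be checked by comparing the touch displacement vector against a required direction and a minimum distance. The thresholds and the function name below are assumptions, not values from the specification:

```python
import math


def animation_trigger_satisfied(start, current,
                                required_direction_deg=90.0,
                                direction_tolerance_deg=30.0,
                                required_distance=40.0):
    """True when the touch has slid far enough in (roughly) the required
    direction. `start` and `current` are (x, y) touch coordinates."""
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    distance = math.hypot(dx, dy)
    if distance < required_distance:        # not a sufficient distance yet
        return False
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    deviation = min(abs(angle - required_direction_deg),
                    360.0 - abs(angle - required_direction_deg))
    return deviation <= direction_tolerance_deg  # roughly the specific direction
```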
In one embodiment, the animation mode makes the user interface 28 appear progressively on the screen 18 according to the direction and distance of the touch behavior, which is not limited herein. In other embodiments, the user interface 28 has different presentation modes according to the direction, distance, speed, or pressing force of the touch behavior when the animation mode is activated. In one embodiment, when the animation mode is activated, the user interface 28 presents a color change effect, such as changing the color of the user interface 28 from light to dark, from dark to light, or from a first color to a second color, as at least one of the direction, distance, speed, or pressing force of the touch behavior increases. In another embodiment, when the animation mode is activated, the user interface 28 presents a size change effect, such as changing the size of the user interface 28 from small to large, according to at least one of the direction, distance, speed, or pressing force of the touch behavior. In another embodiment, when the animation mode is activated, the user interface 28 presents a transparency change effect, such as changing the user interface 28 from fully transparent to semi-transparent and then from semi-transparent to opaque, as at least one of the direction, distance, speed, or pressing force of the touch behavior increases. In another embodiment, when the animation mode is activated, the user interface 28 presents a shape change effect, such as gradually changing the shape of the user interface 28 from a first shape (such as a diamond) to a second shape (such as a circle), as at least one of the direction, distance, speed, or pressing force of the touch behavior increases.
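These presentation modes all amount to mapping the progress of the touch behavior onto visual parameters. A minimal sketch, assuming a normalized progress value (for example, the slid distance divided by the track length) and illustrative color and size ranges:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t


def animation_frame(progress):
    """Map touch progress in [0, 1] to illustrative presentation parameters:
    the UI fades from fully transparent to opaque, grows from small to full
    size, and shifts from a light to a dark color as the touch advances."""
    progress = max(0.0, min(1.0, progress))
    return {
        "opacity": lerp(0.0, 1.0, progress),   # transparency change effect
        "scale": lerp(0.3, 1.0, progress),     # size change effect
        "color": tuple(int(lerp(230, 40, progress))
                       for _ in range(3)),     # color change, light to dark
    }
```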
Specifically, in an embodiment, the processor 16 includes a driving unit and a user interface processing unit corresponding to the user interface 28. When the user interface 28 is opened and the touch element 22 continuously senses the touch behavior, the processor 16 transmits the touch element coordinate information of the touch behavior to the user interface processing unit through the driving unit. Then, the processor 16 converts the touch element coordinate information into the user interface coordinate information by the user interface processing unit, and triggers the function of the interface icon 281 according to the user interface coordinate information.
In an embodiment, the screen 18 is a touch display. When the user interface 28 is activated and displayed on the screen 18, the user directly performs a second touch behavior on the screen 18, and then the touch location information of the touch data generated by the screen 18 in response to the second touch behavior is converted. The touch location information is converted from the screen coordinate information (that is, the coordinates in the coordinate system of the screen 18) to the user interface coordinate information (that is, the coordinates in the coordinate system of the user interface 28), to trigger the application function of the interface icon 281 which is selected by the second touch behavior. Specifically, when the user interface 28 is opened and displayed on the screen 18, the processor 16 transmits the screen coordinate information corresponding to the second touch behavior to the user interface application corresponding to the user interface 28 through a driving application, and then the processor 16 converts the screen coordinate information into the user interface coordinate information by the user interface application, and triggers the application function of the interface icon 281 according to the user interface coordinate information.
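In both of the above embodiments the conversion is a change of coordinate systems: normalize the touch (or screen) coordinates against their own surface, then re-scale into the user interface's coordinate system. A hypothetical sketch, assuming axis-aligned surfaces and illustrative dimensions:

```python
def to_ui_coordinates(touch_x, touch_y, touch_size, ui_origin, ui_size):
    """Convert touch-element (or screen) coordinates into user-interface
    coordinates by normalizing against the source surface and re-scaling
    into the UI's own coordinate system."""
    nx = touch_x / touch_size[0]   # normalized horizontal position in [0, 1]
    ny = touch_y / touch_size[1]   # normalized vertical position in [0, 1]
    return (ui_origin[0] + nx * ui_size[0],
            ui_origin[1] + ny * ui_size[1])


# e.g. a touch at (600, 240) on a hypothetical 1200x480 touchpad maps to the
# center of a 400x400 user interface drawn at origin (0, 0):
assert to_ui_coordinates(600, 240, (1200, 480), (0, 0), (400, 400)) == (200.0, 200.0)
```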
In one embodiment, when the user interface 28 is displayed on the screen 18, multiple interface icons 281 are presented in a circular arrangement on the screen 18. To select one of the interface icons (an interface icon 281′) at this time, the second touch behavior needs to be performed on the touch element 22 (in an embodiment, the finger 26 slides into a responding area of the interface icon 281′ on the touch element 22). When the driving application of the processor 16 receives the touch data generated by the touch element 22 in response to the second touch behavior, the corresponding interface icon 281′ is enlarged.
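For illustration, the circular arrangement and the responding-area test can be sketched as follows; the radius values and function names are assumptions:

```python
import math


def ring_layout(n_icons, center, radius):
    """Centers for n interface icons evenly spaced on a circle (the circular
    arrangement described above). All values are illustrative."""
    return [(center[0] + radius * math.cos(2 * math.pi * i / n_icons),
             center[1] + radius * math.sin(2 * math.pi * i / n_icons))
            for i in range(n_icons)]


def selected_icon(touch, positions, responding_radius=25.0):
    """Return the index of the icon whose responding area contains the touch,
    or None. The selected icon would then be drawn enlarged."""
    for i, (px, py) in enumerate(positions):
        if math.hypot(touch[0] - px, touch[1] - py) <= responding_radius:
            return i
    return None


# e.g. eight icons on a ring of radius 100 around (200, 200); a touch at
# (300, 200) falls inside the responding area of icon 0:
icons = ring_layout(8, (200, 200), 100.0)
assert selected_icon((300, 200), icons) == 0
```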
In one embodiment, when the user interface 28 is displayed on the screen 18, if no interface icon 281 is selected and the finger 26 leaves the touch element 22 (that is, the touch behavior ends), or a preset time elapses after the finger 26 leaves the touch element 22 (that is, the touch behavior ends and the preset time passes), the user interface 28 is automatically closed.
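Such an auto-close behavior can be implemented with a cancellable timer that starts when the touch ends with no icon selected. A minimal sketch (the preset time and the class name are assumptions):

```python
import threading


class AutoCloseUI:
    """Closes the user interface a preset time after the touch behavior ends,
    unless a new touch behavior arrives first (timing value is illustrative)."""

    def __init__(self, close_callback, preset_seconds=3.0):
        self.close_callback = close_callback
        self.preset_seconds = preset_seconds
        self._timer = None

    def on_touch_end(self, icon_selected: bool):
        # Start the countdown only when the touch ends with nothing selected.
        if not icon_selected:
            self._timer = threading.Timer(self.preset_seconds, self.close_callback)
            self._timer.start()

    def on_touch_begin(self):
        # A new touch cancels any pending auto-close.
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None
```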
In an embodiment, a plurality of the interface icons 281 on the user interface 28 are arranged in different shapes, not limited to the ring shape in the foregoing embodiment.
Therefore, the control method and the electronic device in this disclosure provide a convenient and intuitive way to operate multiple user interfaces on the touch element. The user gradually opens the user interface by performing a touch behavior, which optimizes the user experience.
Although the disclosure has been described in considerable detail with reference to certain preferred embodiments thereof, the disclosure is not intended to limit its scope. Persons having ordinary skill in the art may make various modifications and changes without departing from the scope of the disclosure. Therefore, the scope of the appended claims should not be limited to the description of the preferred embodiments described above.
Number | Date | Country | Kind |
---|---|---|---|
108130915 | Aug 2019 | TW | national |
Number | Name | Date | Kind |
---|---|---|---|
20100306702 | Warner | Dec 2010 | A1 |
20140071063 | Kuscher | Mar 2014 | A1 |
20140298237 | Galu, Jr. | Oct 2014 | A1 |
20150370427 | Zhang | Dec 2015 | A1 |
20160147433 | Lin | May 2016 | A1 |
Number | Date | Country |
---|---|---|
102467316 | May 2012 | CN |
102654818 | Sep 2012 | CN |
200839592 | Oct 2008 | TW |
Number | Date | Country
---|---|---
20210064229 A1 | Mar 2021 | US