The disclosure relates to a control method, and in particular, to a control method of an electronic device.
Computers provide a variety of functions that are triggered through different instructions. To operate a certain function, a user must memorize the instruction that triggers it, which is inconvenient.
The disclosure provides a control method of an electronic device. The electronic device has an input unit and a display unit. The control method comprises the following steps: displaying a user interface on the display unit, the user interface having a function event; receiving an execution command that corresponds to the function event through the input unit, and displaying a window in response to the execution command on the display unit; and receiving a disable command that corresponds to the function event through the input unit, and displaying an execution result in response to the disable command on the display unit; wherein at least one of the execution command and the disable command is a gesture.
The control method provided in the disclosure triggers a function event by receiving an execution command (for example, a specified gesture) through an input unit, and then receives a subsequent command, such as a disable command, through the input unit to perform a subsequent step. The control method of the disclosure thereby simplifies operation steps and improves convenience.
Specific embodiments of the disclosure are described in further detail below with reference to the accompanying drawings. The advantages and features of the disclosure will become more apparent from the following descriptions and claims. It is to be noted that the drawings are all in a very simplified form and are not drawn to accurate scale, but are merely used for convenience and clarity of description of the embodiments of the disclosure.
As shown in the figure, the electronic device 10 includes a display unit 12, an input unit 14, a determining unit 16, and a processing unit 18. The determining unit 16 is electrically connected to the input unit 14 to receive and evaluate commands from the input unit 14. In an embodiment, the determining unit 16 is a hardware circuit, a software program, or a combination thereof. In an embodiment, the input unit 14 is a touch pad, a touch panel, a keyboard, or a combination thereof. In an embodiment, the command from the input unit 14 is a gesture command or a key input command.
The processing unit 18 is electrically connected to the determining unit 16 and the display unit 12. The determining unit 16 determines whether the command from the input unit 14 matches a preset command, and the processing unit 18 accordingly presents a user interface, information, or an execution result corresponding to the command on the display unit 12.
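The determining and processing flow described above can be sketched in code. This is a minimal illustration only: the preset-command table, the command labels, and the function names below are assumptions, since the disclosure does not specify an implementation.

```python
# Hypothetical preset-command table mapping input commands to function events.
PRESET_COMMANDS = {
    "Ctrl+C": "copy_paste_function",
    "circle_gesture": "numerical_calculation",
}

def determine(command: str):
    """Determining unit: check the incoming command against the preset table."""
    return PRESET_COMMANDS.get(command)

def handle_command(command: str) -> str:
    """Processing unit: present the matching function event on the display,
    or treat the command as general input when no preset command matches."""
    action = determine(command)
    if action is None:
        return "general_input"
    return action
```

In this sketch, an unmatched command falls through to general input, mirroring the behavior described for unmatched gestures later in the disclosure.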
First, in step S120, a text processing interface (that is, a user interface) is displayed on the display unit 12. The text processing interface includes a plurality of function events, including at least a copy/paste function. The text processing interface is started by the electronic device 10 automatically after power on, or is started by a user. In an embodiment, the step is performed by the processing unit 18 in
Next, in step S140, an execution command corresponding to the copy/paste function is received through the input unit 14. Subsequently, in step S150, a plurality of recently copied items stored in a scrapbook (that is, a window in response to the execution command) is displayed on the display unit 12 for the user to choose and confirm. In an embodiment, the execution command is a key input command of “Ctrl+C”. In an embodiment, step S150 is performed by the processing unit 18 in
Next, in step S160, a disable command corresponding to the copy/paste function is received through the input unit 14. Subsequently, in step S170, all temporarily stored items in the scrapbook are pasted in the text processing interface (that is, an execution result in response to the disable command) and presented on the display unit 12. In an embodiment, the disable command is a gesture. In an embodiment, step S170 is performed by the processing unit 18 in
Conventionally, to paste a plurality of copied items, the scrapbook needs to be opened repeatedly so that different items can be clicked one at a time. In comparison, in this embodiment, all of the repetitive actions are integrated, so that the user pastes all contents in the scrapbook with only one gesture.
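The "paste all with one gesture" behavior of steps S140 through S170 can be illustrated as follows. The scrapbook representation, the document model, and the function names are assumptions made for the sketch.

```python
# Scrapbook of recently copied items, most recent last (hypothetical model).
scrapbook = []

def copy_item(text: str) -> None:
    """Execution command (e.g. "Ctrl+C"): store the copied item temporarily."""
    scrapbook.append(text)

def paste_all(document: list) -> None:
    """Disable command (a gesture): paste every temporarily stored item into
    the text processing interface at once, then clear the scrapbook."""
    document.extend(scrapbook)
    scrapbook.clear()
```

A single call to `paste_all` replaces the conventional cycle of reopening the scrapbook once per item.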
First, in step S222, a to-be-determined gesture is received through the input unit 14.
Next, in step S224, it is determined whether the to-be-determined gesture matches an execution gesture of the user interface or not. In an embodiment, the step is performed by the determining unit 16 in
When the to-be-determined gesture matches the execution gesture of the user interface, as shown in step S226, the user interface is displayed on the display unit 12. In an embodiment, the step is performed by the processing unit 18 in
When the to-be-determined gesture does not match the execution gesture of the user interface, as shown in step S228, the processing unit 18 considers the to-be-determined gesture as a general input gesture.
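Steps S222 through S228 amount to a simple branch on whether the received gesture matches the execution gesture. The sketch below assumes gestures are identified by string labels; the disclosure does not specify a matching criterion.

```python
# Hypothetical execution gesture associated with the user interface.
EXECUTION_GESTURE = "circle_gesture"

def on_gesture(gesture: str) -> str:
    """Step S224: determine whether the to-be-determined gesture matches
    the execution gesture of the user interface."""
    if gesture == EXECUTION_GESTURE:
        return "display_user_interface"   # step S226
    return "general_input_gesture"        # step S228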
First, in step S320, a desktop of an operating system (that is, a user interface) is displayed on the display unit 12. The desktop of the operating system includes a plurality of function events, including at least the numerical calculation function. In an embodiment, the step is performed by the processing unit 18 in
Next, in step S340, an execution command corresponding to the numerical calculation function is received through the input unit 14. Subsequently, in step S350, a calculation input box (that is, a window in response to the execution command) is displayed on the display unit 12 for the user to input. In an embodiment, the execution command is a gesture. In an embodiment, step S350 is performed by the processing unit 18 in
Next, in step S360, a disable command is received through the input unit 14. The disable command corresponds to the numerical calculation function and is triggered by, for example, an “Enter” key, indicating that the user has finished inputting. Subsequently, in step S370, a calculation result of the user input information (that is, an execution result in response to the disable command) is presented on the display unit 12. In an embodiment, step S370 is performed by the processing unit 18 in
Conventionally, different programs are used according to the different contents that are queried for, or a menu item needs to be selected first. In comparison, in the disclosure, all queries are conducted directly at the same portal, so that user operation is simplified.
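The calculation-input-box flow of steps S340 through S370 can be sketched as follows, assuming the box accepts a simple arithmetic expression. Evaluating via the `ast` module (rather than `eval`) is one safe design choice; the function name and supported operators are assumptions.

```python
import ast
import operator

# Supported binary operators for the hypothetical calculation input box.
_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv}

def _eval(node):
    """Recursively evaluate a restricted arithmetic expression tree."""
    if isinstance(node, ast.Constant):
        return node.value
    if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
        return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
    raise ValueError("unsupported expression")

def calculate(expression: str):
    """Disable command ("Enter") triggers evaluation of the input box text
    and returns the result for display (step S370)."""
    return _eval(ast.parse(expression, mode="eval").body)
```

For example, entering `2+3*4` and pressing "Enter" would display 14.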
First, in step S420, a desktop (that is, a user interface) of an operating system is displayed on the display unit 12. The desktop of the operating system includes a plurality of function events, including at least the automatic VPN connecting function. In an embodiment, the step is performed by the processing unit 18 in
Next, in step S440, an execution command corresponding to the automatic VPN connecting function is received through the input unit 14 to execute the automatic VPN connecting function. Subsequently, in step S450, the automatic VPN connecting function is executed, and information of a VPN connection status (that is, a window in response to the execution command) is displayed on the display unit 12. In an embodiment, the execution command is a gesture. In an embodiment, step S450 is performed by the processing unit 18 in
Next, in step S460, a disable command corresponding to the automatic VPN connecting function is received through the input unit 14 to break the VPN connection. Subsequently, in step S470, an execution result of the disable command, for example, a page informing that the automatic VPN connecting function is disabled, is presented on the display unit 12. The automatic VPN connecting function in step S450 is continuously executed until VPN connection succeeds or the processing unit 18 receives the disable command through the input unit 14. In an embodiment, the disable command is a gesture. In an embodiment, step S470 is performed by the processing unit 18 in
Also referring to
Subsequently, as shown in step S554, it is determined whether the connection succeeds. When the connection succeeds, the process ends. When the connection fails, the process goes to step S556.
In step S556, it is determined whether the number of connection failures exceeds a preset number, for example, three. When the number of connection failures exceeds the preset number, the process goes to step S558 to automatically change the VPN connection point, and connection is then automatically attempted with the newly selected VPN connection point. When the number of connection failures does not exceed the preset number, the process goes to step S553 to continue attempting the connection. In an embodiment, the newly selected VPN connection point is a recently recorded connection point other than the connection point that failed last time.
The process is continuously performed until VPN connection succeeds or the processing unit 18 receives the disable command on the input unit 14.
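The reconnection loop of steps S553 through S558 can be sketched as below. The `try_connect` callback, the connection-point list, and the `disable` check are assumptions standing in for the actual VPN stack and the disable command.

```python
def auto_connect(points, try_connect, max_failures=3, disable=lambda: False):
    """Keep attempting a VPN connection; after more than `max_failures`
    consecutive failures (step S556), switch to another recorded connection
    point (step S558). The loop runs until the connection succeeds or the
    disable command is received."""
    index, failures = 0, 0
    while not disable():
        if try_connect(points[index]):           # steps S553/S554
            return points[index]                 # connection succeeded
        failures += 1
        if failures > max_failures:              # step S556
            index = (index + 1) % len(points)    # step S558: change point
            failures = 0
    return None                                  # disable command received
```

With the preset number three, a point is retried four times before the process moves on to the next recorded point.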
Conventionally, a setting path of an interface is complex, and a relatively large number of operation steps are involved. In the disclosure, all paths and operation steps are integrated into gestures to simplify user operation. In addition, conventionally, when connection fails, manual changing is needed. In the disclosure, reconnection is performed automatically, and another connection point is automatically selected after a plurality of failures.
First, in step S620, a desktop of an operating system (that is, a user interface) is displayed on the display unit 12. The desktop of the operating system includes a gesture editing function for a user to edit a gesture to facilitate gesture input. In an embodiment, the step is performed by the processing unit 18 in
Next, in step S640, an execution command corresponding to the gesture editing function is received through the input unit 14. Subsequently, in step S650, an editing window (a window in response to the execution command) is displayed on the display unit 12 for the user to perform editing, for example, recording or modifying gesture information. In an embodiment, the execution command is a gesture. In an embodiment, step S650 is performed by the processing unit 18 in
Next, in step S660, a disable command corresponding to the gesture editing function is received through the input unit 14, for example, an “Enter” key or a key input command that ends recording, indicating that the user has finished recording or modifying. Subsequently, in step S670, gesture editing completion (that is, an execution result in response to the disable command) is displayed on the display unit 12. In an embodiment, in step S670, the user may directly exit the editing window, or a dialog box inquiring whether to save the editing is displayed on the display unit 12. In an embodiment, step S670 is performed by the processing unit 18 in
Conventionally, when users want to adjust specific items, different windows have to be operated one by one. In the disclosure, different operations are integrated into gestures, making it convenient for the user to operate.
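The gesture-editing flow of steps S640 through S670 can be illustrated as follows. The gesture table, the point-sequence representation of a gesture, and the `save` flag (modeling the save dialog) are assumptions.

```python
# Hypothetical table mapping gesture names to recorded point sequences.
gesture_table = {}

def record_gesture(name, points, save=True):
    """Editing window (step S650): record or modify gesture information.
    The disable command ("Enter" or end-of-recording) finishes editing;
    `save` models the dialog asking whether to keep the edit (step S670)."""
    if save:
        gesture_table[name] = list(points)
    return "editing_complete"
```

Discarding an edit (answering "no" in the save dialog) simply leaves the gesture table unchanged.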
The control method provided in the disclosure triggers a function event by receiving an execution command (for example, a specified gesture) through the input unit 14 (for example, a touchpad), and then receives a subsequent command, such as a disable command, through the input unit 14 (for example, the touchpad) to perform a subsequent operation. The control method of the disclosure thereby simplifies operation steps and improves convenience.
The foregoing descriptions are merely exemplary embodiments of the disclosure and are not intended to limit the disclosure in any way. Any person skilled in the art can make any form of equivalent replacement or modification to the technical means and technical contents disclosed in the disclosure without departing from the scope of the technical means of the disclosure, and such an equivalent replacement or modification does not depart from the contents of the technical means of the disclosure and falls within the protection scope of the disclosure.
Number | Date | Country | Kind |
---|---|---|---|
109122965 | Jul 2020 | TW | national |
This application claims the priority benefit of U.S. Provisional Application Ser. No. 62/906,466 filed on Sep. 26, 2019 and TW Application Serial No. 109122965 filed on Jul. 8, 2020. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of the specification.
Number | Date | Country
---|---|---
62906466 | Sep 2019 | US