Some computing devices, such as smart phones, may receive input from a touch-sensitive display and a physical keyboard. Thus, a user of the computing device has different ways to provide input.
Some examples of the present application are described with respect to the following figures:
Besides smart phones, more and more computing devices have the capability to receive touch input. For example, all-in-one computers and tablet computers have touch-sensitive displays to receive touch inputs. An application (implemented using processor executable instructions) may take advantage of this capability by including a feature to automatically display a virtual input device, such as a virtual keyboard, to receive touch input. An example of such an application may be a mobile application developed for a portable computing device, such as a smart phone. However, an application developed for a non-portable computing device, such as a desktop computer, may not have such a feature. A user of the application may have to manually search for and activate a virtual input device. Thus, user experience of the application may be negatively affected.
Examples described herein provide a computing device to automatically display a virtual input device in a graphical control element associated with a desktop application. For example, a non-transitory computer readable storage medium may include instructions that when executed cause a processor of a computing device to determine an active graphical control element displayed on a display of the computing device. The active graphical control element may include an input element. The instructions when executed may further cause the processor to determine whether the active graphical control element corresponds to a desktop application or a non-desktop application based on a property of the active graphical control element. The instructions when executed may further cause the processor to, in response to a determination that the active graphical control element corresponds to the desktop application, monitor the input element. The instructions when executed may further cause the processor to automatically display a virtual input device on the display based on a particular type of input event associated with the input element. Thus, user experience of the application may be enhanced.
Processor 102 may be a central processing unit (CPU), a semiconductor-based microprocessor, and/or other hardware devices suitable for retrieval and execution of instructions stored in a computer-readable storage medium. Processor 102 may fetch, decode, and execute instructions to control a process of automatically displaying a virtual input device in a graphical control element associated with a desktop application. Display 104 may be a touch-sensitive display implemented using a touchscreen. For example, display 104 may be a touch-sensitive liquid crystal display (LCD). Processor 102 may control operations of computing device 100.
During operation, a graphical control element 106 may be launched and displayed on display 104. Graphical control element 106 may be an interaction component in a graphical user interface associated with an application (implemented using instructions executable by processor 102) that is executing at computing device 100. Graphical control element 106 may provide visual representation of data to a user and may receive input from the user. For example, graphical control element 106 may be implemented as a window in a graphical user interface.
When graphical control element 106 is displayed, processor 102 may detect the presence of graphical control element 106. Processor 102 may monitor graphical control element 106 to determine whether graphical control element 106 becomes active. Graphical control element 106 may become active when graphical control element 106 receives an interaction from a user input via an input device. For example, graphical control element 106 may become active when a user of computing device 100 clicks on graphical control element 106 via a mouse. As another example, graphical control element 106 may become active when the user touches graphical control element 106 via a stylus or a finger.
In response to a determination that graphical control element 106 is active, processor 102 may determine whether graphical control element 106 corresponds to a desktop application or a non-desktop application. That is, processor 102 may determine whether graphical control element 106 is part of a desktop application or a non-desktop application. As used herein, a desktop application may be an application that lacks the ability to automatically display a virtual input device. A non-desktop application may be an application that has the ability to automatically display a virtual input device.
Processor 102 may determine whether graphical control element 106 is part of a desktop application or a non-desktop application based on a property of graphical control element 106. In some examples, the property may be an indication of an executing operating system process that is associated with the graphical control element. For example, processor 102 may query an operating system of computing device 100 to determine whether graphical control element 106 is associated with an executing operating system process. In response to a determination that graphical control element 106 is associated with an executing operating system process, processor 102 may determine that graphical control element 106 corresponds to a desktop application.
In response to a determination that graphical control element 106 is not associated with any executing operating system process, processor 102 may determine that graphical control element 106 corresponds to a non-desktop application. In some examples, in response to a determination that graphical control element 106 is not associated with any executing operating system process, processor 102 may further query the operating system to determine whether graphical control element 106 includes a particular class name that is indicative of a non-desktop application to ensure graphical control element 106 has the ability to automatically display a virtual input device. For example, the class name “Windows.UI.Core.CoreWindow” may be indicative of a type of non-desktop application called a Universal Windows Platform (UWP) application. When processor 102 determines that graphical control element 106 corresponds to a non-desktop application, processor 102 may stop monitoring graphical control element 106.
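The classification described above can be sketched as a small Python function. This is a simplified model, not the described implementation: in practice the process association and class name would come from operating system queries, and the handling of a class name that matches neither case is an assumption of this sketch.

```python
# Class name indicative of a UWP (non-desktop) application, per the example above.
UWP_CLASS_NAME = "Windows.UI.Core.CoreWindow"

def corresponds_to_desktop_application(os_process, class_name):
    """Classify a graphical control element as part of a desktop or
    non-desktop application.

    os_process: name of an executing operating system process associated
                with the element, or None if there is no association.
    class_name: the element's window class name.
    Returns True for a desktop application (one that lacks the ability to
    auto-display a virtual input device), False for a non-desktop application.
    """
    if os_process is not None:
        # An associated executing OS process indicates a desktop application.
        return True
    # No associated process: confirm the non-desktop classification by
    # checking for a class name indicative of a non-desktop application.
    if class_name == UWP_CLASS_NAME:
        return False
    # The examples leave a non-matching class name unspecified; this
    # sketch conservatively treats it as a desktop application.
    return True
```

A desktop-application result would lead the processor to keep monitoring the element, while a non-desktop result would end monitoring.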
When text input box 108 is identified, processor 102 may monitor text input box 108 to detect a particular type of input event associated with text input box 108. When the user of computing device 100 selects text input box 108 to begin providing input, processor 102 may determine a type of input event associated with text input box 108. For example, when the user selects text input box 108 via a touch input, such as by physically making contact with display 104 via a stylus or a finger, the type of input event may be a touch input event. When the user selects text input box 108 via a mouse, the type of input event may be a mouse click input event.
When the type of input event is the touch input event, processor 102 may cause virtual input device 110 to be automatically displayed on display 104 near text input box 108. As used herein, automatically displaying virtual input device 110 means virtual input device 110 is displayed without receiving an input from the user to launch a display of virtual input device 110. That is, the user does not have to select or execute another application to launch virtual input device 110. Virtual input device 110 may be any type of input device rendered or generated using processor executable instructions. In some examples, virtual input device 110 may be a virtual keyboard.
When the type of input event is the mouse click input event, processor 102 may determine whether a physical keyboard is available for use. For example, processor 102 may query the operating system to determine whether a physical keyboard is coupled to computing device 100. In response to a determination that the physical keyboard is unavailable for use (i.e., not coupled to computing device 100), processor 102 may cause virtual input device 110 to be automatically displayed near text input box 108.
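The display decision described in the last two paragraphs can be summarized as one function. This is a sketch under the assumption that the input event type and the keyboard-availability query result are already in hand; the function and parameter names are illustrative, not a real API.

```python
def should_display_virtual_input_device(event_type, physical_keyboard_attached):
    """Decide whether to automatically display the virtual input device
    when an input element of a desktop application is selected.

    event_type: "touch" for a touch input event, "mouse" for a mouse
                click input event.
    physical_keyboard_attached: result of querying the operating system
                for a physical keyboard coupled to the computing device.
    """
    if event_type == "touch":
        # A touch selection always triggers the virtual input device.
        return True
    if event_type == "mouse":
        # A mouse click triggers it only when no physical keyboard is
        # coupled to the computing device.
        return not physical_keyboard_attached
    return False
```

When the function returns True, the virtual input device would be displayed near the selected input element without further user action.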
Processor 202 may be similar to processor 102 of
Computer-readable storage medium 204 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. Thus, computer-readable storage medium 204 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, etc. In some examples, computer-readable storage medium 204 may be a non-transitory storage medium, where the term “non-transitory” does not encompass transitory propagating signals. As described in detail below, computer-readable storage medium 204 may be encoded with a series of processor executable instructions 206-212.
Active graphical control element determining instructions 206 may determine whether a graphical control element is active. For example, referring to
Active graphical control element application type determining instructions 208 may determine whether the active graphical control element corresponds to a desktop application or a non-desktop application. For example, referring to
Input element monitoring instructions 210 may monitor an input element to detect a particular type of input event associated with the input element. For example, referring to
Automatic virtual input device displaying instructions 212 may automatically display a virtual input device. For example, referring to
Method 400 may include detecting a graphical control element, at 402. For example, referring to
When there is not an indication of an executing operating system process associated with the active graphical control element, method 400 may further include determining whether a class name of the active graphical control element matches a class name of a non-desktop application, at 408. When the class name matches the class name of the non-desktop application, method 400 may return to block 402.
When there is an indication of an executing operating system process associated with the active graphical control element, method 400 may further include monitoring an input element of the active graphical control element to detect a particular type of input event, at 410. For example, referring to
Method 400 may further include determining if the input event is a touch input event, at 412. For example, referring to
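Putting the steps of method 400 together, one pass of the flow might look like the following sketch. The block numbers in the comments refer to the steps above; all names are illustrative stand-ins for the operating system queries the method would actually perform.

```python
def method_400_step(os_process, class_name, event_type, keyboard_attached):
    """One pass through method 400 for an active graphical control
    element; returns a string naming the outcome.

    os_process: executing OS process associated with the element, or None.
    class_name: the element's window class name.
    event_type: "touch" or "mouse", the detected input event type.
    keyboard_attached: whether a physical keyboard is coupled.
    """
    # Blocks 404-406: check for an executing operating system process
    # associated with the active graphical control element.
    if os_process is None:
        # Block 408: a class name matching a non-desktop application
        # means the application can auto-display its own virtual input
        # device, so monitoring returns to detection at block 402.
        if class_name == "Windows.UI.Core.CoreWindow":
            return "return to block 402"
        # The text leaves the non-matching case open.
        return "unspecified"
    # Block 410: monitor the input element for a particular input event.
    # Block 412: branch on the detected input event type.
    if event_type == "touch":
        return "display virtual input device"
    if event_type == "mouse" and not keyboard_attached:
        return "display virtual input device"
    return "no display"
```

A driver loop would call this on each newly active graphical control element and act on the returned outcome.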
The terms “comprising”, “including”, and “having” are synonymous, and variations thereof herein are meant to be inclusive or open-ended and do not exclude additional unrecited elements or method steps.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/US2016/015974 | 2/1/2016 | WO | 00 |