Aspects of the present invention are directed to simulation and automation of button activation and deactivation on button-aware computing platforms.
The use of hardware buttons has enhanced the usefulness and flexibility of computing devices such as handheld computers, tablet-style computers, and the like. Typically, such smaller computing devices have main user input resources that are not button-based. For example, for many such computing devices, the main way for a user to interact with the computing device is to use a stylus-based interface on a touch-sensitive display. Hardware buttons may also be provided as a secondary mode of input. The user may press and/or release buttons, and the underlying platform converts the button activity into actions.
There is a need for testing of the software applications and operating systems that are button-aware, that is, that respond to button activity on such computing devices. However, efficient testing of such a button-based platform is challenging because button pressing is inherently a manual process. For instance, to quickly perform such testing by actually pressing buttons, an army of robots would literally be required. This is expensive and unrealistic, and so in practice the physical pressing of buttons is performed by hand, which is time-consuming and error-prone. The difficulties with such testing are multiplied where there are various different hardware platforms to be tested with the same target software. Different hardware platforms may have different buttons that are provided in different locations and/or in different quantities. Thus, a customized testing scenario must be created for each different hardware platform. This, again, becomes unwieldy, inefficient, error-prone, and expensive.
There have been various efforts to improve upon the testing process, but such efforts have not resulted in a satisfactory solution. One way is to provide a stick-like instrument mounted on an electro-mechanical device and control it to mimic user actions. However, this is neither a cheap nor realistic methodology. Another approach is to simulate button events by providing automation hooks in various layers of the operating system stack where the button input is massaged into an expected format. However, this is difficult at best and is again dependent upon the implementation of the system being tested. Thus, a customized testing scenario would still need to be created for each new platform to be tested.
A better way to simulate hardware button events is therefore needed.
Aspects of the present invention are directed to simulating the actual hardware button signals at a low level. The data resulting from those signals then propagates naturally through the system, being processed and formatted in the layers of the system stack in a normal manner, eventually being directed to the target software application being tested as an action associated with the button activity. In this end-to-end approach, button events are simulated by injecting data into the system at the bottom-most layers, where the raw data may represent a basic property of the button, e.g., the state of the button (pressed or released). The simulation is thus independent of the particular implementation that converts button events to actions. Such simulation helps developers and test teams run real-life tests and scenarios in a reproducible and efficient manner, irrespective of the hardware platform.
Further aspects of the invention are directed to mapping actions to buttons, button events, and/or the physical orientation of the computing device being tested. Such mapping may be set or read by a testing software application using an application programming interface (API). Because the mapping may be dynamically set and changed during the testing process, this allows platforms having only a single hardware button (or a small number of hardware buttons) to be tested under a variety of configurations.
Still further aspects of the present invention are directed to providing a create-once-use-many-times methodology for testing. This is because a uniform platform for button testing is provided that may be used with a variety of hardware platforms and target software. The result is that testing time and expense are substantially reduced while repeatability is increased. The testing platform is uniformly applicable because an injected button event, just like the actual pressing of a hardware button, causes data to be injected at a low level. Because the injected data propagates through the normal system, such testing exercises the whole system, as opposed to providing automation hooks at custom-selected locations and artificially providing data to the system at a particular layer in the particular format required.
Using the described testing methodology, testing teams may rapidly automate and cover key scenarios in their tests. Time is therefore saved by automating the onerous task of repeating the same button actions on different hardware platforms and under different conditions.
These and other aspects of the invention will be apparent upon consideration of the following detailed description of illustrative embodiments.
The foregoing summary of the invention, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the accompanying drawings, which are included by way of example, and not by way of limitation with regard to the claimed invention.
The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers (PCs); server computers; hand-held and other portable devices such as personal digital assistants (PDAs), tablet PCs or laptop PCs; multiprocessor systems; microprocessor-based systems; set top boxes; programmable consumer electronics; network PCs; minicomputers; mainframe computers; distributed computing environments that include any of the above systems or devices; and the like.
Aspects of the invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be operational with distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
With reference to
Computer 100 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by computer 100, such as volatile, nonvolatile, removable, and non-removable media. By way of example, and not limitation, computer-readable media may include computer storage media and communication media. Computer storage media may include volatile, nonvolatile, removable, and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, random-access memory (RAM), read-only memory (ROM), electrically-erasable programmable ROM (EEPROM), flash memory or other memory technology, compact-disc ROM (CD-ROM), digital video disc (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and which can be accessed by computer 100. Communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF) (e.g., BLUETOOTH, WiFi, UWB), optical (e.g., infrared), and other wireless media. Any single computer-readable medium, as well as any combination of multiple computer-readable media, is intended to be included within the scope of the term “a computer-readable medium” as used in both this specification and the claims. For example, a computer-readable medium includes a single optical disk, or a collection of optical disks, or an optical disk and a memory.
System memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as ROM 131 and RAM 132. A basic input/output system (BIOS) 133, containing the basic routines that help to transfer information between elements within computer 100, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation,
Computer 100 may also include other computer storage media. By way of example only,
The drives and their associated computer storage media discussed above and illustrated in
Computer 100 may also include a touch-sensitive device 165, such as a digitizer, to allow a user to provide input using a stylus 166. Touch-sensitive device 165 may either be integrated into monitor 191 or another display device, or be part of a separate device, such as a digitizer pad. Computer 100 may also include other peripheral output devices such as speakers 197 and a printer 196, which may be connected through an output peripheral interface 195.
Computer 100 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. Remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer 100, although only a memory storage device 181 has been illustrated in
When used in a LAN networking environment, computer 100 is coupled to LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, computer 100 may include a modem 172 or another device for establishing communications over WAN 173, such as the Internet. Modem 172, which may be internal or external, may be connected to system bus 121 via user input interface 160 or another appropriate mechanism. In a networked environment, program modules depicted relative to computer 100, or portions thereof, may be stored remotely such as in remote storage device 181. By way of example, and not limitation,
As previously stated, computer 100 may take any of a variety of forms. For example, referring to
Typically, the pressing and/or releasing of one of buttons 202-207 causes a signal to be sent to a driver running on computer 200. The signal may be data that identifies the particular button pressed or released, as well as an event currently associated with the button (e.g., whether the button has been pressed or released). The driver receives the signal, interprets the signal, and forwards information about which button was pressed or released to the operating system and/or to a software application. The driver may operate at the Human Interface Device (HID) layer and operate in accordance with the HID specification. The HID specification is a well-known standard that may be used for a variety of input devices. The HID specification is mainly implemented in devices connected to a computer via USB but can support input devices that use other types of ports or buses. For example, input devices connected using the IEEE 1394 protocol may be used in accordance with the HID specification.
Referring to
In the embodiment shown, the following functions are provided. In the kernel mode, an HID layer 303, 310 is provided that implements the HID protocols and communicates with physical input devices such as buttons 202-207. A button driver 304 is provided in HID layer 303 that receives signals from buttons 202-207. Button driver 304 translates these signals into a format that is understandable by a button event processing unit 302, which is in the user mode. Button event processing unit 302 receives this information and passes it on (possibly re-translating the information in the process) to a software application 301 that is interested in knowing the states of one or more of the buttons 202-207. In addition, as will be discussed further below, button event processing unit 302 determines an appropriate action that should be taken depending upon the button, a button event associated with that button, and/or a current physical orientation of the computing device.
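By way of example, and not limitation, the following C sketch illustrates one way the raw button signal and its translation for the user mode might be represented. All identifiers, field names, and widths in this sketch are illustrative assumptions rather than part of any actual driver implementation.

    #include <stdint.h>

    /* One raw report from a hardware button, as might be delivered to a
     * driver such as button driver 304. */
    typedef enum {
        BUTTON_RELEASED = 0,
        BUTTON_PRESSED  = 1
    } button_state_t;

    typedef struct {
        uint8_t        button_id; /* identifies the particular button */
        button_state_t state;     /* the event currently associated with it */
    } raw_button_report_t;

    /* A user-mode-friendly form of the same information, as button driver
     * 304 might pass it to button event processing unit 302. */
    typedef struct {
        uint8_t button_id;
        int     pressed; /* nonzero if pressed, zero if released */
    } button_message_t;

    static button_message_t translate_report(const raw_button_report_t *r) {
        button_message_t m = { r->button_id, r->state == BUTTON_PRESSED };
        return m;
    }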
In addition, another driver, called herein a virtual button driver 311, is provided in the HID layer of the kernel mode. Virtual button driver 311 receives information from another software application, called herein button simulator software application 313, via a programming interface such as application programming interface (API) 306, called herein button injector API 306. Button injector API 306 provides various functionality to software application 313 including mapping of buttons to actions and communication with virtual button driver 311. Although software application 313 is shown in
Button injector API 306 may provide a variety of functionality that is available to software application 313. Mapping is one such functionality, in which software application 313 (or the operating system) may set the mapping of one or more buttons to one or more actions and/or read the currently set mapping. For example, using commands and/or queries defined by button injector API 306, software application 313 (or the operating system) may be able to define which out of a plurality of actions are to be mapped to each of buttons 202-207 (or a subset thereof). The mappings of buttons, button events, physical orientations, and actions may be stored in a data repository 312, such as an operating system registry. An action may be any action, such as opening a software application, issuing a command, shutting down computer 200, sending a page up or down request, pressing a function key, or performing any other function. In this way, each of buttons 202-207 may be mapped to, or associated with, different actions. Using button injector API 306, software application 313 (or the operating system) may map only a particular one of the buttons, a particular subset of the buttons, or all of the buttons, to one or more actions. This means that upon performing a particular event associated with a button (e.g., upon pressing the button), the action associated with the button would be performed.
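By way of example, and not limitation, the following C sketch models how such a mapping might be stored and manipulated. The flat in-memory table standing in for data repository 312, the event and orientation enumerations, and all function names are illustrative assumptions, not part of any actual implementation.

    #include <stdint.h>

    #define MAX_BUTTONS 32 /* the thirty-two possible buttons discussed below */

    typedef enum {
        EVT_PRESS, EVT_RELEASE, EVT_CLICK, EVT_DOUBLE_CLICK, EVT_HOLD,
        EVT_COUNT
    } button_event_t;

    typedef enum {
        ORIENT_VERTICAL, ORIENT_SIDEWAYS,
        ORIENT_COUNT
    } orientation_t;

    typedef uint32_t action_id_t; /* e.g., "open application", "page up" */
    #define ACTION_NONE ((action_id_t)0)

    /* In-memory stand-in for data repository 312 (e.g., a registry). */
    static action_id_t mapping[MAX_BUTTONS][EVT_COUNT][ORIENT_COUNT];

    void set_button_mapping(uint8_t button_id, button_event_t evt,
                            orientation_t orient, action_id_t action) {
        mapping[button_id][evt][orient] = action;
    }

    action_id_t get_button_mapping(uint8_t button_id, button_event_t evt,
                                   orientation_t orient) {
        return mapping[button_id][evt][orient];
    }

For instance, the same button event might be mapped to different actions for different orientations, as discussed further below; the action values here are arbitrary illustrations:

    set_button_mapping(1, EVT_HOLD, ORIENT_VERTICAL, 1 /* e.g., page up */);
    set_button_mapping(1, EVT_HOLD, ORIENT_SIDEWAYS, 2 /* e.g., rotate view */);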
The particular action to be performed may depend not only on which button is pressed, but also on which event is performed on the button. There are many possible events that may be performed on a button. Such events are also referred to herein as button events. For instance, a button press event is where a button is pressed. A button release event is where a button is released. A button hold event is where a button is pressed for at least a threshold period of time. A button click event is where a button is pressed for only a short period of time. The button hold and click events may be considered separate events in and of themselves, or they may be considered particular combinations of the basic button press and button release events. Table 1 below shows an example of how various actions may be associated with some of the buttons of

TABLE 1
  Button ID    Button Event      Action
  1            press             action 1
  1            double-click      action 2
  1            hold              action 1
  2            . . .             . . .
  3            . . .             . . .
Button 202, for example, may be assigned button ID 1; button 203 may be assigned button ID 2; and button 204 may be assigned button ID 3. According to Table 1, for instance, pressing button 202 (button ID 1) would result in action 1 occurring, double-clicking button 202 would result in action 2 occurring, and holding down button 202 would result in action 1 occurring (the same as where button 202 is merely pressed).
In addition, the mapping of actions with buttons and events may further take into account the physical orientation of computer 200. This may be especially useful where, as in the present example, computer 200 is a portable computing device that may operate in different modes depending upon its physical orientation. For instance, computer 200 may have an orientation sensor that detects the physical orientation of computer 200 (e.g., vertical, rotated sideways, or at some angle in between) and operate in a particular mode depending upon the orientation. Such sensors are well known. If computer 200 is oriented vertically (such as shown in
In addition to setting the button mapping, button injector API 306 may also allow software application 313 to read the button mapping. Such a query may include the button ID, the button event, and/or the orientation of the computing device, and the result of the query may be the action that is assigned to that particular combination of properties. Or, the entire mapping or an arbitrary subset of the mapping may be provided to software application 313 upon query via button injector API 306. Software application 313 may further query, via button injector API 306, how many buttons are known to exist on the computing device and/or how many of those buttons are mapped to an action.
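The read side of the interface might be sketched along the following lines, again with hypothetical names, and reusing the mapping table from the sketch above:

    int get_button_count(void) {
        return 6; /* e.g., buttons 202-207 on computer 200 */
    }

    int get_mapped_button_count(void) {
        int count = 0;
        for (int b = 0; b < MAX_BUTTONS; b++) {
            int mapped = 0;
            for (int e = 0; e < EVT_COUNT && !mapped; e++)
                for (int o = 0; o < ORIENT_COUNT && !mapped; o++)
                    if (mapping[b][e][o] != ACTION_NONE)
                        mapped = 1;
            count += mapped; /* count each mapped button once */
        }
        return count;
    }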
Button injector API 306 also allows software application 313 to inject a specified button event. To inject a button event is to cause virtual button driver 311 to simulate the button event without the need for the actual hardware button to physically experience the button event. Button injector API 306 provides for software application 313 to specify one or more of buttons 202-207 and a button event associated with that button(s). In response, button injector API 306 communicates with virtual button driver 311 such that virtual button driver 311 simulates the actual button event and sends information to button event processing unit 302 letting it know that the button event has occurred. Button event processing unit 302 may not know the difference between a simulated button event from virtual button driver 311 and an actual button event communicated from HID button driver 304. In other words, the data from virtual button driver 311 and HID button driver 304 may be in the same format and, other than the source of the data, be otherwise indistinguishable.
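A minimal C sketch of the injection path follows; the request structure and function names are assumptions, and the kernel-mode delivery performed by virtual button driver 311 is reduced to a stub:

    #include <stdint.h>

    /* Assumes the button_event_t enumeration from the mapping sketch above. */
    typedef struct {
        uint8_t        button_id;
        button_event_t event;
    } inject_request_t;

    /* Stand-in for virtual button driver 311. In a real system this would
     * emit data in the same format as HID button driver 304 produces, so
     * that button event processing unit 302 cannot tell the two sources
     * apart. */
    static void virtual_button_driver_emit(const inject_request_t *req) {
        (void)req; /* kernel-mode delivery omitted in this sketch */
    }

    /* The call a test application such as software application 313 might
     * make through button injector API 306. */
    void inject_button_event(uint8_t button_id, button_event_t event) {
        inject_request_t req = { button_id, event };
        virtual_button_driver_emit(&req);
    }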
Button injector API 306 has further sub-components: a data transformation sub-component 307, a mapping management sub-component 308, and a device management sub-component 309. Mapping management sub-component 308 is used for storing and retrieving associations between buttons 202-207 and the mapped actions in data repository 312. Data transformation sub-component 307 is used for receiving queries and commands from software application 313 and transforming the queries and commands into a different format that is usable by virtual button driver 311. Device management sub-component 309 handles any communication protocols necessary between button injector API 306 and virtual button driver 311.
The simulation of button events using the above-described architecture will now be discussed. Conventionally, when a button is pressed, held down, released, etc., a signal for that one of buttons 202-207 is sent to HID button driver 304, which in turn forwards data to button event processing unit 302. Button event processing unit 302 determines an appropriate action to take in response to the button event and forwards data about that action and/or the button event to software application 301 (or to the operating system, as desired).
However, when simulating a button event, a different data path is taken. In this case, software application 313 is used for sending commands and/or queries using button injector API 306 in order to simulate the activation and/or deactivation of one or more of buttons 202-207, without the need for buttons 202-207 to actually be physically activated and deactivated. For example, referring to
Next, in step 402, software application 313 uses button injector API 306 to inject a specified button event for a specified one of buttons 202-207. In this regard, software application 313 may be programmed to cycle through a set of button events in a particular order. The data sent from button injector API 306 and/or from software application 313 representing a button injection may be in the form shown in FIGS. 5 and 6. This illustrative format assigns one bit for each of hardware buttons 202-207. In this example, there are thirty-two possible hardware buttons that may be referenced, even though computer 200 in this case has only six hardware buttons 202-207. So, hardware buttons 202-207 are each assigned a different bit, in this example bits zero through five, respectively. In
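Although FIGS. 5 and 6 are not reproduced here, one plausible rendering of the bit-per-button format just described is a pair of 32-bit masks, one selecting the referenced buttons and one carrying their states. The two-mask layout is an assumption for illustration only:

    #include <stdint.h>

    typedef struct {
        uint32_t select; /* bit n set: the button with ID n is referenced */
        uint32_t state;  /* bit n set: pressed; bit n clear: released */
    } button_mask_report_t;

    /* Mark the button assigned to the given bit (bits zero through five
     * for buttons 202-207) as pressed. */
    button_mask_report_t make_press_report(unsigned bit) {
        button_mask_report_t r = { 0u, 0u };
        r.select = 1u << bit;
        r.state  = 1u << bit;
        return r;
    }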
In response to a button injection request from software application 313, button injector API 306 sends data to virtual button driver 311, indicating the button ID of the specified button and the specified button event. In response, virtual button driver 311 converts the received data to further data that represents the button event and the button ID and sends this data to button event processing unit 302. In response, button event processing unit 302 checks data repository 312 in step 403 to determine which action is mapped to the particular button and button event, as well as to the current orientation of computer 200 (if desired). If a mapping exists, then the associated action is found, and in step 404 button event processing unit 302 sends further data to software application 301, indicating the action. In response, software application 301 performs some function based on the action (e.g., paging up), and in step 405 software application 313 (or a separate monitoring software application) detects the response of software application 301 to determine whether the response is correct and expected. The process is continued for further buttons and/or button events on the same button, as desired. The flowchart of
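The cycle of steps 402 through 405 might be automated along the following lines. The test-case structure and the verification callback are hypothetical, and the sketch reuses inject_button_event() and the types introduced above:

    /* One test case: the event to inject and the action expected in
     * response. */
    typedef struct {
        uint8_t        button_id;
        button_event_t event;
        action_id_t    expected_action;
    } button_test_case_t;

    /* Runs each case: inject (step 402), allow the mapped action to
     * propagate to software application 301 (steps 403-404), then compare
     * the observed response against the expectation (step 405). Returns
     * the number of failures. */
    int run_button_tests(const button_test_case_t *cases, int n,
                         action_id_t (*observe_response)(void)) {
        int failures = 0;
        for (int i = 0; i < n; i++) {
            inject_button_event(cases[i].button_id, cases[i].event);
            if (observe_response() != cases[i].expected_action)
                failures++;
        }
        return failures;
    }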
In this way, the functionality of software application 301 may be determined relative to various button events associated with buttons 202-207. This may advantageously be accomplished without ever having to actually perform a button event on the actual hardware buttons 202-207, thus substantially speeding the testing process while also making it more reliable and flexible.