Users who interact with software programs sometimes provide feedback regarding their experience with (at least parts of) those programs. Program developers/authors use that feedback to simplify existing products as well as to create easier-to-use, more intuitive new software products.
Existing feedback collection mechanisms suffer from a number of problems, which makes collecting useful feedback difficult. One cumbersome and fairly manual way to collect feedback is to conduct usability tests in which a usability engineer steers, directs and watches a test subject perform scripted tasks. The engineer manually collects information from this session. As can be readily appreciated, this manual collection method is fairly limited and does not provide for real-world and/or unscripted usage scenarios.
In real-world scenarios, one problem is that many users do not know how to find the mechanism needed to return feedback, and thus do not provide any. Another problem results from users providing the feedback only after interacting with the program. Sometimes such subsequent feedback may be very general (“took too long” or “was too complicated”), which is of almost no help in fixing any specific problem. Other times users may have difficulty with a particular part of the program, but then not accurately recall the specific issue or issues that caused the difficulties with respect to that part. Still other times, a user may struggle through a difficult part of a program, but after further interaction, have a different understanding of the program, which biases the feedback returned, or influences the user to not bother sending any feedback.
The present invention is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:
Various aspects of the technology described herein are generally directed towards providing a feedback mechanism to users in a way that facilitates collection of feedback from users relevant to their current context (e.g., which part of the program corresponds to the feedback). The feedback mechanism is generally easy for users to find, and may appear at the appropriate places in the program.
In one aspect, an interactive feedback entry triggering mechanism (e.g., a feedback button or other interactive mechanism) may be made to appear at a relevant location so as to “prompt” the user as to the ability to enter any desired feedback. As determined by the program developer, for example, an interactive feedback entry triggering mechanism (for simplicity referred to as a “feedback mechanism” herein) may appear anywhere on a user interface, at any time, in any suitable presentation format (e.g., color) and/or according to any criteria; different feedback entry triggering mechanisms may appear on the same user interface at different locations.
It should be understood that any of the examples herein are non-limiting. For instance, examples used herein show wizards, pages, UI controls and the like; however, any programs, input mechanisms and feedback-related mechanisms may benefit from the technology described herein. As such, the present invention is not limited to any particular embodiments, aspects, concepts, structures, functionalities or examples described herein. Rather, any of the embodiments, aspects, concepts, structures, functionalities or examples described herein are non-limiting, and the present invention may be used in various ways that provide benefits and advantages in computing and in collecting contextual user feedback in general.
In general, the user interacts with the program 104 in a known manner via one or more input devices 108, which causes GUI output to be rendered to a device display or device-coupled monitor 110. As described herein, the output may include contextual (including user) feedback 112, as well as program data 114, such as entered via text, graphical input, controls of a page, a window, a menu and so forth. For purposes of brevity, wizard pages are generally exemplified herein, although it is understood that any visible content may be output, even if not technically a “page” of content. Similarly, any type of program, e.g., application, applet, middleware, operating system and the like that allows user interaction may benefit from the technology described herein.
As is typical, any rendered program data 114 that the user wants persisted is saved as user program data 116 in a suitable format. The data 116 may be persisted locally and/or remotely.
As described herein, any data that the user enters as feedback 112 may be persisted for further use, e.g., program development, modification (revising/debugging) and/or the like. As exemplified in
For example, each feedback mechanism may be a button that appears at a time and location determined by a developer of the wizard (or per the page if developers provide pages independently, for example). As represented in
As can be readily appreciated, an API or the like accessible via a development environment may assist developers in providing feedback mechanisms. The API or the like may take feedback mechanism-related parameters such as type, location, timing data, color data, other appearance data and so forth. Feedback collection for a program or subset of a program may be enabled or disabled according to a mode, e.g., turned on for Beta versions and off for released versions, turned on for certain customers while turned off for others, and so on.
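By way of a non-limiting sketch, a developer-facing registration call might resemble the following; the function, type and parameter names (e.g., registerFeedbackMechanism) are illustrative assumptions rather than any actual product API:

    // Hypothetical developer-facing API for declaring a feedback mechanism; all names
    // and parameters are illustrative assumptions, not an actual product API.
    type FeedbackMechanismKind = "button" | "link" | "icon" | "contextMenuItem" | "vote";

    interface FeedbackMechanismOptions {
      id: string;                            // unique identifier (e.g., a GUID) for this mechanism
      kind: FeedbackMechanismKind;
      anchor: string;                        // page or control the mechanism is attached to
      location?: { x: number; y: number };   // where it appears on the user interface
      color?: string;                        // presentation data such as color
      delayMs?: number;                      // timing data: appear immediately (0) or after a delay
      prompt?: string;                       // optional text that helps solicit feedback
    }

    interface FeedbackCollectionMode {
      enabled: boolean;                      // e.g., on for Beta builds, off for released versions
      customerAllowList?: string[];          // e.g., on for certain customers, off for others
    }

    // Registers a mechanism; a runtime would decide when and where to render it
    // based on the supplied parameters and the current collection mode.
    function registerFeedbackMechanism(
      opts: FeedbackMechanismOptions,
      mode: FeedbackCollectionMode
    ): void {
      if (!mode.enabled) return;             // feedback collection turned off for this build/customer
      console.log(`registered ${opts.kind} '${opts.id}' on '${opts.anchor}'`);
    }

    // Example: a feedback button attached to the server input field of a provisioning wizard page.
    registerFeedbackMechanism(
      { id: "fb-server-field", kind: "button", anchor: "server input field",
        color: "#0078d4", delayMs: 0, prompt: "Tell us about this step" },
      { enabled: true }
    );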
The feedback that is collected may be uploaded along with context metadata that helps identify the point of collection and related state information. For example, a simple format may be (program ID, program context ID, feedback text), where program context ID may identify what step in the wizard was being presented at the time feedback was entered, and/or what feedback mechanism was triggered. A scheme as simple as (program, input field) such as (“server space provisioning wizard”, “server input field”) may suffice, although alphanumerical identifiers or the like may provide for more flexibility and extensibility. As another example scheme, each interactive feedback mechanism may have its own GUID or the like. Additional metadata regarding the feedback may be collected as well, such as current interactive control identifier, software revision/version number, user role, customer ID, license information, timestamp, system state, device type, display type, duration interacting with current screen and/or control, and so on; (note that a long duration time may be indicative of user difficulty with a screen or UI component on a screen). Virtually any information that may be collected from a device (or possibly along with other manual input) may be included as contextual metadata for the feedback data.
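By way of example only, such an uploaded record might be shaped as follows; the field names are illustrative assumptions chosen to mirror the (program ID, program context ID, feedback text) scheme and the metadata noted above:

    // Illustrative shape of an uploaded feedback record; field names are assumptions
    // chosen to mirror the (program ID, program context ID, feedback text) scheme above.
    interface FeedbackRecord {
      programId: string;            // e.g., "server space provisioning wizard"
      programContextId: string;     // e.g., wizard step and/or feedback mechanism GUID
      feedbackText: string;
      controlId?: string;           // current interactive control identifier
      softwareVersion?: string;
      userRole?: string;
      customerId?: string;
      timestamp?: string;           // ISO 8601
      deviceType?: string;
      displayType?: string;
      durationOnScreenMs?: number;  // long durations may indicate user difficulty
    }

    const exampleRecord: FeedbackRecord = {
      programId: "server space provisioning wizard",
      programContextId: "step-3/server-input-field",
      feedbackText: "Not clear whether the server name needs the domain suffix.",
      controlId: "server input field",
      softwareVersion: "2.1.0-beta",
      timestamp: new Date().toISOString(),
      durationOnScreenMs: 94000,
    };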
In such a scenario the metadata likely includes an identifier of which control was active at the time of comment entry. In this way, having icons or the like associated with a particular input field, for example, provides for more granular feedback than that of an entire page (as previously exemplified in
Note that in
When selected, a drop-down interactive text entry box 554 appears for entering typed feedback in the current context. Audio recording/streaming to a server is also an option in this example. As can be seen, in this example and other examples it is straightforward for a user to enter “just-in-time” feedback that is contextually relevant to the data that the user is entering or attempting to enter. Video and other data such as gesture-related input data, screenshots, images and so on may be provided as feedback by the user, and any type of feedback data may be combined.
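A minimal sketch of submitting such contextual feedback, optionally combining typed text with audio, screenshot or similar attachments, might resemble the following; the endpoint URL and names are hypothetical:

    // Minimal sketch of uploading contextual feedback that combines typed text with
    // optional attachments (audio, screenshot, video, gesture data). The endpoint URL
    // and names are hypothetical; any real implementation may differ.
    interface FeedbackAttachment {
      kind: "audio" | "screenshot" | "video" | "gesture";
      data: Blob;
    }

    async function submitContextualFeedback(
      programContextId: string,          // identifies the screen/control the user is on
      text: string,                      // the typed feedback from the drop-down entry box
      attachments: FeedbackAttachment[] = []
    ): Promise<void> {
      const form = new FormData();
      form.append("programContextId", programContextId);
      form.append("feedbackText", text);
      attachments.forEach((a, i) => form.append(`attachment-${i}-${a.kind}`, a.data));
      // Push to a data store/service for immediate or subsequent collection.
      await fetch("https://feedback.example.invalid/collect", { method: "POST", body: form });
    }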
As can be readily appreciated, feedback mechanisms other than buttons or links may be used. For example, if a program is appropriately configured, a knowledgeable user may “right click” at a data entry control or field to enter feedback, possibly choosing a selection or a sub-selection from a drop-down menu of other right-click options such as “help” or the like. A program without different data entry “screens,” such as a word processing program, photo editor program, spreadsheet and so on, may benefit from such a right-click mechanism, although any suitable mechanism(s) may be used, and any other program user interface screen, such as each drop-down menu, toolbar, ribbon or the like, may include such a feedback mechanism or mechanisms. Another feedback mechanism may incorporate a simple voting scheme; thumbs up or thumbs down, or thumbs up, thumbs down or confused, or green checkmark, red “X” or yellow “?” options are all possible examples of a voting-type scheme, and checkbox-related data, radio-button-related data and the like may be used for vote-related data, for example. Actual physical gestures may be prompted for in systems configured with gesture recognition.
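By way of example only, a vote-type mechanism might reduce to a small enumerated payload such as the following sketch; the enumeration and function names are assumptions:

    // Illustrative vote-type feedback (thumbs up / thumbs down / confused); the
    // enumeration and function names are assumptions.
    type Vote = "thumbsUp" | "thumbsDown" | "confused";

    interface VoteFeedback {
      programContextId: string;   // which screen or control the vote applies to
      vote: Vote;
      timestamp: string;
    }

    function recordVote(programContextId: string, vote: Vote): VoteFeedback {
      const record: VoteFeedback = {
        programContextId,
        vote,
        timestamp: new Date().toISOString(),
      };
      console.log("vote recorded", record);   // in practice, pushed to the feedback store
      return record;
    }

    // E.g., wired to a right-click menu item or a checkmark / "X" / "?" control:
    recordVote("step-3/server-input-field", "confused");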
Returning to
If it is time to deploy a feedback mechanism as evaluated at step 604, then step 606 is performed to do so; otherwise no feedback mechanism is presented at this time. Note that a feedback mechanism may be immediately rendered as part of the user interface at step 602, such as in the example of
Step 608 represents waiting for some user interaction. Note that if the feedback mechanism is delayed before deployment, steps 604 and 606 may result in the feedback mechanism appearing while waiting for user interaction. Although not explicitly shown, steps 604 and 606 also may vary the appearance, position, animation state and so on of a feedback mechanism over time.
When user interaction is received, step 608 branches to step 610, which in this example tests for whether the program is being ended; if ended, at step 612 the program may output/persist any program data that the user had entered, although the user may elect to exit without saving. Note that many of the steps of
If not a program end situation, step 614 evaluates whether the context changed. One way to change context is to move to a new (next or previous) screen (assuming the user is allowed to do so based upon the data entered), which branches back to render the new screen.
Another type of context change is to change context on the same screen, e.g., the user moves to a new control to enter more data. If so, the UI may be adjusted (step 616) for the new current context, e.g., to move a blinking text entry cursor. This allows a change in the feedback mechanism as well, e.g., step 616 returns to step 604.
Another type of interaction is to receive user feedback, as determined via step 618; (other actions represented by the “no” branch, such as requesting help, are not shown for purposes of brevity but as can be readily appreciated are appropriately handled). If feedback is received, once the user stops providing feedback and moves on (e.g., clicks elsewhere), step 620 represents outputting (e.g., pushing) the feedback to a data store for immediate or subsequent collection. Note that the user may be prompted before persisting the feedback.
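A compact sketch of this interaction loop is shown below, with comments keyed to the flowchart steps discussed above (602 through 620); all type names and helper functions are illustrative assumptions, with trivial stubs standing in for real UI plumbing:

    // Compact sketch of the interaction loop; step numbers in comments refer to the
    // flowchart steps discussed above (602-620). All type and function names are
    // illustrative assumptions, with trivial stubs standing in for real UI plumbing.
    type Interaction =
      | { kind: "end"; save: boolean }
      | { kind: "navigate"; toScreen: string }          // context change: new screen
      | { kind: "focusChange"; controlId: string }      // context change on the same screen
      | { kind: "feedback"; contextId: string; text: string }
      | { kind: "other" };

    const renderScreen = (screen: string) => console.log("render", screen);                    // step 602
    const shouldDeployFeedbackMechanism = (_screen: string) => true;                            // step 604
    const deployFeedbackMechanism = (screen: string) => console.log("feedback UI on", screen);  // step 606
    const persistProgramData = () => console.log("program data saved");                         // step 612
    const adjustUiForContext = (controlId: string) => console.log("focus moved to", controlId); // step 616
    const pushFeedbackToStore = (ctx: string, text: string) => console.log("feedback", ctx, text); // step 620

    // Stub: in a real program this resolves when the user actually does something.
    async function waitForInteraction(): Promise<Interaction> {                                 // step 608
      return { kind: "end", save: true };
    }

    async function runScreen(screen: string): Promise<void> {
      renderScreen(screen);                                   // step 602: render the UI
      for (;;) {
        if (shouldDeployFeedbackMechanism(screen)) {          // step 604: time to deploy?
          deployFeedbackMechanism(screen);                    // step 606: present the mechanism
        }
        const action = await waitForInteraction();            // step 608: wait for interaction
        switch (action.kind) {
          case "end":                                         // step 610: program being ended?
            if (action.save) persistProgramData();            // step 612: persist program data
            return;
          case "navigate":                                    // step 614: context changed to a new screen
            return runScreen(action.toScreen);
          case "focusChange":                                 // same-screen context change
            adjustUiForContext(action.controlId);             // step 616, then back to step 604
            break;
          case "feedback":                                    // step 618: feedback received
            pushFeedbackToStore(action.contextId, action.text); // step 620: output to the data store
            break;
          default:                                            // other actions (e.g., help requests)
            break;
        }
      }
    }

    runScreen("server space provisioning wizard / step 1");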
As can be seen, there is provided a service/feature within a graphical user interface (GUI) that allows the user to provide immediate feedback while still in the context of performing a specific task or operation within the GUI. This “Just-in-time interactive Feedback” service provides non-intrusive controls throughout the GUI that optionally allow the user to easily provide feedback within the GUI on virtually any task being performed. The feedback triggering mechanism, such as a button, may include text or the like that helps solicit the feedback.
As the user is performing any given task, this service/feature allows the user to easily make a comment or add some annotations on what he or she is trying to do, while at the same time allowing tasks to be completed uninterrupted. Instead of completing a task and thereafter going to another “user feedback” page to provide comments, a user can provide feedback directly inline as the task is being performed. The service/feature automatically associates the context (e.g., screen and task) with the feedback.
Example Computing Device
The techniques described herein can be applied to any device or set of devices capable of running programs and processes, such as the user device 102 of
Embodiments can partly be implemented via an operating system, for use by a developer of services for a device or object, and/or included within application software that operates to perform one or more functional aspects of the various embodiments described herein. Software may be described in the general context of computer executable instructions, such as program modules, being executed by one or more computers, such as client workstations, servers or other devices. Those skilled in the art will appreciate that computer systems have a variety of configurations and protocols that can be used to communicate data, and thus, no particular configuration or protocol is considered limiting.
With reference to
Computer 710 typically includes a variety of machine/computer-readable media and can be any available media that can be accessed by computer 710. The system memory 730 may include computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) and/or random access memory (RAM), and hard drive media, optical storage media, flash media, and so forth. By way of example, and not limitation, system memory 730 may also include an operating system, application programs, other program modules, and program data.
A user can enter commands and information into the computer 710 through input devices 740. A monitor or other type of display device is also connected to the system bus 722 via an interface, such as output interface 750. In addition to a monitor, computers can also include other peripheral output devices such as speakers and a printer, which may be connected through output interface 750.
The computer 710 may operate in a networked or distributed environment using logical connections to one or more other remote computers, such as remote computer 770. The remote computer 770 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, or any other remote media consumption or transmission device, and may include any or all of the elements described above relative to the computer 710. The logical connections depicted in
As mentioned above, while example embodiments have been described in connection with various computing devices and network architectures, the underlying concepts may be applied to any network system and any computing device or system in which it is desirable to improve efficiency of resource usage.
Also, there are multiple ways to implement the same or similar functionality, e.g., an appropriate API, tool kit, driver code, operating system, control, standalone or downloadable software object, etc. which enables applications and services to take advantage of the techniques provided herein. Thus, embodiments herein are contemplated from the standpoint of an API (or other software object), as well as from a software or hardware object that implements one or more embodiments as described herein. Thus, various embodiments described herein can have aspects that are wholly in hardware, partly in hardware and partly in software, as well as in software.
The word “example” is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “example” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent example structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used, for the avoidance of doubt, such terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements when employed in a claim.
As mentioned, the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. As used herein, the terms “component,” “module,” “system” and the like are likewise intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a computer and the computer can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
The aforementioned systems have been described with respect to interaction between several components. It can be appreciated that such systems and components can include those components or specified sub-components, some of the specified components or sub-components, and/or additional components, and according to various permutations and combinations of the foregoing. Sub-components can also be implemented as components communicatively coupled to other components rather than included within parent components (hierarchical). Additionally, it can be noted that one or more components may be combined into a single component providing aggregate functionality or divided into several separate sub-components, and that any one or more middle layers, such as a management layer, may be provided to communicatively couple to such sub-components in order to provide integrated functionality. Any components described herein may also interact with one or more other components not specifically described herein but generally known by those of skill in the art.
In view of the example systems described herein, methodologies that may be implemented in accordance with the described subject matter can also be appreciated with reference to the flowcharts of the various figures. While for purposes of simplicity of explanation, the methodologies are shown and described as a series of blocks, it is to be understood and appreciated that the various embodiments are not limited by the order of the blocks, as some blocks may occur in different orders and/or concurrently with other blocks from what is depicted and described herein. Where non-sequential, or branched, flow is illustrated via flowchart, it can be appreciated that various other branches, flow paths, and orders of the blocks, may be implemented which achieve the same or a similar result. Moreover, some illustrated blocks are optional in implementing the methodologies described hereinafter.
While the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention.
In addition to the various embodiments described herein, it is to be understood that other similar embodiments can be used or modifications and additions can be made to the described embodiment(s) for performing the same or equivalent function of the corresponding embodiment(s) without deviating therefrom. Still further, multiple processing chips or multiple devices can share the performance of one or more functions described herein, and similarly, storage can be effected across a plurality of devices. Accordingly, the invention is not to be limited to any single embodiment, but rather is to be construed in breadth, spirit and scope in accordance with the appended claims.