As shown, the in-line report generator 102 may be operated in conjunction with a graphical user interface (GUI) 104. The GUI 104 may include, for example, a browser or other software application that is configured to allow a user thereof to display and/or interact with various types of data. The GUI 104, e.g., browser, may be configured, for example, to obtain information from remote sources, e.g., server computers, using various protocols (e.g., hypertext transfer protocol (HTTP)) and associated techniques, examples of which are described herein.
As should be apparent, the GUI 104 may be implemented on a conventional display 106. The display 106 may typically be operated in conjunction with a computer 108, which may represent, for example, a desktop or laptop computer, a personal digital assistant (PDA), a networked computer (e.g., networked on a local area network (LAN)), or virtually any type of data processing apparatus. As should also be apparent, the computer 108 may be associated with various types of storage techniques and/or computer-readable media, as well as with one or more processors for executing instructions stored thereon.
In the following description, it is generally assumed for the sake of example that the in-line report generator 102 is configured to provide reporting regarding an event that has been, or may be, performed by one or more users. In various parts of the description, such an event is described, as an example, as including the inputting of a response to a query that is part of an electronic survey, where the electronic survey may be provided to a number of users. Of course, virtually any event performed by one or more users may be reported upon in the manner(s) described herein, including, for example, a purchase by the user(s) of an item at an on-line store, or a selection of a link by the user(s) on a web page.
Moreover, a user in the sense described herein may encompass, for example, a human user or a computer. As an example of the latter case, a human user may be filling out a troubleshooting form regarding a computer problem being experienced by the human user. In this case, the computer with which the human user is having a problem may itself provide data about its own current operation, perhaps in association with troubleshooting data provided by the human user. The in-line report generator 102 may then provide a reporting of the data provided by the problematic computer.
In
Meanwhile, reporting elements 112 generally may represent or include, for example, icons, content, or data-entry fields, and may be represented or constructed as similarly-expressed objects, components, or other software code that contain(s) information regarding the actual event performed by one or more particular users. For example, where the event elements 110 include electronic surveys distributed to the users, then the reporting elements 112 may represent or include information about the actual event(s) performed by individual users of selecting or providing particular responses to the survey questions (e.g., which user provided a particular answer, or how many users responded to a particular question).
Consequently, in order to provide a reporting of a given event, the in-line report generator 102 may provide an event element 110a within the graphical user interface 104. Then, perhaps in response to a selection of a report selector 114 (e.g., a button, link, or other element of the GUI 104), as described in more detail below, the in-line report generator 102 may provide a corresponding reporting element 112a, in-line with the event element 110a.
As a more specific example, and as shown in
As a yet-more specific example, the query elements 110b may include a query element 110f, associated with the query “did you like this design?” In this case, the response elements 110c may include a response element 110g, which provides response options “yes” or “no.” As should be apparent, this query/response pair may be exactly the query/response pair presented to the various users during distribution of the relevant survey (e.g., with the same or similar properties, content, format, and/or context), so that the operator of the in-line report generator 102 (e.g., a creator or manager of the survey) may see the same or similar context and format that was seen by the user(s) when providing responses to the query. That is, for example, the response element 110g may include active components that a responding user may “click on” or otherwise select when providing his or her “yes/no” answer.
Then, the in-line report generator 102 may provide reporting elements 112c, obtained from the reporting elements 112, in order to provide specific reporting about different yes/no responses provided by the various users. In the example of
Moreover, additional reporting information may be provided to the manager of the survey, in conjunction with the above-described techniques. For example, a supplemental reporting element 116 may be provided by the in-line report generator 102 that provides additional information regarding the reporting element 112b. For example, if the reporting element 112b is associated with a response of a particular user, then, the supplemental reporting element 116 may provide additional information about that user. As a specific example, the reporting element 112c associated with the “no” response to the response element 110g may be selected (e.g., clicked on or hovered over) by a manager of the survey who is viewing the results, and an email address of the user who provided the “no” answer may be provided in the supplemental reporting element 116a (e.g., “chuck.marron@demo.com,” as shown). In this way, the manager of the survey may associate feedback with responding users in a fast and convenient manner, and may contact the user for further feedback, if desired or allowed.
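By way of a purely hypothetical, browser-side sketch of how such a supplemental reporting element might be revealed on click or hover, the following TypeScript is illustrative only; the class names, the data-user-email attribute, and the function name are assumptions made for the example and are not part of the described implementation.

```typescript
// Hypothetical sketch: reveal a supplemental reporting element (e.g., the
// responding user's e-mail address) when a reporting element is clicked or
// hovered over. Class names and data attributes are illustrative assumptions.

function attachSupplementalReporting(reportingEl: HTMLElement): void {
  const tooltip = document.createElement("div");
  tooltip.className = "supplemental-reporting";
  tooltip.hidden = true;
  // Contact information is assumed to have been stored on the element when
  // the reporting elements were loaded, e.g. data-user-email="chuck.marron@demo.com".
  tooltip.textContent = reportingEl.dataset.userEmail ?? "(no contact information)";
  reportingEl.appendChild(tooltip);

  reportingEl.addEventListener("mouseenter", () => { tooltip.hidden = false; });
  reportingEl.addEventListener("mouseleave", () => { tooltip.hidden = true; });
  reportingEl.addEventListener("click", () => { tooltip.hidden = false; });
}

// Wire up every reporting element currently on the page.
document.querySelectorAll<HTMLElement>(".reporting-element")
  .forEach(attachSupplementalReporting);
```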
During example operations of the in-line report generator 102, a request handler 118 may be configured to receive a request for one or more of the event elements 110 and/or the reporting elements 112. For example, as referenced above, the event elements 110 may include the query elements 110b and the response elements 110c that each may be associated with one or more different surveys. A manager of a particular survey may wish first to view the particular survey, and the request handler 118 may thus receive a request for the particular survey and obtain the relevant query elements and response elements (e.g., the query elements 110d, 110f, and the response elements 110e, 110g). Then, synchronously or asynchronously, the request handler 118 also may obtain corresponding ones of the reporting elements 112 that provide information about the relevant events (e.g., provision of answer choices) by the associated users, and may store the thus-obtained reporting elements 112 using a local memory 120 (where the event elements 110 also may be stored).
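As a non-authoritative sketch of the behavior just described, the following TypeScript shows one way a request handler could load a survey's event elements and then asynchronously prefetch the corresponding reporting elements into a local, in-memory store. The endpoint paths, type shapes, and class name are assumptions made for the example, not the actual implementation of the request handler 118 or local memory 120.

```typescript
// Hypothetical sketch of a request handler that loads a survey's event
// elements and then fetches the corresponding reporting elements into a
// local, in-memory cache. Endpoint URLs and type names are assumptions.

interface EventElement { id: string; query: string; responses: string[]; }
interface ReportingElement { eventElementId: string; userId: string; answer: string; }

class RequestHandler {
  private localMemory = new Map<string, ReportingElement[]>();

  async loadSurvey(surveyId: string): Promise<EventElement[]> {
    const res = await fetch(`/surveys/${surveyId}/elements`);
    const eventElements: EventElement[] = await res.json();

    // Asynchronously (fire-and-forget) fetch the reporting elements so that
    // they are already cached when the report selector is later activated.
    void this.prefetchReporting(surveyId);
    return eventElements;
  }

  private async prefetchReporting(surveyId: string): Promise<void> {
    const res = await fetch(`/surveys/${surveyId}/reporting`);
    const reporting: ReportingElement[] = await res.json();
    for (const r of reporting) {
      const existing = this.localMemory.get(r.eventElementId) ?? [];
      existing.push(r);
      this.localMemory.set(r.eventElementId, existing);
    }
  }

  reportingFor(eventElementId: string): ReportingElement[] {
    return this.localMemory.get(eventElementId) ?? [];
  }
}
```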
In one implementation, then, the query elements and response elements may be presented on the GUI 104 to the manager, e.g., including the query element 110f and the response element 110g, and may initially be presented without the corresponding reporting element 112c. In this case, it should be understood that the manager of the survey, at this point, may have essentially the same or similar view as was experienced by the user(s) when responding to the survey.
In a case where the reporting element(s) 112b, 112c include(s) response information from a plurality of users, an aggregator 122 may be used to aggregate the various responses. For example, in
Then, e.g., once the desired, relevant subset of the reporting elements 112 is stored within the local memory 120, the manager may select the report selector 114. Such a selection may be received and interpreted by the request handler 118 as a request for reporting elements corresponding to the also-stored query elements 110b and response elements 110c, whereupon presentation logic 124 of the GUI 104 may be configured to provide the previously-obtained reporting elements (e.g., the reporting elements 112b and 112c) in alignment with their respective response elements 110e and 110g.
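The following is a minimal, hypothetical sketch of such presentation logic: upon selection of a report selector, each previously cached reporting element is placed in-line with its response element. The DOM ids, class names, and cache shape are assumptions made for illustration.

```typescript
// Hypothetical sketch of presentation logic that, when the report selector is
// selected, aligns each cached reporting element with its response element.

interface CachedReport {
  responseElementId: string; // id of the response element to align with
  summary: string;           // e.g. "3 of 4 (75%)"
}

const loadedReports: CachedReport[] = []; // assumed to be filled by a request handler

function showInlineReports(reports: CachedReport[]): void {
  for (const report of reports) {
    const responseEl = document.getElementById(report.responseElementId);
    if (!responseEl) continue;
    let reportEl = responseEl.querySelector<HTMLElement>(".inline-report");
    if (!reportEl) {
      reportEl = document.createElement("span");
      reportEl.className = "inline-report";
      responseEl.appendChild(reportEl); // rendered in-line with the response element
    }
    reportEl.textContent = report.summary;
    reportEl.hidden = false;
  }
}

// Entering reporting mode when the report selector is selected.
document.getElementById("report-selector")
  ?.addEventListener("click", () => showInlineReports(loadedReports));
```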
Thus, for example, the manager of a survey in question may obtain reporting information about results of a survey by first simply pulling up the survey itself (e.g., including the various queries and responses thereof), just as the survey was presented to the various users. Then, simply by selecting the report selector 114, the manager may obtain a reporting of the results of the survey, provided in alignment with the queries and/or responses. In this way, the manager may obtain the survey results in a fast and intuitive manner, and may view the survey results in a same or similar manner, context, and format as was experienced by the users.
In the example of
Meanwhile, a query element 210e includes a third question, “What is your role in your organization?”, along with a response element 210f that includes a multiple-choice format of various roles from which the user may select. In this case, the user may potentially select more than one answer within the response element 210f, although, in the example of
A query element 210g includes a fourth question, “Do you have additional comments?”, along with a response element 210h that includes a free text entry field. In
A query element 210i includes a fifth question, “May we contact you for feedback again?”, along with a response element 210j that includes a single-select “yes or no” answer choice. In
Also in
In the example of
Specifically, the reporting element(s) 312a is provided in conjunction with the first question, or, more specifically, in alignment with the response element 210b. Even more specifically, and as shown, the reporting element(s) 312a includes bar graphs placed over the answer choices “1” and “4,” as well as corresponding absolute and percentage numbers describing how many of the four users voted for each option. Other information may be included in association with the reporting element 312a, such as, for example, a display of an average rating (e.g., 3.25) provided by the four users (where, e.g., the average value may be determined by the aggregator 122 of the in-line report generator 102 of
The reporting element(s) 312c provides information about which answer choices of the response element 210f were selected by users. The response element 210f is a multi-select response element, i.e., each user may make more than one selection (e.g., a user may be both a senior executive and a sales expert). Consequently, the total percentages of responses may add up to more than 100%, as shown.
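As a hedged, worked illustration of why the percentages for a multi-select question can exceed 100%, the following sketch computes per-option counts against the number of respondents rather than the number of selections; the type and function names are assumptions and do not represent the actual aggregator 122.

```typescript
// Hypothetical sketch: aggregating a multi-select question. Because each
// respondent may pick several options, per-option percentages are computed
// against the number of respondents, so their sum can exceed 100%.

type MultiSelectResponse = { userId: string; selections: string[] };

function aggregateMultiSelect(
  responses: MultiSelectResponse[]
): Map<string, { count: number; pct: number }> {
  const totals = new Map<string, number>();
  for (const r of responses) {
    for (const option of new Set(r.selections)) {
      totals.set(option, (totals.get(option) ?? 0) + 1);
    }
  }
  const result = new Map<string, { count: number; pct: number }>();
  for (const [option, count] of totals) {
    result.set(option, { count, pct: (100 * count) / responses.length });
  }
  return result;
}

// Example: of four users, two pick "Senior executive" (50%) and three pick
// "Sales expert" (75%), because one user picks both; 50% + 75% > 100%.
const example = aggregateMultiSelect([
  { userId: "u1", selections: ["Senior executive", "Sales expert"] },
  { userId: "u2", selections: ["Sales expert"] },
  { userId: "u3", selections: ["Sales expert"] },
  { userId: "u4", selections: ["Senior executive"] },
]);
console.log(example);
```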
As referenced above, the reporting element(s) 312c, and/or other reporting elements, may be used to provide related, supplemental reporting elements. For example, selection of one of the bar graphs of the reporting element(s) 312c may provide the manager or other reviewer with an email address of the user(s) who provided answer(s) associated with the selected reporting element. In
Other types of supplemental reporting elements may be provided, as well. For example, responding users may be provided with an ability to include ad-hoc comments, e.g., by using electronic notes that may be attached to a desired location of the screenshot 200. For example, a responding user may add such a note in the vicinity of the second question (of the query element 210c), with a comment that “I actually don't prefer red or blue.” When the user selects “submit,” such a note may be saved with the reporting elements 112, so that when a manager or other reviewer later reviews the screen of the user for reporting purposes, the in-line report generator 102 may include the note within the screenshot 300. Accordingly, for example, the manager or other reviewer of the survey may obtain information that was not requested, but that may be very valuable, in a convenient and understandable manner.
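Purely as an illustrative sketch of how such an ad-hoc note could be captured with a response and later redrawn for the reviewer, the following assumes a note record anchored to a question and a screen position; the field names and the save endpoint are hypothetical and not part of the described system.

```typescript
// Hypothetical sketch: an ad-hoc note attached by a responding user near a
// question, saved with the reporting data so a reviewer later sees it in place.

interface AdHocNote {
  surveyId: string;
  queryComponentId: string;            // the question the note is anchored near
  userId: string;
  text: string;                        // e.g. "I actually don't prefer red or blue."
  position: { x: number; y: number };  // where the note was placed on the screen
}

async function saveNote(note: AdHocNote): Promise<void> {
  await fetch("/reporting/notes", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(note),
  });
}

// When the reviewer opens the reporting view, saved notes can be re-drawn at
// their recorded positions within the displayed screen.
function renderNote(note: AdHocNote, container: HTMLElement): void {
  const el = document.createElement("div");
  el.className = "ad-hoc-note";
  el.style.position = "absolute";
  el.style.left = `${note.position.x}px`;
  el.style.top = `${note.position.y}px`;
  el.textContent = note.text;
  container.appendChild(el);
}
```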
Further in
Although
In such a case where in-line reporting is desired to be implemented on a user-by-user basis, the manager of the survey may request corresponding (single-user) reporting elements by way of selection of an additional or alternative report selector 114. In this case, for example, the manager may scroll through responses of each user individually, e.g., using arrows 302 or other appropriate indicators associated with the report selector 114.
For example, to initially specify single-user reporting, the manager may select a button, drop-down menu, or other selection technique associated with the report selector 114. The request handler 118 may parse this request and provide the request to the local memory 120 and/or the presentation logic 124. The presentation logic 124 may thus present the desired single-user reporting elements, as described, and may provide subsequent single-user reporting in response to selection of the forward or back arrows 302.
In still other examples, the reporting elements 312a-312e may be used to filter or refine the reporting process. For example, if reporting of the four users of the survey of
It will be appreciated that the above description and examples may be considered to provide at least two modes of operation of the in-line report generator 102. For example, the screenshot 200 may be considered to represent a first mode, or “edit mode,” in which an original survey or survey components are illustrated, perhaps with active controls for the various response elements 210b, 210d, 210f, 210h, and 210j, so that additional responses may be entered. Then, e.g., upon selection or operation of the report selector 114, a second mode, e.g., “replay mode” or “reporting mode,” may be entered, in which the in-line report generator 102 provides the various reporting elements 312a-312e, or other reporting elements. Thus, a manager or other reviewer of the survey may easily switch or toggle back-and-forth between the two modes, and other modes, essentially instantaneously, for fast and convenient review of survey results. Such responsiveness and interactivity may be provided even though the event elements 110 and reporting elements 112 may be at a remote location from the computer 108 of
In the screenshot 400, a plurality of event elements 410a-410e are illustrated. Specifically, each event element provides a possible purchase that may be made by a reviewer of the screenshot 400, where each purchase is defined by a product number, a product description, and a product price, as shown. For example, the event element 410a is associated with the product number “49005547,” “Misc. Building Supplies,” and a price of “400 USD.” The event element 410b is associated with the product number “49005573,” “Furniture,” and a price of “900 USD.” The event element 410c is associated with the product number “49005743,” “Eqpt Rentals (A/V, Tables, Radios),” and a price of “250 USD.” The event element 410d is associated with the product number “49007543,” “Signage (Asset),” and a price of “300 USD.” Finally, the event element 410e is associated with the product number “49075543,” “Signage (non-asset),” and a price of “100 USD.”
Thus, each event element 410a-410e represents a link or opportunity for a reviewer of the screenshot 400 to purchase an associated item, but is referred to here as an example of the event elements 110 because each is associated (e.g., by way of the in-line report generator 102) with a previous event in which previous users purchased one or more of the items that are listed. For example, a user may previously have visited the on-line store and purchased one or more products listed or referenced in the screenshot 400.
Thus, in operation, a reviewer of the screenshot 400 may be visiting the on-line store and may be considering purchasing one or more of the listed or referenced items. The reviewer may wish to know, however, how many other users have purchased the item(s) being considered. Accordingly, the reviewer may select the report selector 114, shown in
As already described, the various reporting elements 512a-512e also provide opportunities for supplemental reporting elements. For example, a supplemental reporting element 516 illustrates a box in which the 5 users associated with the reporting element 512e are identified by e-mail address, as shown. The reviewer of the screenshot 500 may obtain such supplemental reporting element(s) by, for example, clicking on the bar graph, or hovering over the bar graph using a mouse and cursor movement. Of course, these are just examples, and other variations may be used. For example, instead of e-mail addresses, the supplemental reporting element 516 may provide contact with the various users by way of chat, instant messaging, voice-over-IP, or virtually any other technique for contacting the users. Moreover, other types of supplemental reporting information may be provided, such as, for example, more specific information about each user, such as when the user made a particular purchase, or whether the user made such a purchase in conjunction with other purchases.
In
The event in question may then be initiated by providing the event element, at least in part, to the user who is to perform the event (604). For example, a manager of an electronic survey may provide query elements/response elements to the user(s) as part of the electronic survey, for use in responding to the survey. In other examples, as in
Once the event has been performed by at least one user, a reporting element associated with the event and the user may be determined and stored (606). For example, the reporting element may identify the user and/or include contact information for the user, and also may include a description of the response provided by the user as part of the event (e.g., answer selection). In other examples, the reporting element, e.g., the reporting element 112a, may include a quantity or description of a purchased item(s), or may include a number of times that the user selected a provided Internet link.
The event element may then be provided within a graphical user interface (608), such as, for example the GUI 104 and/or a web browser. For example, a manager of a survey may open, access, or otherwise view the survey and associated questions/answer choices thereof, in the same or similar manner in which the survey was previously presented to the user(s) (604). In other examples, the in-line report generator 102 may provide a number or description of purchased items, as in
Before, during, and/or after the providing of the GUI with the event element, the various associated reporting elements may be obtained (and possibly aggregated) (610). For example, the in-line report generator 102 may asynchronously load the reporting elements 112 (or a subset thereof) into the local memory 120, while the query elements 110b and response elements 110c of an associated survey are being provided on the GUI 104. In other examples, the reporting elements 512a-512e associated with the on-line purchases of
A request for the reporting elements may be received (612). For example, the report selector 114 may be activated or selected by the manager of a survey, or by someone reviewing on-line purchases by users, or by someone reviewing a history of visits to a web site.
The reporting element may then be provided within the GUI and aligned with the event element (614). For example, the in-line report generator 102 may provide the reporting element 112a in alignment with the event element 110a, or, more specifically, may provide the reporting element 112c in alignment with the response element 110g, as shown in
It should be understood that as the reporting element(s) is being provided (614), new or additional reporting elements may continually be obtained and/or aggregated in the background (610). For example, a survey may not be associated with a defined start or end time, so that it may be possible that such an on-going survey may receive user responses in an on-going manner. In this case, for example, as the manager of the survey views the reporting elements, additional reporting elements may be obtained at the same time. As a result, the reporting elements may be incremented or otherwise updated, or the manager may switch back-and-forth between edit/view mode and reporting mode, e.g., by repeatedly selecting the report selector 114. In the latter case, each entry into the reporting mode may cause a display of updated, newly-obtained reporting elements.
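As a hypothetical sketch of how such on-going updates could be folded into the displayed reporting elements in the background, the following assumes a polling approach; the endpoint, polling interval, and element ids are illustrative assumptions rather than the described mechanism.

```typescript
// Hypothetical sketch: while the reviewer is viewing results, newly arrived
// responses continue to be fetched in the background and folded into the
// displayed counts.

interface ReportingUpdate {
  responseElementId: string;
  count: number;  // users who chose this answer so far
  total: number;  // users who have responded so far
}

function startBackgroundUpdates(surveyId: string): number {
  return window.setInterval(async () => {
    const res = await fetch(`/surveys/${surveyId}/reporting/updates`);
    if (!res.ok) return;
    const updates: ReportingUpdate[] = await res.json();
    for (const u of updates) {
      const holder = document.getElementById(u.responseElementId);
      const reportEl = holder?.querySelector<HTMLElement>(".inline-report");
      if (reportEl) {
        const pct = Math.round((100 * u.count) / u.total);
        reportEl.textContent = `${u.count} of ${u.total} (${pct}%)`;
      }
    }
  }, 10_000); // e.g. poll every 10 seconds while reporting mode is active
}
```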
In the example of
In the example of
Specific examples of the survey components 710 are provided below, with respect to
It should be understood that the campaign manager 704 may generate and conduct a plurality of surveys, having the same, different, or overlapping questions, and/or having the same, different, or overlapping users (e.g., participants/respondents). Also, more than one survey may be associated with a single campaign conducted by the campaign manager 704 (as, for example, when the campaign manager 704 sends a follow-up survey to a same set of users, in order to gauge the users' responses to product changes that have been made in the interim, perhaps based on the users' previous responses). Moreover, although only a single campaign manager 704 is illustrated in
Using the feedback system 702, then, the campaign manager 704 may generate and distribute a survey 716 to a user 718, for viewing within a browser 720 or other GUI. The survey 716 thus includes at least one survey component 710a, which the user 718 may use to enter feedback into the survey 716. As referenced above, e.g., once the user 718 has completed the survey 716, the user 718 may be provided with an option to view a reporting of selections made by other users (not shown in
Once the user 718 has performed the event of filling out the survey 716, the feedback system 702 (e.g., the campaign tracking system 714) may receive the corresponding responses for storage within the user response components 712. For example, the user response components may include XML components that include the response information from the user 718. Although such response information may be included within the user response component(s) 712 in conjunction with the associated queries/responses of the relevant survey, it may be more efficient to store the response information by itself within the user response component(s) 712, but with a reference or link to the corresponding survey and/or campaign (e.g., with a reference or link to the corresponding survey component 710a). Examples of how the survey components 710 and user response components 712 may be constructed, linked, and used, are provided below with reference to
Thus, as users, such as the user 718, respond to the survey 716, the user response components 712 may be correspondingly populated. When the campaign manager 704 wishes to review results of the survey 716, the campaign manager 704 may open a browser 722 or other GUI, and may access the feedback system 702 therethrough to obtain and view the survey 716.
As shown in
As shown, the survey component 710a may include a plurality of query components, since the survey 716 may include a plurality of questions. A query component 810a is shown generically as including the query element 110d and the response element 110e of
The survey component 710a also illustrates a second query component 810b, which may be associated with a second question/answer pair of the survey 716. Specifically, the query component 810b includes the query element 110f of
As shown and described, the user response component 712a may include a user ID 810 that identifies an associated user, e.g., a recipient/respondent of the survey 716. The identification may be at a high level (e.g., identifying the user as a member of a given group or organization) or may include an actual identification of the individual in question (including a current e-mail address, as described above). The user response component 712a may include the reporting element 112b that includes information about how the user (associated with the user ID 810) performed the event of selecting or providing an answer choice to the question of the query element 110d.
The user response component 712a also includes a survey ID 812 to associate the user response component 712a with the appropriate survey, as well as a query component ID 824 that, similarly, associates the user response component 712a with the appropriate query component of the related survey (e.g., the query component 810a).
Finally in the user response component 712a, a visibility indicator 816 is included that indicates whether the reporting element 112b should be hidden or displayed within the relevant GUI (e.g., the browser 722). For example, in some implementations, the in-line report generator 102 may provide the query element 110d, response element 110e, and the reporting element 112b to the appropriate GUI (e.g., the browser 722), e.g., for storage within the local memory 120 of the in-line report generator 102. Then, for example, in response to selection or de-selection of the report selector 114, the request handler 118 and the presentation logic 124 may determine that the reporting element 112b should be visible or invisible to the reviewing user (e.g., the campaign manager 704). In this way, the campaign manager 704 may essentially instantaneously be provided with reporting information, including the reporting element 112b, aligned with the associated response element 110e and/or the associated query element 110d. Further details associated with these and related techniques are provided below with respect to
Also in
Further, a user ID 828 identifies the user providing the response information as "Chuck Marron." A survey ID 830 associates the user response component 712b with the survey 716, and a query component ID 832 associates the user response component 712b with question 2 of the survey 716. Finally, a visibility indicator 834 indicates that the reporting element 826 should be made visible within the relevant GUI and aligned with the query element 110f and/or response element 110g of the query component 810b.
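To summarize the fields just enumerated, the following is a hypothetical, illustrative type for a user response component; the type and field names, and the example values, are assumptions made for clarity and do not represent the actual component schema described above.

```typescript
// Hypothetical sketch of the fields described for a user response component
// (cf. the user response components 712a/712b): an identification of the
// responding user, the reporting element itself, references back to the
// survey and query component, and a visibility indicator.

interface UserResponseComponent {
  userId: string;            // e.g. "chuck.marron@demo.com"
  surveyId: string;          // links the response to the appropriate survey
  queryComponentId: string;  // links the response to the appropriate question
  reportingElement: {
    answer: string;          // the event performed, e.g. the selected choice
    submittedAt?: string;    // optional reporting metadata
  };
  visible: boolean;          // whether the reporting element should be shown
}

// Illustrative instance corresponding to a "no" answer to a survey question
// (the identifier values are placeholders, not the actual component IDs).
const exampleComponent: UserResponseComponent = {
  userId: "chuck.marron@demo.com",
  surveyId: "5414",
  queryComponentId: "37916",
  reportingElement: { answer: "no" },
  visible: true,
};
```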
A code section 904 represents an example of screen-level information, i.e., a screen of questions associated with a particular survey, where the survey may be identified by survey ID 802 (e.g., the numeric identifier "5414"). A code section 906 indicates a location (e.g., Uniform Resource Locator (URL)) from which the survey may be rendered. Then, code sections 908, 910, 912, 914, and 916 all represent different query element(s) 110d and response elements 110e, each associated with a corresponding query component ID, such as the query component ID 804.
For example, the code section 908 includes a query component ID of compId="37916", and specifies the question "how much do you like the presented content" as a query to be answered using a rating scale ranging from 1-5, with corresponding captions at each end (e.g., question 1 of
The code section 912 includes the question, “what is your role in your organization?” and a corresponding response element that specifies the various roles (as in question 3 of
Then, a code section 1006 represents a reporting element, such as the reporting element 112b or 826, which indicates that the user in question (e.g., “Chuck Marron”) responded to component id=7 (i.e., the multiple choice query from the code section 912 of
A code section 1012 similarly provides a second example of a user response element, which includes various identifiers in a code section 1014 (e.g., campaignId, screenId, and client/user identification), as well as other reporting information (e.g., the time of submission of the choices by the relevant user, "Ted Norris"). The code sections 1016, 1018, and 1020 provide information corresponding to that just described for the code sections 1006-1010, but for the second user, Ted Norris.
More specifically,
In
At some point, the campaign manager 704 or other reviewer may request results of a campaign (e.g., using the request handler 118 of the in-line report generator), so that a GUI, e.g., the browser 722, may be provided with the associated survey components (1108). Before, during, and/or after the loading of the survey components, the browser 722 also may load and/or aggregate associated user response components 712 (1110).
At this point, the associated reporting elements 112 of the user response components may be included in the transmission(s) from the feedback system 702 to the in-line report generator 102 and the browser 722, but may be marked as hidden, and so not displayed within the browser 722. Rather, the survey components 710 and user response components 712 may be stored within the local memory 120 associated with the browser 722.
For example, the survey components 710 and/or the user response components 712 may be implemented in conjunction with Macromedia Flash™, which provides an integrated development environment (IDE) for authoring content in a proprietary scripting language known as ActionScript. The content may then be provided using, for example, the associated Macromedia Flash Player within the browser 722. In this and similar environments, the reporting element(s) 112b or 826 may be asynchronously loaded to the browser 722 and hidden from view while the associated query and response elements 110d-110g are displayed. In this way, the reporting elements are ready and available for when the campaign manager 704 wishes to view them.
Of course, other techniques may be used to asynchronously load the user response components 712 to the local memory 120 of the browser 722. For example, client-side scripting languages, such as, for example, JavaScript, may be used to load the user response components 712, and to merge the user response components 712 with a document object model (DOM) of the already-loaded page of the survey components 710. These and similar techniques may be used in conjunction with interactive web development techniques such as, for example, "Asynchronous JavaScript And XML," also referred to as Ajax. Ajax may be used to allow for interacting with a server (e.g., a server running the feedback system 702) while a current web page is loading (or has loaded). Ajax may use the XMLHttpRequest or an IFrame object to exchange data with an associated server, usually in the XML format (although other formats may be used).
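As a hedged sketch of the Ajax-style approach just described, the following uses XMLHttpRequest to fetch user response components (here assumed to be XML) after the survey page has loaded, and merges them into the page's DOM, initially hidden until the report selector is activated. The URL, tag names, attributes, and element ids are assumptions made for the example.

```typescript
// Hypothetical sketch: asynchronously load XML user response components via
// XMLHttpRequest and merge them into the already-loaded page's DOM, hidden
// until the report selector is used.

function loadUserResponseComponents(surveyId: string): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", `/feedback/userResponses?surveyId=${encodeURIComponent(surveyId)}`);
  xhr.responseType = "document"; // ask the browser to parse the XML response
  xhr.onload = () => {
    const xml = xhr.responseXML;
    if (!xml) return;
    for (const resp of Array.from(xml.getElementsByTagName("userResponse"))) {
      const queryComponentId = resp.getAttribute("queryComponentId") ?? "";
      const target = document.getElementById(queryComponentId);
      if (!target) continue;
      const reportEl = document.createElement("span");
      reportEl.className = "inline-report";
      reportEl.hidden = true; // kept hidden until the report selector is selected
      reportEl.textContent = resp.getAttribute("choice") ?? "";
      target.appendChild(reportEl); // merged into the DOM of the loaded page
    }
  };
  xhr.send();
}
```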
Still other additional or alternative techniques may be used to operate the in-line report generator 102 as described herein. For example, Dynamic Hyper-Text Mark-up Language (DHTML) techniques, ActiveX techniques, Java applets, and/or other remote/client-side scripting techniques may be used.
Once some or all of the user response components 712 have been loaded to the client (browser 722), the in-line report generator 102 may so indicate by providing the report selector 114 of
At this point, the user response components 712 may be provided within the browser 722, aligned with the corresponding survey components 710 (1114). For example, with reference to
It should be understood that the in-line report generator 102 may continue to load/aggregate user response components, even after the campaign manager 704 has selected and viewed the desired reporting elements. For example, the survey 716 may be on-going, or may be only halfway through its scheduled time for deployment. Nonetheless, the campaign manager 704 may use the in-line report generator 102 to quickly and easily view results, even at such intermediate stages, and may view changed/updated results as new user response components 712 are received.
Although the above examples have been provided for the sake of explanation, it should be understood that many other embodiments may be implemented. For example, the in-line report generator 102 may be used in virtually any data reporting or analytics scenario (e.g., including any statistic, analysis, abstraction, grouping, and/or subset of aggregated response elements). For example, such data reporting may be performed with regard to e-mails listed in a user's inbox, e.g., when the user may use in-line reporting to learn about events such as how many other users have read or forwarded a particular e-mail.
Further, although various techniques have been described, it should be understood that many other techniques may be used. For example, reporting elements may be provided by forcing or requiring a refresh of an entire page (e.g., refreshing the screenshot 200 of
Implementations of the various techniques described herein may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Implementations may be implemented as a computer program product, i.e., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable storage device or in a propagated signal, for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers. A computer program, such as the computer program(s) described above, can be written in any form of programming language, including compiled or interpreted languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
Method steps may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method steps also may be performed by, and an apparatus may be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. Elements of a computer may include at least one processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer also may include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, implementations may be implemented on a computer having a display device, e.g., a cathode ray tube (CRT) or liquid crystal display (LCD) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
Implementations may be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation, or any combination of such back-end, middleware, or front-end components. Components may be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (LAN) and a wide area network (WAN), e.g., the Internet.
While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the embodiments.