The present disclosure relates to electronic systems. More specifically, embodiments of the disclosure relate to a system and method for improving and applying technology related to displaying and analyzing user interface variants for concurrent analysis by a user.
Electronic systems utilize user interfaces (UIs) for a variety of purposes, including displaying, observing, and collecting data. User interface designers and developers (collectively "developers") often modify user interfaces to reflect updates in rendered data, accommodate changes in data collection, refresh the look and feel of the interface to maintain engagement with a user, add new functionality, and so on. However, changes to any UI can have multiple intended and sometimes unintended consequences, and such consequences can be difficult to detect, especially when considering multiple modified UI design choices.
A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to concurrently display images of variant user interfaces and corresponding metrics for concurrent comparison. At least one embodiment includes a computer-implemented method including: generating an experiment including multiple variants of a user interface, the experiment further including metrics that are to be monitored for the multiple variants, where the metrics are configured to provide data relating to interactions between a variant user interface and a client; deploying the multiple variants of the user interface; monitoring the metrics for the multiple variants of the user interface; retrieving the metrics associated with the multiple variants; and presenting images of the variants and corresponding real-time metrics for concurrent comparison on a display in response to user commands. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
At least one embodiment includes a system including a processor; a data bus coupled to the processor; and a non-transitory, computer-readable storage medium embodying computer program code, the non-transitory, computer-readable storage medium being coupled to the data bus, the computer program code including instructions executable by the processor for: generating an experiment including multiple variants of a user interface, the experiment further including metrics that are to be monitored for the multiple variants, where the metrics are configured to provide data relating to interactions between a variant user interface and a client; deploying the multiple variants of the user interface; monitoring the metrics for the multiple variants of the user interface; retrieving the metrics associated with the multiple variants; and presenting images of the variants and corresponding real-time metrics for concurrent comparison on a display in response to user commands.
At least one embodiment includes a non-transitory, computer-readable storage medium embodying computer program code, the computer program code may include computer executable instructions configured for: generating an experiment including multiple variants of a user interface, the experiment further including metrics that are to be monitored for the multiple variants, where the metrics are configured to provide data relating to interactions between a variant user interface and a client; deploying the multiple variants of the user interface; monitoring the metrics for the multiple variants of the user interface; retrieving the metrics associated with the multiple variants; and presenting images of the variants and corresponding real-time metrics for concurrent comparison on a display in response to user commands. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
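By way of non-limiting illustration, the claimed sequence of operations (generate an experiment with variants and metrics, deploy, monitor, retrieve, present) may be sketched as follows. The names used here (`Experiment`, `run_experiment`, and the toy deploy/monitor callbacks) are hypothetical and are not prescribed by the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class Experiment:
    """Hypothetical experiment definition: variant user interfaces
    plus the metrics to be monitored for each deployed variant."""
    name: str
    variants: list          # identifiers of the variant user interfaces
    metrics: list           # metric names monitored for every variant
    results: dict = field(default_factory=dict)

def run_experiment(exp, deploy, monitor):
    """Deploy each variant, monitor its metrics, and collect the
    results for later concurrent presentation on a display."""
    for variant in exp.variants:
        deploy(variant)                       # e.g., push to cloud servers
        exp.results[variant] = monitor(variant, exp.metrics)
    return exp.results

# Toy callbacks standing in for the cloud platform and metric store.
deployed = []
results = run_experiment(
    Experiment("homepage-test", ["A", "B"], ["click_rate"]),
    deploy=deployed.append,
    monitor=lambda v, m: {name: 0.0 for name in m},
)
```

The presentation step (images plus real-time metrics on a display) is omitted here, since it depends on the display hardware of the embodiment.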
The present disclosure may be better understood, and its numerous objects, features and advantages made apparent to those skilled in the art by referencing the accompanying drawings. The use of the same reference number throughout the several figures designates a like or similar element.
Systems and methods are disclosed for presenting images of multiple, variant user interfaces along with corresponding metrics for concurrent comparison by a user. Client interfaces allow a client to interact with an underlying system and/or application. Examples of client interfaces include webpages that are presented to a client in a client/server system. Certain styles of client interfaces may be more effective at driving user engagement with client side applications when compared to other client interface designs, even when the client interfaces are designed for similar purposes. The effectiveness of a client interface depends on such things as types and placement of interface images on the interface, types and placement of interface controls, types and placement of menus, types and placement of instructional information on the interface, etc. Embodiments of the disclosed system and method allow user interface designers to directly compare images of variants of a user interface and corresponding metrics with images of one or more other variants on a display. In at least one embodiment, an image of a variant and corresponding metrics are designated as a control against which images of other variants and corresponding metrics are directly compared on the display.
For purposes of this disclosure, an information handling system may include any instrumentality or aggregate of instrumentalities operable to compute, classify, process, transmit, receive, retrieve, originate, switch, store, display, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data for business, scientific, control, or other purposes. For example, an information handling system may be a personal computer, a network storage device, or any other suitable device and may vary in size, shape, performance, functionality, and price. The information handling system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of non-volatile memory. Additional components of the information handling system may include one or more disk drives, one or more network ports for communicating with external devices as well as various input and output (I/O) devices, such as a keyboard, a mouse, and a video display. The information handling system may also include one or more buses operable to transmit communications between the various hardware components.
The information handling system 100 likewise includes system memory 112, which is interconnected to the foregoing via one or more buses 114. System memory 112 may be local memory, remote memory, memory distributed between multiple information handling systems, etc. System memory 112 further comprises an operating system 116 and in various embodiments may comprise other software modules and engines configured to implement certain embodiments of the disclosed system.
In the example shown in
In certain embodiments, variant selection engine 122 is also employed to select which metrics to gather for variants in the variant pool 124. In certain embodiments, each variant is assigned an associated variant metric. In at least one embodiment, Variant metrics(0) through Variant metrics(n) are respectively associated with Variant(0) through Variant(n). In at least one embodiment, a single set of metrics is assigned to all variants used in the experiment so that a comparative analysis of the effectiveness of the variants uses the same metrics. In at least one embodiment, when a single set of metrics is assigned to all variants, the variant pool 124 need not include multiple sets of variant metrics. Rather, in at least one embodiment, the variant metrics used across all variants can be assigned in a single selection operation using, for example, the variant selection engine 122 and stored with the experiment parameters as opposed to being stored with each variant.
In at least one embodiment, system 118 may include a variant deployment engine 126 configured to deploy the variants for the experiment to the cloud platform 142. In at least one embodiment, the variant deployment engine 126 is used to define a deployment schedule for the variants of the experiment. The variant deployment engine 126 may also be configured to execute the actual deployment of the variants to the cloud platform 142. In certain embodiments, different variants are deployed for concurrent use on multiple cloud servers of the cloud platform 142. As an example, different variants may be deployed to different cloud servers so that the variant provided to a client is dependent on the servers on which the variant resides. In certain embodiments, the same variants are deployed in a time series manner so that a single variant is used concurrently at all cloud servers. As an example, a single variant may be deployed to all of the servers so that a single variant is provided to all clients accessing the cloud platform 142 independent of the servers on which the variant resides. As an extension of this example, all servers of the cloud platform 142 may initially operate with a first variant under a first set of operating conditions (e.g., set period of time) and operate with a second variant under a second set of operating conditions.
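The two deployment strategies described above (server-partitioned versus time-series) may be sketched as follows. The function names and the round-robin assignment policy are illustrative assumptions, not requirements of the variant deployment engine 126:

```python
def assign_by_server(variants, servers):
    """Server-partitioned deployment: each cloud server hosts one
    variant, so the variant a client sees depends on which server
    the client reaches. Variants are spread round-robin."""
    return {srv: variants[i % len(variants)] for i, srv in enumerate(servers)}

def assign_time_series(variants, periods):
    """Time-series deployment: during each operating period a single
    variant is live on all servers, and the variant changes between
    periods (e.g., first variant for a set time, then the second)."""
    return [(period, variants[i % len(variants)])
            for i, period in enumerate(periods)]
```

With two variants and four servers, `assign_by_server` alternates the variants across servers; with two variants and two periods, `assign_time_series` runs each variant in its own period.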
At least one embodiment includes a variant metrics analytics engine 134. In certain embodiments, the analytics engine 134 is configured to gather metrics data for the variants of the experiment from the servers of the cloud platform 142. In certain embodiments, the analytics engine 134 aggregates and formats the metrics data associated with the variants of the experiment for display at a display of the system 100.
Certain embodiments also include a variant image selection engine 136. In at least one embodiment, the variant image selection engine 136 allows a user to select images of the variants used in the experiment. In certain embodiments, the images of a variants and corresponding metrics are concurrently displayed to a user thereby allowing the user to compare the variants and corresponding metrics with one another on the same display.
In certain embodiments, different variants are concurrently used at different cloud servers. In at least one embodiment, a load distributor 217 is used to evenly distribute traffic between the cloud servers so that cloud servers utilizing different variants of the same client interface receive substantially the same traffic flow. Distributing the traffic flow equally between the cloud servers assists in ensuring that the metrics gathered by different cloud servers for the same variant may be properly compared. As an example, the load distributor 217 may direct an equal amount of traffic between multiple cloud servers employing the same variant interfaces so that the metrics gathered at one server are based on the same amount of traffic directed to other servers.
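One way to realize the even traffic distribution performed by the load distributor 217 is simple round-robin routing, sketched below. The `make_distributor` helper is a hypothetical illustration; real load distributors may also weigh server capacity or session affinity:

```python
import itertools

def make_distributor(servers):
    """Round-robin load distributor: successive requests cycle through
    the servers, so each server receives substantially the same traffic
    volume and per-server metrics remain comparable."""
    cycle = itertools.cycle(servers)
    return lambda request: (next(cycle), request)

route = make_distributor(["s0", "s1", "s2"])
targets = [route(i)[0] for i in range(6)]   # each server receives 2 of 6 requests
```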
The exemplary environment 200 shown in
In at least one embodiment, parameters for the experiment are defined at operation 304. In certain embodiments, a user utilizing variant selection engine 122 defines the experiment by selecting multiple variants that are to be included in the experiment as well as the metrics that are to be monitored for the variants. At operation 306, system 118 deploys the variants and metrics of the experiment to, for example, one or more servers of Cloud servers(0)-(n). As the variants are accessed by clients, the corresponding metrics are monitored at operation 308.
In certain embodiments, with system 118, a determination is made at operation 310 as to whether the experiment is to be ended. In at least one embodiment, the experiment is ended after all of the variants have been deployed and used for a predetermined period of time. In at least one embodiment, the experiment is ended after a predetermined amount of traffic has been received at a server hosting the variants. Other criteria may be used at operation 310 to terminate the experiment. If the criteria for ending the experiment are not met at operation 310, the metrics continue to be monitored at operation 308.
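The termination check at operation 310 can be sketched as a predicate over whichever criteria the embodiment configures (elapsed time, traffic volume, or both). The function name and parameters are illustrative assumptions:

```python
def experiment_ended(elapsed_s, traffic_count, max_seconds=None, max_traffic=None):
    """Decision at operation 310: end the experiment when any
    configured criterion is met -- a predetermined duration or a
    predetermined amount of received traffic. If no criterion is
    met (or configured), monitoring continues at operation 308."""
    if max_seconds is not None and elapsed_s >= max_seconds:
        return True
    if max_traffic is not None and traffic_count >= max_traffic:
        return True
    return False
```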
Once the experiment has ended at operation 310, variant metrics analytics engine 134 gathers and analyzes the metrics for the variants at operation 312. A variety of metrics may be gathered and analyzed. As an example, the metrics may include the percentage of times users opted to activate a control on the variant to proceed to another webpage, such as a survey page. As another example, the metrics may include sales of items or services presented on the variant. Other metrics may also be used to provide a comparative assessment between variants.
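The analysis at operation 312 can be illustrated with the first example metric above: the percentage of times a client activated a monitored control. The helper names and the ranking step are illustrative assumptions about how analytics engine 134 might aggregate metrics:

```python
def click_through_rate(activations, impressions):
    """Percentage of sessions in which the client activated the
    monitored control on the variant (e.g., proceeded to a survey
    page). Guards against division by zero for unvisited variants."""
    return 0.0 if impressions == 0 else 100.0 * activations / impressions

def compare_variants(stats):
    """Rank variants by click-through rate, highest first, for the
    comparative assessment between variants."""
    rates = {v: click_through_rate(a, i) for v, (a, i) in stats.items()}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)
```

For example, a variant with 40 activations in 100 impressions ranks above one with 25 in 100.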
At operation 314, at least one embodiment of system 218 presents the images of the variants and corresponding metrics for direct comparison on a display in response to user commands. As an example, images of multiple variants as viewed by the client and the corresponding metrics for the multiple variants are presented on the display for direct comparison by a user. As an example, images of multiple variants and corresponding metrics may be concurrently presented on the display in a side-by-side manner. As another example, an image of a control may be fixed on the display as a user sequences through images of other variants so that images of one or more of the other variants are displayed concurrently with the control.
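The second display mode above, in which a control stays fixed while the user sequences through the other variants, amounts to pairing the control with each remaining variant. A minimal sketch, with the function name assumed for illustration:

```python
def control_comparisons(control, variants):
    """Pair the designated control with each other variant, so the
    control's image and metrics remain on the display while the user
    sequences through the remaining variants one at a time."""
    return [(control, v) for v in variants if v != control]
```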
System 118 deploys the variants for the experiment to one or more cloud servers at operation 408. In at least one embodiment, all of the variants are deployed concurrently and the metrics for the deployed variants are also monitored concurrently at operation 410. In at least one embodiment, the metrics are monitored at operation 410 until a determination is made at operation 412 that the experiment with the currently deployed variants is complete. If the experiment is completed, the results of the experiment may be stored at operation 414. As an example, the results of the monitoring of the metrics for each of the deployed variants may be stored in electronic memory for retrieval and analysis.
In at least one embodiment, the user names the experiment at operation 502 and sets the parameters for the experiment at operation 504. In this example, a variant that is to be deployed is selected at operation 504 along with the metrics that are to be monitored for the variants in the experiment. In at least one embodiment, the metrics may include the predefined duration of time that the variant is to be utilized at one or more cloud servers. At operation 506, variant selection engine 122 determines whether there are more variants that are to be employed in the experiment. If more variants are to be employed, the additional variant is selected by the user at operation 504. In some embodiments, if a different set of metrics are to be monitored for the additional variant, the metrics for the additional variant are also defined at operation 504. In certain embodiments, operation 504 and operation 506 are executed until all variants have been included for the defined experiment.
In at least one embodiment, at operation 508 variant deployment engine 126 deploys one or more variants along with the metrics that are to be monitored for the deployed variant(s). The metrics for a variant are monitored at operation 510 until a timeout, based on the predetermined time, is reached at operation 512. If the predetermined time for the monitoring of the metrics of the deployed variant has not elapsed, at operation 510, the variant metrics analytics engine continues to monitor the metrics for the variant. If the predetermined time for monitoring the metrics of the deployed variant has been reached, system 118 at operation 514 determines whether there are more variants that are to be deployed as part of the experiment. If more variants are to be deployed, the next variant in the sequence is deployed at operation 508 by variant deployment engine 126 and the metrics for the deployed variant are monitored by variant metrics analytics engine 134 at operation 510 until the time for monitoring the variant has elapsed. Once all of the variants for the experiment have been deployed and monitored, the results for the experiment, such as the results of monitoring the metrics for each variant are stored at operation 516.
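The sequential deploy-and-monitor loop of operations 508-516 can be sketched as follows. To keep the example deterministic, a fixed number of monitoring steps stands in for the predetermined monitoring duration; the function name and callbacks are illustrative assumptions:

```python
def run_sequential(variants, deploy, monitor_step, steps_per_variant):
    """Operations 508-516 sketched: deploy one variant at a time,
    monitor its metrics until the per-variant budget is exhausted
    (standing in for the predetermined-time timeout at operation 512),
    then deploy the next variant, storing results for each."""
    results = {}
    for variant in variants:
        deploy(variant)                                    # operation 508
        results[variant] = [monitor_step(variant)          # operation 510
                            for _ in range(steps_per_variant)]
    return results                                         # stored at 516

# Toy callbacks: record deployments; each monitoring step returns a sample.
deployed = []
stored = run_sequential(["A", "B"], deployed.append, lambda v: v.lower(), 2)
```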
In certain embodiments, the variants and metrics used in the experiment are retrieved at operation 604. Images of the variants used in the experiment are selected and/or retrieved at operation 605. At operation 606, the user may provide commands to the variant selection engine 122 to retrieve the images of the variants and metrics that are to be concurrently displayed for comparison. In certain embodiments, the user may identify success criterion at operation 608 whereby an image corresponding to a variant meeting a predetermined threshold of effectiveness is flagged at the display for the user. At operation 610, the variant selection and deployment system presents, via user interface 138, the images of the variants and corresponding metrics for the variants to the user for direct comparison on a display in response to the user commands that were executed at operation 606.
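The flagging of effective variants at operation 608 reduces to a threshold test over the monitored metric. A minimal sketch, with the function name, metric key, and threshold value assumed for illustration:

```python
def flag_successes(metrics_by_variant, metric, threshold):
    """Operation 608 sketched: mark each variant whose monitored
    metric meets the user-defined effectiveness threshold, so its
    image can be flagged at the display for the user."""
    return {v: m[metric] >= threshold for v, m in metrics_by_variant.items()}
```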
In certain embodiments, the user may identify an image for a control variant at screen region 706. In certain embodiments, the control corresponds to a variant against which other variants may be compared. In the example shown in
Embodiments of the disclosure are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The disclosed system is well adapted to attain the advantages mentioned as well as others inherent therein. While the present invention has been depicted, described, and is defined by reference to particular embodiments of the invention, such references do not imply a limitation on the invention, and no such limitation is to be inferred. The invention is capable of considerable modification, alteration, and equivalents in form and function, as will occur to those ordinarily skilled in the pertinent arts. The depicted and described embodiments are examples only, and are not exhaustive of the scope of the invention.
The present application claims priority to U.S. Provisional Patent Application No. 63/030,287, filed May 26, 2020, entitled “Systems and Methods to Enhance Technology Including Employment and Security Related Technology”, which is incorporated herein by reference.