Many services are delivered to consumers via software applications. In examples, these software applications may be composite in that several software components work in combination to fulfill the service. The components themselves may be distributed across various physical and virtual devices. For instance, a smartphone, tablet, notebook or other user computing device may serve as a client side user interface component. Through that user interface component, a user may initiate a series of actions to be carried out by the user computing device and by server side components to fulfill the service.
Introduction:
For a provider of a software application, understanding the user experience and user satisfaction with the application are key factors to successful implementation. With such an understanding, the provider of the application can better evaluate the success or likely success of the software application and how to invest resources for future development. An important issue in evaluating user experience and satisfaction can be identifying failed user interactions with the application. In some situations, failed user interactions will result in a crash of the application and/or device the user is interacting with, and are therefore readily identifiable. In many cases, though, a failed user interaction with the application merely causes an error message at the device (e.g., an error message displayed in a popup or other graphic user interface element) without an application or device crash.
Typically, error detection in monitoring and testing environments has been accomplished by analyzing log messages for the subject application, e.g., spotting errors in the log according to predefined templates. However, errors found in logs do not always represent the actual error a user experienced. This can be because the log errors may reflect a problem in the code flow rather than in the user's usage flow. Further, detecting failed user interactions via error statements outside of code error logs has been challenging in that error messages displayed in popups and similar graphic user interface elements will often appear identical to non-error messages except for the text. For instance, the text may not always include “error” or another easily recognized trigger word.
To address these issues, various examples described in more detail below provide a system and a method to identify application message types utilizing a prediction model, wherein the prediction model is determined via imposition of a set of rules upon a set of user facing messages extracted from a set of software applications. In one example of the disclosure, a subject message is obtained by an application message type identification system. The subject message is a message that is to be part of a display caused by a subject software application. A prediction model is utilized to identify the subject message as a first type message or a second type message (e.g., an error message type versus non-error message type). The prediction model is a model determined based upon a set of target words determined by imposition of a set of rules upon a set of user facing messages extracted from a set of software applications, wherein each of the extracted messages was classified post-extraction as a first type message or a second type message. In turn, a communication identifying the subject message as the first type message or the second type message is provided.
In other various examples described in more detail below, a system and a method are provided to determine a prediction model to be utilized to identify application message types. In an example of the disclosure, a set of user-facing messages extracted from a set of software applications is accessed. Each of the extracted messages is a message that has been classified after extraction as a first type message or a second type message (e.g., an error type or non-error type message). Rules are applied to the accessed messages to create a set of target words. For each of the target words, a message distribution of the target word across a message is calculated, and a set distribution of the target word across the set of messages is calculated. A machine learning algorithm is in turn applied to determine a message type prediction model that is based upon the calculated message distributions and set distributions.
It should be noted that while the disclosure is discussed frequently with reference to examples wherein a prediction model is utilized to identify a subject message as a first type message or a second type message, the first type message being an error message and the second type message being a non-error message, teachings of the present disclosure are not so limited and can be applied to any first and second message types. For instance, in examples of the disclosure, a prediction model may be determined and utilized to identify a subject message as a first type message or a second type message, wherein the first message type is an understandable message, and the second type message is a non-understandable message. Other choices for the first and second message types are possible and are contemplated by this disclosure.
In this manner, examples described herein can enable providers of software applications to, in an automated and efficient manner, identify first and second type messages (e.g., error messages and non-error messages, or understandable and non-understandable messages, or other combinations of first and second type messages). Disclosed examples will enable application providers to regularly update the prediction model with new data as the set of extracted and analyzed messages is expanded, and can be integrated into other software testing products. Thus, application providers' and developers' satisfaction with products and services that evaluate software application performance utilizing the disclosed examples, and the physical and virtual devices that host or otherwise facilitate such software application evaluation services, should increase. Further, end user satisfaction with the subject software applications that are evaluated utilizing the derived prediction model, and the physical and virtual devices that are used to access or host such subject software applications, should increase.
The following description is broken into sections. The first, labeled “Environment,” describes an environment in which various examples may be implemented. The second section, labeled “Components,” describes examples of various physical and logical components for implementing various examples. The third section, labeled “Illustrative Example,” presents an example of identification of application message types. The fourth section, labeled “Operation,” describes steps taken to implement various examples.
Environment:
Link 120 represents generally any infrastructure or combination of infrastructures to enable an electronic connection, wireless connection, other connection, or combination thereof, to enable data communication between components 108-118. Such infrastructure or infrastructures may include, but are not limited to, one or more of a cable, wireless, fiber optic, or remote connections via telecommunication link, an infrared link, or a radio frequency link. For example, link 120 may represent the internet, one or more intranets, and any intermediate routers, switches, and other interfaces. As used herein an “electronic connection” refers generally to a transfer of data between components, e.g., between two computing devices, that are connected by an electrical conductor. A “wireless connection” refers generally to a transfer of data between two components, e.g., between two computing devices, that are not directly connected by an electrical conductor. A wireless connection may be via a wireless communication protocol or wireless standard for exchanging data.
Client devices 110, 112, and 114 represent generally any computing device with which a user may interact to communicate with other client devices, server device 116, and/or server devices 118 via link 120. Server device 116 represents generally any computing device to serve an application and corresponding data for consumption by components 108-118. Server devices 118 represent generally a group of computing devices collectively to serve an application and corresponding data for consumption by components 108-118.
Computing device 108 represents generally any computing device with which a user may interact to communicate with client devices 110-114, server device 116, and/or server devices 118 via link 120. Computing device 108 is shown to include core device components 122. Core device components 122 represent generally the hardware and programming for providing the computing functions for which device 108 is designed. Such hardware can include a processor and memory, a display apparatus 124, and a user interface 126. The programming can include an operating system and applications. Display apparatus 124 represents generally any combination of hardware and programming to exhibit or present a message, image, view, or other presentation for perception by a user, and can include, but is not limited to, a visual, tactile or auditory display. In examples, the display apparatus 124 may be or include a monitor, a touchscreen, a projection device, a touch/sensory display device, or a speaker. User interface 126 represents generally any combination of hardware and programming to enable interaction between a user and device 108 such that the user may effect operation or control of device 108. In examples, user interface 126 may be, or include, a keyboard, keypad, or a mouse. In some examples, the functionality of display apparatus 124 and user interface 126 may be combined, as in the case of a touchscreen apparatus that may enable presentation of images at device 108, and that also may enable a user to operate or control functionality of device 108.
System 102, discussed in more detail below, represents generally a combination of hardware and programming to enable identification of application message types. In some examples, system 102 may be wholly integrated within core device components 122. In other examples, system 102 may be implemented as a component of any of computing device 108, client devices 110-114, server device 116, or server devices 118 where it may take action based in part on data received from core device components 122 via link 120.
In other examples, system 102 may be distributed across computing device 108, and any of client devices 110-114, server device 116, or server devices 118. For instance, in an example system 102 may include a model component that operates on server device 116 (or one or more other devices shown or not shown in
In a particular example, message engine 202, identification engine 204, and communication engine 206 may be included in a message type identification computer system 102 hosted at computing device 108, wherein the accessed subject message is a message for display when the subject software application is executed at a client computer system, e.g., a mobile client device 114. In this particular example, the communication may be provided to a developer computer system, e.g., at server 116, that will utilize the communication to determine a user experience rating for the application.
Components:
In an example, message engine 202 represents generally a combination of hardware and programming to obtain a subject message, the subject message for a display caused by a subject software application. As used herein, a “message” refers generally to any communication, and is not meant to be limited to text or a character string. As used herein, “display” refers generally to an exhibition or presentation caused by a computer for the purpose of perception by a user. In an example, a display may be or include a GUI display to be presented at a computer monitor, touchscreen, or other electronic display device. As used herein, “software application” and “application” are used synonymously, and refer generally to a web application, mobile application, software application, firmware application, or other programming that executes at, or is accessible at, a computing device.
Continuing with
The model that is utilized by identification engine 204 to identify the obtained subject message as a first or second type message is a model determined based upon a set of target words. The target words set is a set of words that was determined by imposition of a set of rules upon a set of user facing messages extracted from a set of software applications, wherein each of the extracted messages was classified post-extraction as a first type message or a second type message. In an example, the set of extracted messages is a set of messages that were presented to one or more users for manual classification as a first type versus second type message (e.g., an error message versus a non-error message, or as an understandable versus non-understandable message). In an example, the set of software applications from which the user-classified messages were extracted is a set of applications that does not include the subject application. In an example, the set of extracted messages is a set of messages that were extracted from the set of software applications via execution of a script or scripts that interacted with software applications in the set. As used herein, a “script” refers generally to any computer programs, including, but not limited to, small programs (e.g., up to a few thousand lines of code) and/or programs written in domain-specific languages.
Continuing with
In an example of the disclosure, the prediction model utilized to identify the subject message as a first or second type message is a model determined according to a process that included the imposition of a set of rules imposed upon the extracted user facing messages, the rules including stemming words in the extracted messages to root representations of the words. In an example of stemming, the words “recovery”, “recoverable”, and “recovered” might be stemmed to a “recover” root representation of the words.
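By way of illustration, the stemming rule described above might be sketched in Python as follows. The suffix list here is a simplified, hypothetical stand-in for an established stemming algorithm (e.g., Porter's), chosen so that “recovery”, “recoverable”, and “recovered” reduce to the “recover” root of the example:

```python
# Minimal, illustrative suffix-stripping stemmer; a production system
# would typically use an established algorithm such as Porter's.
SUFFIXES = ("able", "ed", "ing", "y", "s")

def stem(word):
    """Reduce a word to a rough root representation."""
    for suffix in SUFFIXES:
        # Only strip when enough of the word remains to be a root.
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# "recovery", "recoverable", "recovered" all reduce to "recover"
print([stem(w) for w in ["recovery", "recoverable", "recovered"]])
```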
In an example, the prediction model is a model determined according to a process that included a rule of removing stop words from extracted messages. As used herein, a “stop word” refers generally to a specific word, or specific phrase, which is to be filtered out due to being not helpful in differentiating first and second type messages. In different situations, any group of words can be chosen as the stop words for a given purpose. In an example, stop words may include common, short function words, such as “the”, “is”, “at”, “which”, and so on.
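The stop word removal rule might be sketched as follows; the stop list shown is illustrative only, since, as noted above, any group of words can be chosen as the stop words for a given purpose:

```python
# Illustrative stop-word filter; the stop list is a toy example.
STOP_WORDS = {"the", "is", "at", "which", "of", "a", "an", "on"}

def remove_stop_words(tokens):
    """Drop tokens deemed unhelpful in differentiating message types."""
    return [t for t in tokens if t.lower() not in STOP_WORDS]

print(remove_stop_words(["Failed", "Out", "of", "memory"]))
# → ['Failed', 'Out', 'memory']
```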
Continuing with
In an example, the prediction model is a model determined according to a process that included imposing a rule wherein, for each of the extracted user facing messages, a bag of words was created to represent the extracted message as a vector of the target words included within the message. As used herein, a “bag of words” refers generally to a simplifying representation wherein text (e.g., an extracted message) is represented as a collection or vector of its words. In an example, the bag of words may be a representation or vector of an extracted message that disregards grammar and word order of the message as extracted. In an example, the bag of words may be a representation or vector of an extracted message that disregards grammar and word order of the message as extracted, yet maintains multiplicity.
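A bag-of-words vector of the kind described above might be sketched as follows; the vocabulary and message tokens are illustrative. Note that word order and grammar are discarded while multiplicity is preserved:

```python
from collections import Counter

# Sketch: represent a message as a count vector over a fixed
# target-word vocabulary, disregarding grammar and word order
# but keeping multiplicity.
def bag_of_words(tokens, vocabulary):
    counts = Counter(tokens)
    return [counts[word] for word in vocabulary]

vocab = ["fail", "memory", "try", "again", "save"]
message = ["fail", "memory", "try", "again", "try"]
print(bag_of_words(message, vocab))  # → [1, 1, 2, 1, 0]
```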
Continuing with
In a particular example, identification engine 204 in utilizing the prediction model to identify the subject message as a first type message or a second type message may include one or more of the steps of stemming words of the subject message, removing stop words from the subject message, and creating a bag of words for the subject message. In an example, identification engine 204 may, for words in the subject message, apply the determined prediction model to the words to determine the probabilities that the message is a first type message (e.g., an error message) and that the message is a second type message (e.g., a non-error message). In an example, identification engine 204 will classify the subject message as the first type message if the calculated probability that the message is a first type message is greater than the calculated probability that the message is a second type message (e.g., a probability the subject message is not a first type message).
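The decision step described above might be sketched as follows. The `predict_probabilities` function is a hypothetical stand-in for the trained prediction model, and the toy model shown is illustrative only:

```python
# Classify as the first type (e.g., error) when its model probability
# exceeds the second type's probability.
def classify(message_words, predict_probabilities):
    p_first, p_second = predict_probabilities(message_words)
    return "first type" if p_first > p_second else "second type"

# Toy stand-in model: messages containing "fail" lean toward
# the first (error) type.
def toy_model(words):
    return (0.9, 0.1) if "fail" in words else (0.2, 0.8)

print(classify(["fail", "memory"], toy_model))  # → first type
print(classify(["save"], toy_model))            # → second type
```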
Continuing with
In particular examples, the communication provided by communication engine 206 is to be utilized to determine a user experience rating for the application. As used herein, a “user experience rating” refers generally to a rating of user behaviors, attitudes, and/or emotions with respect to use of a product, system or service, e.g., use of the subject application. In examples, a user experience rating may include user perceptions of system aspects such as utility, ease of use and efficiency. In one example, communication engine 206 may determine the user experience rating utilizing the communication, and provide the user experience rating for the application for display to a user. In another example, communication engine 206 may send the communication to a distinct application performance evaluation application, such that the application performance evaluation application can determine a user experience rating for the application utilizing the communication and create the display with the user experience rating for user viewing.
In certain examples of the disclosure, message engine 202 may obtain via a network, e.g., link 120 (
In an example, access engine 208 represents generally a combination of hardware and programming to access a set of user-facing messages extracted from a set of software applications. Each of the extracted messages is a message that was classified after extraction, e.g., via a user assigning a classification on a message by message basis, as a first type message or a second type message. In one example the first type message may be an error type message and the second type message may be a non-error type message. In another example, the first type message may be an understandable message and the second type message may be a non-understandable message.
Continuing with
Continuing with
Continuing at
In an example, the determined prediction model is to, when implemented, provide an output of two probabilities: a probability for the subject message to be a first type message (e.g., an error message or an understandable message) if the subject message contains a specific attribute or word, and also a probability that the subject message that contains the attribute or word is of the second type (e.g., a non-error message or a non-understandable message). In a particular example, identification engine 204 (
With reference back to
In the foregoing discussion of
Memory resource 302 represents generally any number of memory components capable of storing instructions that can be executed by processing resource 304. Memory resource 302 is non-transitory in the sense that it does not encompass a transitory signal but instead is made up of one or more memory components to store the relevant instructions. Memory resource 302 may be implemented in a single device or distributed across devices. Likewise, processing resource 304 represents any number of processors capable of executing instructions stored by memory resource 302. Processing resource 304 may be integrated in a single device or distributed across devices. Further, memory resource 302 may be fully or partially integrated in the same device as processing resource 304, or it may be separate but accessible to that device and processing resource 304.
In one example, the program instructions can be part of an installation package that when installed can be executed by processing resource 304 to implement system 102. In this case, memory resource 302 may be a portable medium such as a CD, DVD, or flash drive or a memory maintained by a server from which the installation package can be downloaded and installed. In another example, the program instructions may be part of an application or applications already installed. Here, memory resource 302 can include integrated memory such as a hard drive, solid state drive, or the like.
In
In
In an example, a system 102 accesses a set of user-facing messages 404 extracted from a set of software applications, wherein each of the extracted messages 404 was classified after extraction as an “error message” or a “non-error message.” For instance, system 102 accesses a first extracted message 426 (“Failed. Out of memory. Please try again later.”) that was user-classified as an error message.
System 102 applies rules 406 to the accessed messages to create a set of target words 408. For instance, at 428, system 102 applies a rule to stem the word “Failed” in the first extracted message to its root representation “Fail.” At 430, system 102 applies a rule to remove a stop word “of” from the first extracted message 426.
At 432, system 102 imposes a rule 406 wherein a bag of words 432 is created to represent an extracted message 426 (referred to in this example without limitation as to extraction order as the “first extracted message 426”) as a vector of the target words 408 “Please”, “later”, “again”, “try”, “Out”, “Fail”, and “memory” that are included within the first extracted message 426. In this example the bag of words 432 disregards grammar and word order of the message as extracted.
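The rules 406 applied to the first extracted message 426 in this illustrative example might be sketched end to end as follows; the stemming table and stop list here are simplified stand-ins for the fuller rules described above:

```python
import re

# End-to-end sketch of the rules applied to the first extracted
# message: tokenize, stem, and remove stop words.
STOP_WORDS = {"of"}            # toy stop list
STEMS = {"failed": "fail"}     # toy stemming table

def preprocess(message):
    tokens = re.findall(r"[a-z]+", message.lower())
    tokens = [STEMS.get(t, t) for t in tokens]
    return [t for t in tokens if t not in STOP_WORDS]

bag = preprocess("Failed. Out of memory. Please try again later.")
print(bag)
# → ['fail', 'out', 'memory', 'please', 'try', 'again', 'later']
```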
Continuing at
System 102 in turn applies a machine learning algorithm 414 to determine a message type prediction model 416 based upon the calculated message distributions 410 and set distributions 412 for each of the determined target words 408. In an example, system 102 may apply one of the Naïve Bayes or logistic regression learning models. In this example, the resulting prediction model 416 is a model wherein the model, when utilized to determine whether an obtained subject message is an error message or a non-error message, will return two probabilities: a first probability 436 that the subject message is an error message and a second probability 438 that the subject message is a non-error message.
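A Naïve Bayes learner of the kind mentioned above might be sketched as follows. The hand-rolled multinomial model and the labeled training bags of words are illustrative only; a production system might instead use an established machine learning library:

```python
from collections import Counter

# Toy labeled training data: (bag of words, classification).
train = [
    (["fail", "memory"], "error"),
    (["fail", "connect"], "error"),
    (["save", "complete"], "non-error"),
]

def fit(data):
    """Collect per-class word counts, class counts, and vocabulary."""
    word_counts = {"error": Counter(), "non-error": Counter()}
    class_counts = Counter()
    vocab = set()
    for words, label in data:
        word_counts[label].update(words)
        class_counts[label] += 1
        vocab.update(words)
    return word_counts, class_counts, vocab

def predict(words, model):
    """Return normalized probabilities for both message types."""
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    scores = {}
    for label in class_counts:
        # Class prior times Laplace-smoothed word likelihoods.
        score = class_counts[label] / total
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in words:
            score *= (word_counts[label][w] + 1) / denom
        scores[label] = score
    norm = sum(scores.values())
    return {label: s / norm for label, s in scores.items()}

model = fit(train)
probs = predict(["fail", "timeout"], model)
print(max(probs, key=probs.get))  # → error
```

The two values returned by `predict` correspond to the first probability 436 and second probability 438 described above.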
Continuing at
In one example, system 102 may provide the communication to a user experience component at system 102, which in turn may determine a user experience score 424 for the subject software application 434 for user display. In another example, system 102 may provide the communication 422 to a computer system separate from system 102. In an example, the separate computer system may in turn determine a user experience score for the subject software application 434.
Operation:
A prediction model is utilized to identify the subject message as a first type message or a second type message. The prediction model is a model determined based upon a set of target words determined by imposition of a set of rules upon a set of user facing messages extracted from a set of software applications. Each of the extracted messages is a message that was classified post-extraction as a first type message or a second type message (block 504). Referring back to
A communication identifying the subject message as the first type message or the second type message is provided (block 506). Referring back to
Rules are applied to the accessed messages to create a set of target words (block 604). Referring back to
For each of the target words, a message distribution of the target word across a message is calculated, and a set distribution of the target word across the set of messages is calculated (block 606). Referring back to
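The two distributions described above might be sketched as follows, interpreting the “message distribution” as the target word's frequency within a single message and the “set distribution” as the fraction of messages in the set containing the word; this interpretation and the sample data are illustrative:

```python
# Illustrative calculation of the two distributions for a target word.
def message_distribution(word, message):
    """Frequency of the word within a single tokenized message."""
    return message.count(word) / len(message)

def set_distribution(word, messages):
    """Fraction of messages in the set that contain the word."""
    return sum(word in m for m in messages) / len(messages)

messages = [
    ["fail", "memory", "fail"],
    ["save", "complete"],
    ["fail", "connect"],
]
print(message_distribution("fail", messages[0]))  # 2 of 3 tokens
print(set_distribution("fail", messages))          # 2 of 3 messages
```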
A probabilistic classifier is applied to determine a message type prediction model based upon the calculated message distributions and set distributions (block 608). Referring back to
Although the flow diagrams of
The present disclosure has been shown and described with reference to the foregoing examples. It is to be understood, however, that other forms, details and examples may be made without departing from the spirit and scope of the invention that is defined in the following claims. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2014/071834 | 12/22/2014 | WO | 00