The disclosure generally relates to the field of data processing, and more particularly to determining aspects of program performance.
Product management solutions address requirements of developers and information technology (IT) managers to collect and analyze performance information for program products. Controlled testing used during development phases provides information regarding fundamental product operation and performance. However, expanding numbers and varieties of applications and host platform environments, such as mobile device processing environments, require more comprehensive and flexible performance monitoring solutions and architectures. To address the foregoing issues, program monitoring solutions may employ components for directly collecting application performance data and processing results that are displayed using views that aid developers and IT managers in efficiently determining and understanding operating conditions and trends for various aspects of the application(s) being monitored.
In addition to directly measured program performance information, user feedback information is frequently utilized to facilitate program code product development and support by providing insight into user-centric, qualitative aspects of program performance. User feedback information is particularly important in evaluating performance of a program. However, methods and systems for obtaining precise and accurate user feedback are typically costly and sometimes insufficiently flexible for effective deployment in program development and modification cycles that are increasingly incremental and continuous between major product version releases.
Embodiments of the disclosure may be better understood by referencing the accompanying drawings.
The description that follows includes example systems, methods, techniques, and program flows that embody embodiments of the disclosure. However, it is understood that this disclosure may be practiced without some of these specific details. In other instances, well-known instruction instances, protocols, structures and techniques have not been shown in detail in order not to obfuscate the description.
Overview
Disclosed embodiments include methods, devices, and systems that utilize reference document user interface (UI) activity, also referred to as interaction events detected by a UI, to identify or otherwise determine performance issues relating to user experience during operation of a program. As utilized herein a “program” or “program product” refers to one or more sets of individually or collectively compiled instructions that are executable by a computer. For example, a program may refer to multiple programs that are statically linked and therefore collectively compiled and executed. A program may also or alternatively refer to multiple programs one or more of which are dynamically linked and therefore independently compiled and called or otherwise linked during execution. The program may be an application program under test such as a database application that includes components and multi-component features that may be individually assessed such as by users during performance testing cycles. A reference document that describes the program is an electronic document such as an operation manual. The reference document is formatted in accordance with an underlying electronic document format to include multiple sub-sections, referred to alternately as document elements or reference elements of the document.
The components and multi-component features of the program, referred to alternately as program elements, are pre-selected to be indexed or otherwise associated with a set of the reference elements. The indexing may include recording associations between program elements and reference elements based on a quantitative and/or qualitative analysis of the descriptive correlation between the reference elements and corresponding program elements. The indexing is performed by a performance classification system that further comprises components that leverage the indexing information during classifier training and classification operations to more precisely, accurately, and efficiently determine performance classifications for programs. Such components may include a training data generator that collects quantitative results in the form of accumulated interaction based event metrics associated with contemporaneous or otherwise operationally associated program element operation metrics.
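As one non-limiting sketch (the element IDs and associations below are purely illustrative, not taken from the disclosure), such an index may be realized as a simple mapping from program element IDs to the reference element IDs with which they are associated:

```python
# Hypothetical index record: program element IDs mapped to the reference
# element IDs pre-selected as descriptively correlated with them.
program_reference_index = {
    "PE1": ["RE1.1", "RE1.3"],
    "PE2": ["RE1.1", "RE2.2"],
    "PE3": ["RE2.3", "RE3.1"],
}

def reference_elements_for(program_element_id):
    """Look up the reference elements indexed to a program element."""
    return program_reference_index.get(program_element_id, [])
```

A lookup structure of this kind allows the classifier training and classification components to join interaction event metrics with the operation metrics of the associated program elements.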
The training data generator further collects qualitative results in the form of user-specified program element performance classifications that may be used as supervisor values when associated with the combinations of operational metrics and interaction based event metrics. A pattern recognition trainer component processes one or more training-cycle-specific sets of quantitative and qualitative training data to configure pattern recognition code for a performance classifier. The performance classifier may be a program extension, such as a plugin, called by a performance test system to determine performance classifications of one or more program elements of a program based, at least in part, on patterns of reference element interaction based events.
Example Illustrations
Client device 104 may be a compact and mobile computing/networking device or a highly integrated computer platform such as a personal computer. In addition to a network interface, client device 104 includes a main processor 116 and an associated system memory 118 that stores data and system and application software including an application program 122 and a reference document application program 124. In combination, processor 116 and memory 118 provide information processing capability necessary for network communications and furthermore to enable client device 104 to perform other information handling tasks related to, incidental to, or unrelated to the methods described herein. An operating system (OS) 120 executed from system memory 118 may be a flexible, multi-purpose OS and may generally comprise code for managing and providing services to hardware and software components within client device 104 to enable program execution and input/output functions.
Program 122 may be any of a variety of application program types such as a database, a system management application, a code development application, etc. Reference application 124 is a program for generating, storing, rendering and otherwise processing an electronic reference document 125, which is generated, stored, and accessed as a distinct file. For example, reference application 124 may be a document rendering program that implements a version of the portable document format (PDF) file format. Reference document 125 is an electronic document file comprising text and images that depict and describe the features and operation of program 122. Reference document 125 includes various distinctly identifiable sections, referred to alternately as reference elements, which are individually identifiable in accordance with the document format. During operation/execution of program 122, user interface inputs may be received by reference application 124 to display reference document 125 which may, for example, be referenced by a user input via a UI to facilitate interactive operation of program 122.
Processor 116 and main memory 118 provide a storage and execution platform for operation/activity information collection code that may be part of or supplementary to the program code of application programs 122 and 124. The collection code includes an application agent 126 and a reference document agent 128. Application agent 126 is configured using any combination of program code to collect operation metrics associated with the execution of program 122. The particular types/categories of operation metrics collected by application agent 126 are determined in accordance with a collection profile received by application agent 126 from a management system such as a performance monitor system 110. For example, performance monitor system 110 may generate a collection profile message that specifies multiple program components/elements, such as a particular UI, for and/or from which operation metrics are to be collected. Application agent 126 comprises program instructions for detecting operational conditions and events as categories of operation data that may be recorded as events or quantified in terms of specified operational metrics values.
As shown, application agent 126 generates multiple program operation records 127, each corresponding to a respective training or test cycle. During a training or a test cycle, application agent 126 collects a set of operation metrics for each of multiple program elements within program 122. The set of operation metrics (i.e., combination of particular types of metrics) and the program elements are determined based on a collection profile that may be individually specified and modified for each training or test cycle. As depicted, program operation records each comprise multiple row-wise program element records corresponding to program elements PE1, PE2, PE3, etc. Each program element record associates a program element ID code (e.g., “PE2”) with a combination of operation metrics (e.g., OM1=2, OM2=5.5).
Application agent 126 further comprises program code that interacts with UI code of program 122 during a training cycle to generate program element classification records including a program element classification record 130. As part of a training cycle, which may coincide with a test cycle, the UI program components of program 122 generate a UI object to which inputs corresponding to program element classifications are received and detected by application agent 126. For example, the UI object may include multiple input selection objects such as menu selection boxes each corresponding to a respective displayed program element ID. A user enters classifiers, such as text-based menu selections POSITIVE, NEGATIVE, or NEUTRAL, into the input selection objects and the results are recorded such as within program element classification record 130. Application agent 126 generates classification record 130 to include multiple row-wise program element records that associate a program element ID code (e.g., “PE3”) with a training cycle ID (e.g., “TEST2”), and a performance classification (e.g., “NEUTRAL”).
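The row-wise record layouts described above might be sketched as follows; the field names and example values are merely illustrative of the structure, not a definitive implementation:

```python
from dataclasses import dataclass

@dataclass
class ProgramElementRecord:
    """One row of a program operation record (e.g., within records 127)."""
    element_id: str          # e.g., "PE2"
    operation_metrics: dict  # e.g., {"OM1": 2, "OM2": 5.5}

@dataclass
class ClassificationRecord:
    """One row of a program element classification record (e.g., record 130)."""
    element_id: str          # e.g., "PE3"
    cycle_id: str            # e.g., "TEST2"
    classification: str      # "POSITIVE", "NEGATIVE", or "NEUTRAL"

op_row = ProgramElementRecord("PE2", {"OM1": 2, "OM2": 5.5})
class_row = ClassificationRecord("PE3", "TEST2", "NEUTRAL")
```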
In response to detecting or otherwise collecting the operation metrics and program element classifications, application agent 126 sends the resultant records via network 106 to a training data generator 108 that includes, in part, performance monitor system 110. Training data generator 108 further includes a collection server 114 that is configured, using any combination of hardware and software components, to collect and organize data for each of the program elements based on the collection profiles specified by performance monitor system 110. Application agent 126 is configured to send program operation records such as program operation records 127 and classification records such as classification record 130 to training data generator 108 and particularly to collection server 114.
To communicate the training and/or test data to collection server 114, client device 104 may operate as an initiator device, initiating an update transaction with an update request. Alternatively, collection server 114 may request the training and/or test data updates via a centralized hub (not depicted) to which client device 104 is a subscriber. In either case, collection server 114 includes a training record generator 133 that processes the received updates from monitoring agents such as monitoring agents 126 and 128, and stores the data within a storage system 134. In the depicted embodiment, the data stored within storage system 134 is logically organized at a file system or higher level by a database 136. Database 136 may be a relational database, such as an SQL database, or may be an object-based database such as a Cassandra database. In the depicted embodiment, training record generator 133 stores records that associate data with respective application IDs, such as IDs of applications 122 and 124, from which the data was collected via agents 126 and 128. To further support logically structured and secure access to the records within database 136, training record generator 133 is further configured to collect and record additional client-related information from clients such as client 104. For example, the records within database 136 are associated with tenant keys T1 and T2, each of which is associated with a number of application records, such as APP1.1-APP1.5.
Training record generator 133 is further configured to generate labelled training data sets 138 from the program and application reference information stored within database 136. More specifically, training record generator 133 processes program operation records 127, document activity records 132, and program element classification records 130 received from client device 104 to generate training records having supervisor values in the form of performance classifiers. As depicted and described in further detail with reference to
The training records within training data sets 138 are provided to a management client 140 to generate usability performance classification modules. Management client 140 includes a plugin generator 142 that receives and processes training records generated by training data generator 108 to generate performance classification plugins that include patterns recognition code. As depicted and described in further detail with reference to
DBMS 204 includes several program elements including a request handler 212 (PE1) and a catalog manager 214 (PE4). Request handler 212 comprises any combination of program code and data for processing query requests from a database client to retrieve requested portions of database data content. In support of these functions, request handler 212 includes a query optimization UI 211 (PE2) and a request compiler 213 (PE3). Catalog manager 214 comprises any combination of program code and data for generating and modifying the database catalog that stores database schema object definitions. In support of these functions, catalog manager 214 includes a query input menu 215 (PE5), a database catalog UI 217 (PE6), and an object tree generator 219 (PE7).
Incorporated or otherwise communicatively associated with DBMS 204 is an application agent 216 that, similar to agent 126 in
As shown in
Application agent 216 is further configured to generate performance classification records for the program elements such as performance classification record 232. As shown in
Also during the test operation cycle TEST1, a reference agent 224 within or otherwise communicatively coupled with reference application 206 detects and records interaction based events associated with the reference document comprising RE1 218, RE2 220, and RE3 222. For example, reference agent 224 detects interaction based events such as page and object selections in association with the reference elements to generate a document activity record 234 during TEST1. As shown in
Test information collected over TEST1, including program operation record 228, classification records 232, and document activity record 234, is received and processed by training record generator 226 in conjunction with index record 230 to generate cross-domain training records. Index record 230 may be generated by training record generator 226 or external to training record generator 226, such as by application agent 216. As depicted in
Training record 270 is “cross-domain” because it combines program element operation information patterns (i.e., combinations of multiple different types of operation metrics) with reference element input activity patterns. Each row-wise record further associates the program element ID and a combination of program element operation metrics with a combination of UI activity metrics for the reference elements that are associated with the program elements. The reference element UI activity information is collected from document activity record 234 in combination with association record 230. For example, the fourth row-wise record of training record 270 associates PE2 with the corresponding metrics 0.45 and 1.65 and also with six activity metric fields RE1AM1, RE1AM2, RE2AM1, RE2AM2, RE3AM1, RE3AM2.
As indicated by the field labels, the activity metric fields record values corresponding to one of the metric types (AM1 or AM2) and also corresponding to the reference elements associated with the program element ID. For instance, the value in each of the metric fields for the fourth row-wise record corresponds to the cumulative total for the reference elements RE1.1, RE1.3, RE1.4, RE2.2, RE2.3, RE3.1, and RE3.2 associated by association table 230 with PE4. Training record generator 226 further associates with each record, by inclusion within a record field or otherwise, a respective usability performance classification based on the performance classifiers recorded in classification records 232 during TEST1. The resulting row-wise program element records within cross-domain training record 270 provide multiple supervised training inputs, each including a multivariate vector comprising the reference element activity metric fields and the program element operation metric fields, with the classifier serving as the supervising value for each record.
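Assuming (for illustration only) that reference element IDs encode their top-level element as a dotted prefix (e.g., “RE1.3” belongs to “RE1”), the construction of one cumulative cross-domain row might be sketched as:

```python
def build_cross_domain_row(pe_id, operation_metrics, index, activity, classification):
    """Build one hypothetical cross-domain training row: a program element's
    operation metrics joined with cumulative activity metrics of its associated
    reference elements (keyed like "RE1AM1"), plus the supervisor label."""
    row = {"element_id": pe_id, **operation_metrics}
    for re_id in index.get(pe_id, []):        # e.g., "RE1.3"
        family = re_id.split(".")[0]          # top-level element, e.g., "RE1"
        for metric_type, value in activity.get(re_id, {}).items():
            key = family + metric_type        # e.g., "RE1AM1"
            row[key] = row.get(key, 0.0) + value  # cumulative total per field
    row["classification"] = classification    # supervisor value
    return row
```

The per-family accumulation reflects the cumulative totals described above, in which each activity metric field sums the contributions of all associated sub-elements of a top-level reference element.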
With reference to
The system depicted in
The cross-domain records generated by cross-domain synthesizer 248 are received and processed by a performance classifier 250 that comprises, at least in part, one or more of the performance classification plugins 240 called or otherwise retrieved from plugin generator 236. When executed, performance classifier 250 generates a multidimensional feature space that was determined by classification trainer 238 during the training phase. A conceptual representation of an example k-NN map feature space is illustrated in
To implement k-NN pattern classification, performance classifier 250 determines a position of an input point 280 within feature space 274. Input point 280 represents the combination of program element operation metrics and reference element activity metrics contained within a given input cross-domain record received by performance classifier 250 from cross-domain synthesizer 248. For k-NN pattern classification, the relative spacing between and among the training points and input point 280 may be computed as Euclidean distances. In this manner, performance classifier 250 computes a relative positioning of input point 280 among the training points which includes, at least in part, determining a Euclidean distance between the multivariate metric data represented by input point 280 and the multivariate metric data represented by each of the training points.
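A minimal sketch of such k-NN classification, assuming Euclidean distance and a majority vote among the k nearest training points (the vectors and labels below are illustrative, not taken from the disclosure):

```python
import math
from collections import Counter

def knn_classify(input_point, training_points, k):
    """Classify an input metric vector by majority vote among its k nearest
    training points, using Euclidean distance.

    training_points: list of (vector, label) pairs, where each vector is a
    combination of operation and reference activity metrics.
    """
    # Sort all training points by Euclidean distance from the input point.
    distances = sorted(
        (math.dist(input_point, vec), label) for vec, label in training_points
    )
    # Tally the labels of the k nearest neighbors and return the plurality.
    votes = Counter(label for _, label in distances[:k])
    return votes.most_common(1)[0][0]
```

For example, an input point surrounded mostly by POSITIVE training points would be classified POSITIVE for odd k, mirroring the partitioning of the feature space around input point 280.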
To further implement k-NN pattern classification, performance classifier 250 partitions the feature space 274 into which the training points are mapped with respect to both the position of input point 280 and an input integer value for k. The partitions are represented in
A next training cycle begins as shown at block 304 with a program agent and components of a training data generator collecting operation metrics for program elements based on a collection profile (block 306). Training data generator 108 may comprise performance monitoring elements as well as local client monitor elements including a program agent such as application agent 126. At block 308, the program agent and training data generator generate program operation records such as those depicted in
The process continues as shown at superblock 314 with the program agent generating program element classification records such as records 232 depicted in
When all program element classification records are generated, control passes to block 322 with the training record generator generating cross-domain training records such as record 270 in
A next operation test cycle begins as shown at block 408. The test cycle may be requested by a client node that includes a performance test system such as performance test system 242 in
At block 416, a reference agent detects and records interaction based events to one or more of the associated reference elements. In some embodiments, the reference agent generates a document activity record which, similar to the corresponding program operation record for the same cycle, is associated with the current test cycle (block 418). The document activity record includes reference element records that each associate a reference element ID with a set of reference activity metric values corresponding to a combination of different reference activity metric types. The cross-domain activity pattern generation cycle concludes at block 420 with a cross-domain synthesizer generating cross-domain input records that each associate a program element ID with the operation metrics and reference activity metrics that were recorded for the program element and the reference elements associated with the program element.
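One hypothetical way a reference agent could accumulate such per-element activity metrics during a cycle (the metric type names AM1/AM2 and event semantics below are illustrative assumptions):

```python
from collections import defaultdict

# Hypothetical per-cycle accumulator for a document activity record: for each
# reference element ID, a cumulative value per activity metric type.
activity_record = defaultdict(lambda: defaultdict(float))

def record_interaction_event(reference_element_id, metric_type, value=1.0):
    """Accumulate one interaction based event against a reference element."""
    activity_record[reference_element_id][metric_type] += value

record_interaction_event("RE1.1", "AM1")        # e.g., a page/object selection
record_interaction_event("RE1.1", "AM1")        # a second selection
record_interaction_event("RE1.1", "AM2", 12.5)  # e.g., seconds of dwell time
```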
The cross-domain input records form patterns that are received, detected, and processed by a usability performance classifier that is selected based on the collection profile for the current test cycle (block 422). The usability performance classifier is executed and processes the input records to determine and record individual performance classifications for each of the program elements corresponding to the program element IDs. Control passes from block 426 back to block 408 if additional usability tests are scheduled.
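A minimal, hypothetical sketch of applying a trained usability performance classifier (represented here simply as any callable over a metric vector) to cross-domain input records; field names and the toy classifier are illustrative only:

```python
def classify_program_elements(input_records, classifier, feature_fields):
    """Apply a usability performance classifier to unlabeled cross-domain
    input records, yielding one classification per program element ID.
    Metric fields absent from a record default to 0.0."""
    results = {}
    for record in input_records:
        vector = tuple(record.get(f, 0.0) for f in feature_fields)
        results[record["element_id"]] = classifier(vector)
    return results
```

In a deployed system the callable could be a plugin such as one generated by plugin generator 236, but here it is left abstract.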
Variations
The flowcharts are provided to aid in understanding the illustrations and are not to be used to limit scope of the claims. The flowcharts depict example operations that can vary within the scope of the claims. Additional operations may be performed; fewer operations may be performed; the operations may be performed in parallel; and the operations may be performed in a different order. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by program code. The program code may be provided to a processor of a general purpose computer, special purpose computer, or other programmable machine or apparatus.
As will be appreciated, aspects of the disclosure may be embodied as a system, method or program code/instructions stored in one or more machine-readable media. Accordingly, aspects may take the form of hardware, software (including firmware, resident software, micro-code, etc.), or a combination of software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” The functionality provided as individual modules/units in the example illustrations can be organized differently in accordance with any one of platform (operating system and/or hardware), application ecosystem, interfaces, programmer preferences, programming language, administrator preferences, etc.
Any combination of one or more machine readable medium(s) may be utilized. The machine readable medium may be a machine readable signal medium or a machine readable storage medium. A machine readable storage medium may be, for example, but not limited to, a system, apparatus, or device, that employs any one of or combination of electronic, magnetic, optical, electromagnetic, infrared, or semiconductor technology to store program code. More specific examples (a non-exhaustive list) of the machine readable storage medium would include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a machine readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. A machine readable storage medium is not a machine readable signal medium.
A machine readable signal medium may include a propagated data signal with machine readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A machine readable signal medium may be any machine readable medium that is not a machine readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a machine readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as the Java® programming language, C++ or the like; a dynamic programming language such as Python; a scripting language such as Perl programming language or PowerShell script language; and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on a stand-alone machine, may execute in a distributed manner across multiple machines, and may execute on one machine while providing results and/or accepting input on another machine.
The program code/instructions may also be stored in a machine readable medium that can direct a machine to function in a particular manner, such that the instructions stored in the machine readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
Any one of the previously described functionalities may be partially (or entirely) implemented in hardware and/or on the processor unit 501. For example, the functionality may be implemented with an application specific integrated circuit, in logic implemented in the processor unit 501, in a co-processor on a peripheral device or card, etc. Further, realizations may include fewer or additional components not illustrated in
While the aspects of the disclosure are described with reference to various implementations and exploitations, it will be understood that these aspects are illustrative and that the scope of the claims is not limited to them. In general, techniques for implementing data collection workflow extensions as described herein may be implemented with facilities consistent with any hardware system or hardware systems. Many variations, modifications, additions, and improvements are possible.
Plural instances may be provided for components, operations or structures described herein as a single instance. Finally, boundaries between various components, operations and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of the disclosure. In general, structures and functionality shown as separate components in the example configurations may be implemented as a combined structure or component. Similarly, structures and functionality shown as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements may fall within the scope of the disclosure.
As used herein, the term “or” is inclusive unless otherwise explicitly noted. Thus, the phrase “at least one of A, B, or C” is satisfied by any element from the set {A, B, C} or any combination thereof, including multiples of any element.