Applications may be developed for a wide range of computerized devices including individual computers, networked computer systems and mobile phones. Within each such context, applications may be developed for an even wider range of different uses. During development, an application or program may be tested, perhaps repeatedly, to ensure proper execution, identify and eliminate bugs and optimize usability. Developers may learn how to improve the application under development from these tests.
The accompanying drawings illustrate various implementations of the principles described herein and are a part of the specification. The illustrated implementations are merely examples and do not limit the scope of the claims.
Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
As indicated above, applications may be developed for many devices, in different contexts, to serve any number of uses. During development, applications may be tested, perhaps repeatedly, to ensure proper execution, identify and eliminate bugs and optimize usability. Developers may document and may compare different tests to learn how to improve the application.
One way of documenting an application test execution may be to make a video recording of the output of the application throughout the test. This may be done by recording the images on the screen or display device of a host system on which the application is being tested. Such a video recording may capture any actions that occur during the test that are echoed on, or output by the application to, the visual display. This will generally include user input received from a user input device, such as a keyboard or mouse, and the application's response to that user input.
While this video recording may document the test execution of the application, the video recording itself is unstructured data. Thus, a developer may need to watch the entire video recording to understand the application flow that occurred during the test execution. This may be cumbersome if the developer wants to more quickly understand the application flow or focus on a particular aspect of the test execution.
Additionally, the amount of data in the video recording can be significant. If multiple tests are executed to try various scenarios or compare executions under a single scenario, the volume of video data recorded may become cumbersome to store and manage.
To address these issues, implementations consistent with disclosed examples describe generating application flow entities. In some implementations, an application flow entity may represent each test execution of the application and, as will be described in more detail below, may allow a developer to more quickly and easily document and understand the application flow that occurred during a corresponding test execution of the application.
In the following description, for purposes of explanation, specific details are set forth in order to provide a thorough understanding of the disclosure. It will be apparent, however, to one skilled in the art that examples consistent with the present disclosure may be practiced without these specific details. Reference in the specification to “an implementation,” “an example” or similar language means that a particular feature, structure, or characteristic described in connection with the implementation or example is included in at least that one implementation or example, but not necessarily in others. The various instances of the phrase “in one implementation” or similar phrases in various places in the specification are not necessarily all referring to the same implementation.
As used herein and in the claims, the term test execution of an application refers to a test in which an application under test is executed on a host system, which could be any device capable of executing applications. The execution may include actions taken manually by a user, for example, as inputs to the application under test. The output from the application under test may be displayed on a screen or display device of the host system on which the test is being executed.
As used herein and in the claims, an application flow entity may be a collection of data that represents or documents a particular test execution of an application. However, the application flow entity is a smaller set of data than a full video record of the test execution so as to facilitate storage and analysis. In some implementations, the collection of data may be generated after the test execution of the application, and may include a portion of the data gathered during the test execution. For example, an application flow entity may include a number of image frames output during a test execution of an application and/or other information about a particular test execution of the application. For example, an application flow entity may include image frames identified as being significant screens in an application flow. As will be described below, in some implementations the application flow entity may have one or two tiers. For example, a one-tier application flow entity may include image frame(s) identified as being significant screens in the application flow. As another example, a two-tier application flow entity may include a second tier in which these significant screens have been grouped according to a corresponding application screen.
As used herein and in the claims, a “significant” frame may be a frame indicating an action occurring during the test execution of the application. The action may be an action of user input or responsive to user input, or may be an action occurring from operation of the application independent of direct user input. An image frame showing such an action may also be referred to as a “significant screen.”
As used herein and in the claims, an application screen may be a screen output by the application under test. Various different actions may be taken on a single application screen before that screen is changed by the application. A change in application screen may occur, for example, when the application moves from one phase of operation to another and/or changes the graphical user interface presented to the user.
In an example of the principles disclosed herein, a method may include accessing a series of image frames from a test execution of an application; comparing, using an application test analysis device comprising a digital image processor, the image frames to each other to identify a subset of the image frames, the subset of image frames being identified based on actions occurring during the test execution; and automatically generating, using the application testing system, an application flow entity that represents the test execution, the application flow entity being generated based on the subset of image frames. A similar example will now be discussed further with respect to
This series of image frames can be generated in a number of ways. For example, a video camera can be used to record video of the display device of the host system during the test execution. In this case, the video feed used for subsequent analysis may be compressed by taking only one image frame every 50 milliseconds, or at some other period, and discarding intervening frames.
In another example, the host system may capture a screenshot of the output on its display device periodically. This might be every 50 milliseconds or some other interval. Additionally, this operation may include tuning the interval between the taking of screenshots based on application type or user behavior. For example, the interval between image frames can be tuned depending on the type of application under test or depending on the level of user activity solicited by the application. During periods of relatively heavy user input or application output, image frames may be selected more frequently than at other periods during the test execution.
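For illustration only, the following is a minimal sketch of periodic screenshot capture on a host system, assuming Pillow's ImageGrab module is available on the host platform; the function name, the 50 millisecond default interval, and the capture duration are illustrative assumptions rather than part of this disclosure.

```python
import time
from PIL import ImageGrab  # Pillow; screen-capture support varies by platform

def capture_frames(duration_s, interval_s=0.05):
    """Grab a screenshot every interval_s seconds for duration_s seconds."""
    frames = []
    end = time.time() + duration_s
    while time.time() < end:
        frames.append(ImageGrab.grab())  # full-screen screenshot as a PIL image
        time.sleep(interval_s)           # tunable capture interval (50 ms by default)
    return frames
```

The interval argument could be adjusted per application type or raised and lowered during the test as user activity changes, consistent with the tuning described above.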
In some of these examples, a series of image frames may be produced from the test execution of the application. The method may include accessing (100) this series of image frames for analysis. This is explained in further detail below in
The images are compared (102). For example, each image is compared to the immediately preceding image in the series. This comparison is a “visual” comparison of the appearance of one frame as against a subsequent frame. Though this comparison is referred to as visual, it is performed electronically by comparing the digital image data for one frame against that of another. “Visual” comparison in this context is not meant to imply that a human user manually compares the appearance of two frames.
Through this comparison, a subset of image frames, i.e., fewer than all of the image frames in the series, is selected. This subset of image frames may be the frames in which an action was occurring in the output of the application as shown on the display device of the host system.
During the test execution of the application, there will be actions that occur that are reflected in the visual output, such as a user entering input, the application responding, the application producing output, etc. In between these events, there will be times at which no action is occurring. For example, the application may be waiting for user input or may be processing data without any output or change being made to the visual display. The subset of image frames selected will correspond to the points in time at which actions are occurring as reflected on the display device showing the visual output of the application, while image frames from the times between or apart from these moments of action will not be selected for the subset. The image frames documenting the actions that occurred during the test execution may be referred to as being “significant,” whereas image frames in which no action was occurring on the display device are not significant for purposes of understanding or documenting the test execution of the application. Examples of how the significant image frames may be identified are described below.
This subset of image frames may be used to generate (103) an application flow entity. In some implementations, an application flow entity may be an electronic collection of data that may include only the selected image frames from the test execution of the application. The application flow entity may include other data from the test execution of the application, such as a record of the user input entered, an identification of the host system and hardware and operating system environments, and other information characterizing that particular test execution.
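By way of a hedged illustration, an application flow entity of this kind could be represented with a simple data structure such as the following; the class and field names are hypothetical and are not part of this disclosure.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class ApplicationFlowEntity:
    significant_frames: List[Any]                             # first tier: frames showing actions
    screen_frames: List[Any] = field(default_factory=list)    # optional second tier: one frame per application screen
    user_input_log: List[str] = field(default_factory=list)   # optional record of user input entered
    host_info: Dict[str, str] = field(default_factory=dict)   # host system, hardware and OS environment details
```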
The method of
This may be done, for example, by quantifying a difference between the digital data for a first frame as compared to that of a second frame. For example, each pixel in a frame has numeric data that define the appearance, for example, the color, of the pixel. The pixel data for each frame can be evaluated to determine and to quantify how much that pixel data has changed between frames. This also quantifies a visual difference between the frames if presented on a display device. If there is no difference in the pixel data from frame to frame, the frames will appear identical when displayed. If something has changed in the image to be displayed, that change will be reflected in the pixel data. If this difference exceeds a threshold value (203), then the second or “changed” frame is designated as a “significant” frame and added (204) to the subset. Additionally or alternatively, a frame could be selected as “significant” based on an amount of user input, such as mouse moves or clicks, associated with that frame. When the last frame has been evaluated for significance (205), the process may advance.
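As one non-limiting sketch of the threshold comparison described above, the following assumes the frames are images of equal size and quantifies change as a mean per-pixel difference; the threshold value is illustrative only.

```python
import numpy as np

def select_significant_frames(frames, threshold=2.0):
    """Keep a frame when its mean per-pixel change from the preceding frame exceeds the threshold."""
    significant = []
    prev = np.asarray(frames[0], dtype=np.int16)
    for frame in frames[1:]:
        cur = np.asarray(frame, dtype=np.int16)
        change = np.abs(cur - prev).mean()   # quantify the visual difference between frames
        if change > threshold:
            significant.append(frame)        # designate the changed frame as significant
        prev = cur
    return significant
```

A similar selection could additionally weigh the amount of user input associated with each frame, as noted above.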
The method may include determining which frames in the subset of image frames correspond to a same application screen from the application; and grouping together the image frames of the subset that correspond to a same application screen. With reference to
Thus, in
Each group of frames corresponding to a single application screen may be represented subsequently by the last-in-time frame from that group. The output is a second subset of image frames, each representing a group of frames from a common application screen in the subset of significant frames.
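A hedged sketch of this grouping step follows; it assumes that consecutive significant frames differing by less than an illustrative threshold belong to the same application screen, and it keeps the last-in-time frame of each group as the representative frame.

```python
import numpy as np

def group_by_application_screen(significant_frames, threshold=10.0):
    """Group consecutive significant frames that look alike; return the last frame of each group."""
    groups = []
    for frame in significant_frames:
        cur = np.asarray(frame, dtype=np.int16)
        if groups:
            last = np.asarray(groups[-1][-1], dtype=np.int16)
            if np.abs(cur - last).mean() < threshold:
                groups[-1].append(frame)     # same application screen as the previous frame
                continue
        groups.append([frame])               # a new application screen begins here
    return [group[-1] for group in groups]   # last-in-time representative per application screen
```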
Using the subset of significant frames, an application flow entity is generated (210). This application flow entity may include both the subset of all significant frames and the smaller subset of significant frames each representing a group of frames from a common application screen. Alternatively, the application flow entity may include only the smaller subset of image frames. The smaller the application flow entity is, the more readily it can be stored and used in a subsequent analysis that compares different test executions of the application.
In summary of
The subset of frames may be identified by comparing each frame to a previous frame in the series. This may be done by determining a degree of change between compared images from the series of image frames; and when the degree of change exceeds a threshold, adding a corresponding frame to the subset of image frames.
The method may further include determining which frames in the subset of image frames correspond to a same application screen from the application; and grouping together the image frames of the subset that correspond to a same application screen. This is done by comparing frames of the subset to each other to determine a difference between each pair of frames; and, when the difference between a pair of frames is below a threshold, assigning that pair of frames as corresponding to a same application screen. To further reduce the size of the data, the method may include representing an application screen with a last significant frame in the series that corresponds to that application screen.
As shown in
These frames are grouped according to which show or correspond to the same underlying application screen. From each such group, a representative image frame may be taken to form a second subset (302). For example, the last-in-time frame from each group may be taken as the representative frame for that group to be included in the second subset (302).
An application flow entity (305) is generated based on the subsets of frames. The application flow entity (305) may be generated in several different ways depending on the needs and preferences of the application developer. For example, the application flow entity (305) may include both the first and second subsets of frames (301 and 302), with or without other information, described herein, about the test execution of the application. In another example, an application flow entity (305-2) may include only the second subset of frames (302), with or without other information about the corresponding test execution of the application. This application flow entity (305-2) would have the smallest size and place the smallest demands on storage and processing resources. Lastly, the application flow entity (305-1) may include only the first subset of significant frames, with or without other information about the corresponding test execution of the application.
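For illustration, these variants could be assembled as follows, reusing the hypothetical ApplicationFlowEntity structure and the frame-selection and grouping sketches given earlier; this is an assumption about one possible implementation, not the only one.

```python
def build_flow_entity(frames, include_first_tier=True, include_second_tier=True):
    """Assemble a flow entity from a raw frame series, using the sketches given earlier."""
    first_subset = select_significant_frames(frames)
    second_subset = group_by_application_screen(first_subset) if include_second_tier else []
    return ApplicationFlowEntity(
        significant_frames=first_subset if include_first_tier else [],
        screen_frames=second_subset,
    )

# entity (305):   build_flow_entity(frames)                              # both subsets
# entity (305-1): build_flow_entity(frames, include_second_tier=False)   # first subset only
# entity (305-2): build_flow_entity(frames, include_first_tier=False)    # second subset only, smallest
```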
In
In a succeeding image frame, shown in
If the image frame of
Next, in the image frame of
Referring again to the example of
As described herein, the digital image processor is further to: determine which frames in the subset of image frames correspond to a same application screen from the application being tested; and group the image frames of the subset according to corresponding application screen. The application flow entity may include one representative image frame from each group of image frames.
The test of the application is recorded visually, as described above. In some examples, this may be done using a camera (717) which records video of the display device (713) of the host system (711) throughout the test. Alternatively, the host system (711) may include a screenshot grabber (712) that periodically outputs a screenshot of the output on the display device (713) of the host system. The screenshot grabber (712) may be a software component running on the host device (711), for example, a browser plug-in or client application.
As described above, a series of image frames from the application test execution, either from the camera (717) or the screenshot grabber (712), is available to an application test analysis device that generates an application flow entity as described herein. In the illustrated example, the host system (711) is a separate device from the application test analysis device (700) and provides the series of image frames to the application test analysis device (700) via a network (705). This network (705) could be any computer network, for example, the internet or a local area network at the facility of an application developer. However, the application test analysis device could alternatively be incorporated into the host system so as to be part of the system on which the test execution is performed.
Examples of the application test analysis device (700) may include a combination of hardware and programming. In some examples, however, the application test analysis device could be implemented entirely in hardware.
The interface (701) may be, for example, a network interface to allow the device (700) to access the image frame series from the host system (711). The image frame series may be transmitted from the host system (711) to the application test analysis device (700) directly. Alternatively, the image frame series may be archived at some network-accessible location from which the test analysis device (700) retrieves the image frame series using the interface (701).
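As a hedged example, the application test analysis device might retrieve an archived image frame series over its network interface as sketched below; the URL pattern, file naming, and frame count are hypothetical and are not part of this disclosure.

```python
import io
import requests
from PIL import Image

def fetch_frame_series(base_url, count):
    """Retrieve an archived series of image frames from a (hypothetical) network location."""
    frames = []
    for i in range(count):
        resp = requests.get(f"{base_url}/frame_{i:05d}.png", timeout=10)
        resp.raise_for_status()
        frames.append(Image.open(io.BytesIO(resp.content)))
    return frames
```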
The digital image processor (702) may be a dedicated graphics processing unit (GPU). Alternatively, the digital image processor (702) may be a general processor specifically programmed to perform various aspects of the methods described above.
The digital image processor (702) compares frames from the series of image frames output by the host system (711). This comparison may be, as described above, to determine a first subset of significant image frames from among the series of image frames from the host system (711). Additionally, the comparison may be, as described above, to group significant image frames according to a corresponding application screen.
The application test analysis device (700) outputs an application flow entity (703). Various different examples of the application flow entity are described above. The application flow entity (703) is much easier to store and use in analysis than would be the entire stream of image frames from the host system (711). The application flow entity (703) records a particular test execution of the application under test (710) and can be used to understand that test execution and compare that test execution to other test executions, including those using the same or a different test scenario or script.
In other examples, the host system could be a computer system owned and operated by a non-professional application tester operating remotely. This approach, called crowd testing, allows an application developer to pay anyone to test certain flows manually on, for example, a per-hour basis or a per-defect-found basis. In such examples, the application developer provides the crowd tester with a testing client or browser plug-in that will run on the tester's machine to provide the functionality of the host device (711) described herein. For example, the testing client or browser plug-in may provide the screenshot grabber described herein that returns screenshots of the application test execution to the application developer for analysis as described above.
In other examples, such a client or browser plug-in could be used on a machine after the application has actually been deployed in a real production environment. This may be desired when the actual production environment or specific use cases are too complex or expensive to reproduce in a lab test setting, yet further analysis and debugging of the application are still needed.
A non-transitory computer-readable medium may include, for example, a hard-drive, a solid-state drive, or any other device from which instructions can be read by a processor, including Random Access Memory and other forms of volatile memory. In some examples, the computer readable medium (905) may be the memory device (705) shown in
The preceding description has been presented only to illustrate and describe examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching.