This application claims priority under 35 U.S.C. §119 to Korean Patent Application No. 10-2013-0008824, filed on Jan. 25, 2013, the disclosure of which is hereby incorporated by reference in its entirety.
Exemplary embodiments of the present inventive concept relate to a test system, and more particularly, to a test system for evaluating a mobile device, and a driving method thereof.
As a result of the development and prevalence of mobile devices and the convergence of multimedia functions, the complexity of software on mobile devices is increasing. Accordingly, demand for mobile device evaluation tools and test automation is increasing. As the complexity of mobile devices increases, test times also increase. As test times increase, fatigue of quality assurance (QA) engineers caused by long tests may decrease the efficiency of problem detection.
Exemplary embodiments of the present inventive concept provide a test system for automatically evaluating a mobile device, and a driving method of the test system.
According to an exemplary embodiment of the present inventive concept, a test system includes a mobile device and a host for evaluating the mobile device. The host receives an image corresponding to the screen of the mobile device from the mobile device, displays the received image on the screen of the host, scans the screen of the host, and recognizes the image based on the results of the scanning.
In an exemplary embodiment, the host may send an event to the mobile device.
In an exemplary embodiment, the image may include a plurality of icons or a plurality of widgets.
In an exemplary embodiment, the host may compare a bitmap corresponding to the screen of the host to bitmaps respectively corresponding to the plurality of icons stored in the host, or to bitmaps respectively corresponding to the plurality of widgets stored in the host.
In an exemplary embodiment, the host may recognize the screen of the host based on the results of the comparison.
In an exemplary embodiment, the mobile device may be connected to the host through an Android Debug Bridge (ADB), and the ADB may utilize the Universal Serial Bus On-The-Go (USB OTG) specification.
In an exemplary embodiment, the host may store a test automation framework (TAF) for evaluating the mobile device, and the TAF may include a virtual screen module (VSM) configured to receive the image corresponding to the screen of the mobile device from the mobile device, and to display the received image on the screen of the host. The TAF may further include a screen scanning module (SSM) configured to scan the screen of the host, and a framework core module (FCM) configured to recognize the screen of the host based on the results of the scanning. The FCM may include a basic test module for evaluating the mobile device, and the basic test module may load a test case for evaluating the hardware or software of the mobile device, and drive the test case.
In an exemplary embodiment, the TAF may further include a user-developed test module using the basic test module.
In an exemplary embodiment, the TAF may further include a picture test module (PTM) configured to evaluate operation of displaying pictures using the basic test module, a camera test module (CTM) configured to evaluate the operation of a camera using the basic test module, and a power measurement module (PMM) configured to measure power consumption using the basic test module.
In an exemplary embodiment, the mobile device may be driven by the Android™ Operating System (OS), and the mobile device may be a smartphone, a tablet PC, or a digital camera.
In accordance with an exemplary embodiment of the present inventive concept, a driving method of a test system including a host for evaluating a mobile device includes displaying an image corresponding to the screen of the mobile device on the screen of the host, scanning the screen of the host, and recognizing the image based on the results of the scanning.
In an exemplary embodiment, the driving method may further include sending an event to the mobile device if no event is generated in the mobile device.
In an exemplary embodiment, sending the event may include executing the event by the mobile device.
In an exemplary embodiment, scanning the screen of the host may include comparing a bitmap corresponding to the image to bitmaps respectively corresponding to a plurality of icons stored in the host, or to bitmaps respectively corresponding to a plurality of widgets stored in the host.
In an exemplary embodiment, recognizing the image may include recognizing the locations of the plurality of icons or the plurality of widgets forming the image based on the results of the comparison.
According to an exemplary embodiment of the present inventive concept, a test system includes a mobile device including an evaluation application, and a host including a test automation framework (TAF). The host is configured to receive an image corresponding to a screen of the mobile device via the evaluation application and the TAF, display the received image on a screen of the host, scan the received image, and identify a portion of the received image based on a result of scanning the received image.
According to an exemplary embodiment of the present inventive concept, a driving method of a test system includes receiving an image corresponding to a screen of a mobile device at a host, displaying the received image on a screen of the host, scanning the received image displayed on the screen of the host, and identifying a portion of the received image based on a result of scanning the received image.
According to an exemplary embodiment of the present inventive concept, a test system includes a test automation framework (TAF) stored at a host. The TAF includes a virtual screen module (VSM) configured to receive an image corresponding to a screen of a mobile device from an evaluation application stored at the mobile device, and display the received image at the host, a screen scanning module (SSM) configured to scan the received image, and a framework core module (FCM) configured to identify a portion of the received image based on a result of scanning the received image.
The test system, according to exemplary embodiments, may automatically evaluate a mobile device.
Further, the driving method of the test system, according to exemplary embodiments, may provide a method for automatically evaluating a mobile device.
The above and other features of the present inventive concept will become more apparent by describing in detail exemplary embodiments thereof with reference to the accompanying drawings.
Exemplary embodiments of the present inventive concept will be described more fully hereinafter with reference to the accompanying drawings. Like reference numerals may refer to like elements throughout the accompanying drawings.
It will be understood that when a component is referred to as being “connected to” or “coupled to” another component, it can be directly connected or coupled to the other component, or intervening components may be present.
It should also be noted that in some alternative implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
As used herein, the terms “evaluating” and “testing” with regard to a mobile device may be used interchangeably.
Referring to the corresponding figure, a test system 100 includes a mobile device 10 and a host 20. The mobile device 10 is a device to be evaluated by the test system 100.
For example, when the hardware or software of the mobile device 10 changes, the hardware or software of the mobile device 10 may be evaluated. According to exemplary embodiments, the mobile device 10 may be, for example, a smartphone, a tablet PC, or a digital camera; however, the mobile device 10 is not limited thereto. The mobile device 10 will be described in further detail below.
The host 20 is a test apparatus that may be used to evaluate the mobile device 10. The host 20 includes a test automation framework (TAF) 30. The TAF 30 includes software that is used to evaluate the mobile device 10. According to exemplary embodiments, the host 20 may be, for example, a personal computer, a workstation, a server, a mainframe computer, or a supercomputer; however, the host 20 is not limited thereto.
The TAF 30 receives an image corresponding to the screen of the mobile device 10, displays the received image on the screen of the host 20, scans the displayed screen, and identifies the content of the image based on the result of the scanning.
The TAF 30 may automatically send an event to the mobile device 10. The event may mimic an actual operation performed on the mobile device 10. For example, the event may include an operation corresponding to touching or dragging a specific application icon, an operation corresponding to typing on the screen of the mobile device 10, an operation corresponding to swiping between different screens of the mobile device 10, an operation corresponding to navigating through various menus of the mobile device 10, etc. The TAF 30 will be described in further detail below.
In an exemplary embodiment, the test system 100 may include an Android Debug Bridge (ADB) 40 for connecting the mobile device 10 to the host 20. The ADB 40 is used to physically connect the mobile device 10 to the host 20. According to an exemplary embodiment, the ADB 40 may utilize the Universal Serial Bus On-The-Go (USB OTG) specification. The ADB 40 may be utilized in the test system 100 to evaluate a mobile device 10 that uses the Android™ operating system. However, the mobile device 10 is not limited to a device running the Android™ operating system. Accordingly, protocols other than the ADB 40 may be included in the test system 100 to allow for the connection of other mobile devices 10 (e.g., mobile devices 10 running operating systems other than Android™) to the host 20 for evaluation. Further, the mobile device 10 may be connected to the host 20 using the Joint Test Action Group (JTAG) specification.
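For context, a host typically reaches an Android device over ADB through the `adb` command-line tool. The following is a minimal Python sketch, offered as an illustration rather than as part of the disclosed system, that checks whether a device is attached before testing begins; it assumes the `adb` binary is on the host's PATH.

```python
import subprocess

def list_attached_devices():
    """Return the serial numbers of devices visible to the ADB server."""
    # 'adb devices' prints a header line followed by one line per device.
    out = subprocess.run(["adb", "devices"], capture_output=True,
                         text=True, check=True).stdout
    return [line.split("\t")[0]
            for line in out.splitlines()[1:]
            if line.strip().endswith("\tdevice")]

if __name__ == "__main__":
    print("Attached devices:", list_attached_devices() or "none")
```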
The TAF 30 may test the mobile device 10 to determine whether the mobile device 10 is capable of properly displaying images. For example, the TAF 30 may evaluate hundreds of pictures. While testing the mobile device 10, the TAF 30 may store a screenshot of each test stage.
The TAF 30 may execute a benchmark application on the mobile device 10, and read the resultant score. Further, while the benchmark application is driven, the TAF 30 may measure the power consumption of the mobile device 10. The TAF 30 may be used for daily regression testing and aging tests.
The TAF 30 may iteratively execute the same task a specified number of times. Accordingly, a test engineer can use the test system 100 to reduce the time spent on repetitive or iterative tasks.
Referring to the corresponding figure, the mobile device 10 includes hardware 11, an operating system (OS) 12, and an application 13.
The OS 12 is system software that manages the hardware 11 and provides common system services and a hardware abstraction platform for executing the application 13. According to exemplary embodiments, the OS 12 may be, for example, Windows™ (including Windows Phone), iOS™, Android™, or TIZEN™; however, the OS 12 is not limited thereto.
The TAF 30 may evaluate the respective interfaces between the hardware 11 and the OS 12, and between the OS 12 and the application 13.
Referring to the corresponding figure, the application 13 of the mobile device 10 may include an evaluation application, and the evaluation application may include a VSM 14 and an ERM 15. The VSM 14 may send an image corresponding to the screen of the mobile device 10 to the host 20.
If no event is generated at the mobile device 10, the TAF 30 may send an event to the mobile device 10, as described above. The ERM 15 may receive the event from the TAF 30, and execute the event on the mobile device 10.
Referring to the corresponding figures, the host 20 may store the TAF 30, and the TAF 30 may include a virtual screen module (VSM) 31, a screen scanning module (SSM) 32, a framework core module (FCM) 33, and an event sending module (ESM) 34.
The VSM 31 may receive an image corresponding to the screen of the mobile device 10 from the VSM 14 of the mobile device 10. The VSM 31 may display the received image on the screen of the host 20. That is, the VSM 31 may relay the screen image of the mobile device 10 to the screen of the host 20.
The screen image of the mobile device 10 may be, for example, a graphical user interface (GUI). For example, if the mobile device 10 uses the Android™ OS, the GUI may correspond to a home screen that includes icons or widgets corresponding to a plurality of applications capable of being executed by the Android™ OS.
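As a concrete illustration of how a host might obtain the device's screen image, the sketch below captures a screenshot over ADB using the standard `screencap` shell command and loads it with Pillow. This is an assumed, minimal stand-in for the VSM 14/VSM 31 pair, not the patent's implementation.

```python
import subprocess

from PIL import Image  # pip install Pillow

def capture_device_screen(out_path="screen.png"):
    """Capture the device's current screen and load it as a PIL image."""
    # Save a PNG on the device, then pull it to the host.
    subprocess.run(["adb", "shell", "screencap", "-p", "/sdcard/screen.png"],
                   check=True)
    subprocess.run(["adb", "pull", "/sdcard/screen.png", out_path],
                   check=True)
    return Image.open(out_path)

screen = capture_device_screen()
screen.show()  # display the relayed screen on the host
```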
The SSM 32 scans the screen image of the mobile device 10. For example, the SSM 32 may compare an image file (e.g., a bitmap) corresponding to the screen image of the mobile device 10 to other image files (e.g., other bitmaps) stored at the host 20 corresponding to the respective icons, or to other image files (e.g., bitmaps) stored at the host 20 corresponding to the respective widgets. The SSM 32 may send the results of the comparison to the FCM 33.
The FCM 33 may identify a portion of the screen image of the mobile device 10 based on the results of the comparison. For example, the FCM 33 may identify the respective locations of a plurality of icons or a plurality of widgets that form the screen (e.g., when the screen corresponds to a GUI) of the mobile device 10. Accordingly, the TAF 30 may evaluate any one of a plurality of icons or a plurality of widgets. Further, the FCM 33 may determine the event that will be generated on the screen of the mobile device 10 through the SSM 32. Although the present example describes a screen image of the mobile device 10 corresponding to the GUI of the mobile device 10, including icons and/or widgets present in the GUI, exemplary embodiments of the present inventive concept are not limited thereto. For example, exemplary embodiments may be used to perform evaluation of other areas of the mobile device 10 such as, for example, within different settings screens of the mobile device 10, within specific applications installed on the mobile device 10, etc.
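One common way to implement such a bitmap comparison is template matching. The sketch below is a hypothetical stand-in for the SSM 32/FCM 33 behavior, using OpenCV's `matchTemplate` to locate a stored icon bitmap (`icon.png`, an assumed file name) within the captured screen and report its position.

```python
import cv2  # pip install opencv-python

def locate_icon(screen_path, icon_path, threshold=0.9):
    """Return the (x, y) top-left corner of the icon if found, else None."""
    screen = cv2.imread(screen_path)
    icon = cv2.imread(icon_path)
    # Normalized cross-correlation: 1.0 would be a pixel-perfect match.
    result = cv2.matchTemplate(screen, icon, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc if max_val >= threshold else None

print("icon found at", locate_icon("screen.png", "icon.png"))
```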
Further, if the screen of the mobile device 10 does not need to be scanned, the FCM 33 may request the ESM 34 to send an event to the mobile device 10. Accordingly, the ERM 15 of the mobile device 10 may receive an event sent from the ESM 34 of the host 20, and in response, the ERM 15 may then provide an actual effect corresponding to the event to the screen of the mobile device 10.
The FCM 33 may determine the event that will be generated next on the mobile device 10. For example, the FCM 33 may send an event to the mobile device 10 corresponding to touching the screen of the mobile device 10, typing letters on the mobile device 10, etc. The FCM 33 can determine the proper event to generate next because it is capable of identifying the current screen of the mobile device 10.
Accordingly, the TAF 30 can dynamically process the screens of the mobile device 10 without having to estimate a fixed delay between screen changes. That is, because the FCM 33 recognizes the content of the screen of the mobile device 10, the host 20 is able to interact with the mobile device 10 to perform evaluation of the mobile device 10 without user interaction.
As described above, if no event is generated by the mobile device 10, the ESM 34 may send an event to the mobile device 10. For example, if the FCM 33 determines that an event corresponding to touching an arbitrary location on the screen of the mobile device 10 should be sent to the mobile device 10 for evaluation purposes, the FCM 33 may send the event to the mobile device 10 through the ESM 34.
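Event injection of this kind is commonly performed with the `adb shell input` command. The sketch below is an illustrative stand-in for the ESM 34, assuming the tap, swipe, and text subcommands of the stock Android `input` tool; the example coordinates are hypothetical.

```python
import subprocess

def adb_shell(*args):
    subprocess.run(["adb", "shell", *args], check=True)

def tap(x, y):
    """Send a touch event at screen coordinates (x, y)."""
    adb_shell("input", "tap", str(x), str(y))

def swipe(x1, y1, x2, y2, ms=300):
    """Send a drag/swipe gesture lasting `ms` milliseconds."""
    adb_shell("input", "swipe", str(x1), str(y1), str(x2), str(y2), str(ms))

def type_text(text):
    """Type text into the focused field (spaces are escaped as %s)."""
    adb_shell("input", "text", text.replace(" ", "%s"))

tap(540, 960)  # e.g., touch the center of a 1080x1920 screen
```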
The FCM 33 may include basic test modules for evaluating the mobile device 10. Some of these basic test modules included in the FCM 33 will be described below.
Referring to the corresponding figures, a driving method of the test system 100 is illustrated. At block S01, the VSM 31 of the host 20 receives an image corresponding to the screen of the mobile device 10 from the VSM 14 of the mobile device 10, and displays the received image on the screen of the host 20.
At block S02, the SSM 32 scans the image sent to the host 20 from the mobile device 10. For example, the SSM 32 may compare an image file (e.g., a bitmap) corresponding to the screen image of the mobile device 10 to other image files (e.g., bitmaps) stored at the host 20 respectively corresponding to a plurality of icons, widgets, etc. The SSM 32 may then send the results of the comparison to the FCM 33.
At block S03, the FCM 33 may identify the screen image of the mobile device 10 based on the results of the comparison. For example, the FCM 33 may identify the locations of a plurality of icons and/or a plurality of widgets, and may determine that the screen image of the mobile device 10 corresponds to a GUI (e.g., a home screen) of the mobile device 10.
Further, at block S03, the FCM 33 may determine the next event to be generated. For example, the FCM 33 may send an event corresponding to touching or dragging a specific application icon, an event corresponding to typing on the screen of the mobile device 10, etc., to the mobile device 10.
At block S04, the FCM 33 determines whether to request transmission of the event to the mobile device 10. If it is determined that the event is to be transmitted, the event is sent at block S05. If it is determined that the event is not to be transmitted, the method proceeds to block S06.
For example, if no event is generated at the mobile device 10, or if the mobile device 10 is in an idle state, the FCM 33 may request that the ESM 34 send the event to the mobile device 10.
At block S05, the ESM 34 sends the event to the ERM 15 of the mobile device 10.
At block S06, the ERM 15 of the mobile device 10 receives the event sent from the ESM 34 of the host 20. The ERM 15 then executes the event at the mobile device 10. For example, the ERM 15 may provide an effect corresponding to the event to the screen of the mobile device 10. For example, if the event is to touch an Angry Birds™ icon on the screen of the mobile device 10, the ERM 15 may execute the corresponding Angry Birds™ application on the mobile device 10.
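Taken together, blocks S01 through S06 amount to a scan-decide-act loop. The sketch below is a schematic reconstruction of that loop using the hypothetical helpers from the earlier sketches (`capture_device_screen`, `locate_icon`, and `tap`); it is not the patent's actual control code.

```python
import time

def test_loop(steps, poll_interval=1.0):
    """Drive the device through a list of (icon_path, action) steps.

    Each step waits until the icon is visible on the relayed screen
    (S01-S03), then sends the corresponding event (S04-S06).
    """
    for icon_path, action in steps:
        while True:
            capture_device_screen()                     # S01: relay the screen
            loc = locate_icon("screen.png", icon_path)  # S02-S03: scan, identify
            if loc is not None:
                break          # screen recognized; no fixed delay was needed
            time.sleep(poll_interval)
        action(loc)            # S04-S06: send the event; the device executes it

# Hypothetical usage: open the application drawer, then an application icon.
test_loop([
    ("app_drawer.png", lambda loc: tap(loc[0] + 10, loc[1] + 10)),
    ("game_icon.png",  lambda loc: tap(loc[0] + 10, loc[1] + 10)),
])
```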
Referring to the corresponding figure, the TAF 30 may provide a settings screen through which a user may configure the TAF 30.
The settings of the TAF 30 vary, and may include, for example, settings for execution of test suites, settings for performance measurement, settings for developers, etc. The basic settings of the TAF 30 may include, for example, entering a test name, selecting an operating system version, selecting the type of AP board, setting the resolution of a virtual screen, setting an execution speed, selecting auto logging, setting whether to run an optical character reader (OCR), etc.
If auto logging is enabled, the TAF 30 may store log files of the mobile device 10, the host 20, or both. If OCR is enabled, the TAF 30 may read text from the image using an OCR application. For example, the TAF 30 may read a benchmark score of the mobile device 10.
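Reading a score off a captured screen can be done with an off-the-shelf OCR library. Below is a minimal sketch assuming the `pytesseract` wrapper around the Tesseract OCR engine; the crop box is hypothetical and would depend on where the benchmark displays its score.

```python
import re

import pytesseract  # pip install pytesseract (requires the tesseract binary)
from PIL import Image

def read_score(screen_path, box=(300, 800, 780, 900)):
    """OCR the region of the screenshot where the score is displayed."""
    region = Image.open(screen_path).crop(box)  # (left, top, right, bottom)
    text = pytesseract.image_to_string(region)
    match = re.search(r"\d+", text)
    return int(match.group()) if match else None

print("benchmark score:", read_score("screen.png"))
```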
The settings for execution of test suites may include the selection of which test suites to execute. For example, a user may select test suites to be performed based on the specific functionality of the mobile device 10 to be tested. For example, the test suites may include test suites respectively configured to test still images, video, a camera, 3D games, etc. That is, the TAF 30 may perform a test of evaluating only still images by selecting an appropriate test suite. Further, the TAF 30 may perform a test of evaluating all of still images, video, a camera, and 3D games, by selecting multiple test suites.
Each test suite may include at least one test case. For example, the test case may include an operation of copying a picture file to an SD card or an operation of installing a benchmark application.
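Both example operations map directly onto standard ADB commands. The sketch below shows an assumed setup step for such a test case; `pictures/test.jpg` and `benchmark.apk` are placeholder file names.

```python
import subprocess

def run(cmd):
    subprocess.run(cmd, check=True)

# Copy a picture file to the device's SD card.
run(["adb", "push", "pictures/test.jpg", "/sdcard/Pictures/test.jpg"])

# Install a benchmark application.
run(["adb", "install", "-r", "benchmark.apk"])  # -r: replace if already installed
```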
The user may select which test suites to be executed, and may further add a user-developed test suite that is able to be driven on the GUI of the mobile device 10.
In the settings for performance measurement, the user may, for example, select scenarios and tools for measuring the performance of the mobile device 10, and may enter the number of times the evaluation is to be repeated. To obtain more accurate measurement values while measuring the performance of the mobile device 10, the TAF 30 may disconnect the mobile device 10 from the host 20 and then reconnect the mobile device 10 to the host 20. The disconnection and reconnection between the mobile device 10 and the host 20 may be performed in software, rather than by physically disconnecting and reconnecting the mobile device 10 and the host 20.
The settings for developers may include, for example, selecting a rendering mode, selecting whether to save evaluation results, selecting whether to scan an SD card, selecting screen recording, selecting whether to reboot the operating system upon error generation, and selecting whether to shut down the host 20 upon termination.
If screen recording is enabled, the user may record the entire test procedure using a screen storage application.
If the setting to reboot the operating system upon error generation is selected, the TAF 30 may reboot the mobile device 10, and the TAF 30 may then resume testing with the next test case. For example, the TAF 30 may provide a reboot signal using the JTAG specification to reboot the mobile device 10.
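The JTAG reboot path is hardware-specific, but the same recover-and-resume idea can be illustrated in software over ADB. The following sketch is an assumed alternative for illustration, not the disclosed JTAG mechanism.

```python
import subprocess

def reboot_and_wait():
    """Reboot the device and block until ADB sees it again."""
    subprocess.run(["adb", "reboot"], check=True)
    # Note: the device may still be finishing boot when this returns.
    subprocess.run(["adb", "wait-for-device"], check=True)

def run_with_recovery(test_cases):
    for case in test_cases:
        try:
            case()                 # run one test case
        except Exception as err:   # on error, reboot and resume with the next
            print("test failed:", err)
            reboot_and_wait()
```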
The user of the TAF 30 may create new test items or add new settings to the TAF 30 by upgrading the software.
Referring to the corresponding figure, the FCM 33 may include a main test module 331, a test setting module 332, a test driver 333, a test case loader 334, a test case storage area 335, a test procedure 336, and a test application 337.
The test case storage area 335 may store at least one test case. For example, the test case storage area 335 may store first and second test cases TC1 and TC2.
The test driver 333 may read the test cases (e.g., the first and second test cases TC1 and TC2) stored in the test case storage area 335 through a test case loader 334.
The test driver 333 may call a test procedure 336 for the first test case TC1. Before driving the first test case TC1, the test procedure 336 may configure a testing environment for the mobile device 10. For example, the test procedure 336 may control an operation of copying a picture file to an SD card, an operation of installing a benchmark application, etc. The test procedure 336 may further call a test application 337.
The test application 337 may perform actual testing for the first test case TC1. The test application 337 may perform various other functions. For example, the test application 337 may fetch a current time, capture a screen shot, store the results of testing in an arbitrary file, etc.
The test application 337 may run an icon application 338, and may execute a home screen 337a process, a screen shot 337b process, a directory file time 337c process, and an image comparator 337d process through the icon application 338.
The home screen 337a is an application for returning the current screen of the mobile device 10 to the home screen. The screen shot 337b is an application for capturing the current screen of the mobile device 10. The directory file time 337c is an application for reading information corresponding to the current time of the mobile device 10. The image comparator 337d is an application for comparing images to each other.
After the actual testing by the test application 337 is terminated, in order to execute the second test case TC2, the test procedure 336 may perform a task for terminating testing of the first test case TC1. For example, the test procedure 336 may perform an operation of removing the picture files of the first test case TC1, an operation of deleting the benchmark application, etc.
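The interplay among the test driver 333, test procedure 336, and test application 337 resembles a conventional setup/run/teardown pattern, including the restart-on-exception behavior described below. The following sketch is a loose, hypothetical rendering of that structure; none of the class or method names come from the patent.

```python
class TestProcedure:
    """Configures the environment, runs the test, then cleans up."""

    def setup(self):     # e.g., copy pictures, install a benchmark app
        pass

    def run(self):       # actual testing (screenshots, comparisons, ...)
        pass

    def teardown(self):  # e.g., remove pictures, delete the benchmark app
        pass

class TestDriver:
    """Loads test cases and drives them in order."""

    def __init__(self, procedures):
        self.procedures = procedures  # loaded via a test case loader

    def drive(self, max_retries=3):
        for proc in self.procedures:
            for _ in range(max_retries):
                try:
                    proc.setup()
                    proc.run()
                    proc.teardown()
                    break
                except Exception:
                    continue  # exception: restart this test case from the beginning
```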
If an exception occurs while the current test case is being executed, the test driver 333 may restart the test procedure from the beginning.
Referring to the corresponding figure, the TAF 30 may include a picture test module 35 that evaluates an operation of displaying pictures using the basic test modules of the FCM 33.
For example, the picture test module 35 may use the main test module 331, the test setting module 332, the test case storage area 335, etc. as the basic test modules.
The picture test module 35 may use a test picture driver 353 extended from the test driver 333 of the FCM 33. For example, the test picture driver 353 may have a function for displaying picture files in addition to the functions of the test driver 333. Accordingly, the test picture driver 353 may use the functions of the test driver 333. Further, the picture test module 35 may use a test picture procedure 356 extended from the test procedure 336 of the FCM 33, a test picture application 357 extended from the test application 337 of the FCM 33 together with an icon gallery 358, and a test case picture loader 354 extended from the test case loader 334 of the FCM 33.
The FCM 33 may be extended by test suite modules specified by a user. For example, if a test engineer wants to test a function of displaying picture images stored in the mobile device 10, the test engineer may develop a picture test item by extending the test case loader 334, the test driver 333, the test procedure 336, and the test application 337.
When the TAF 30 is driven, picture test components specified by a test engineer may be executed to evaluate the display function of the mobile device 10.
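Framework extension of this kind is naturally expressed with inheritance. The sketch below shows, purely hypothetically, how a picture-specific procedure might extend the generic one from the previous sketch while reusing its behavior; the structure mirrors the patent's modules, but the code is illustrative only.

```python
import subprocess

class PictureTestProcedure(TestProcedure):  # TestProcedure from the earlier sketch
    """Extends the generic procedure with picture-specific behavior."""

    def setup(self):
        super().setup()
        # Extra setup: push the picture files under test to the device.
        subprocess.run(["adb", "push", "pictures/", "/sdcard/Pictures/"],
                       check=True)

    def run(self):
        super().run()
        # Picture-specific testing would display each file and verify it
        # by screenshot comparison, as in the earlier template-matching sketch.
```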
Referring to the corresponding figure, the TAF 30 may display scripts SC for evaluating the mobile device 10 on the screen of the host 20.
The scripts SC may include image files (e.g., bitmaps) corresponding to icons, such as, for example, a gallery application, a camera application, device tools, etc.
The scripts SC further show the operation that is to be executed next. For example, an area surrounded by dotted lines in the corresponding figure may indicate the next operation to be executed.
Referring to the corresponding figure, the TAF 30 may display a first screen D1 on the host 20. For example, the current screen of the mobile device 10 may be displayed on the first screen D1 of the host 20. The SSM 32 may scan the screen of the host 20. That is, the SSM 32 may compare an image file (e.g., a bitmap) corresponding to the screen of the host 20 to an image file (e.g., a bitmap) corresponding to the application drawer in the scripts SC, and then send the results of the comparison to the FCM 33. Accordingly, the FCM 33 may identify the location of the application drawer based on the results of the comparison.
Referring to the corresponding figures, when the location of the application drawer has been identified, the TAF 30 may send an event of selecting the application drawer to the mobile device 10.
For example, the FCM 33 may request the ESM 34 to send an event corresponding to clicking the icon of the application drawer to the mobile device 10. That is, the ESM 34 may send an event corresponding to clicking the icon of the application drawer to the ERM 15. Accordingly, the ERM 15 may receive the event corresponding to clicking the icon of the application drawer, and execute the event at the mobile device 10.
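Tying the earlier sketches together, the application-drawer step might look like the following end-to-end fragment; `app_drawer.png` is an assumed stored bitmap, and `capture_device_screen`, `locate_icon`, and `tap` are the hypothetical helpers sketched above.

```python
# Locate the application drawer icon on the relayed screen, then click it.
capture_device_screen()
loc = locate_icon("screen.png", "app_drawer.png")
if loc is not None:
    icon_w, icon_h = 96, 96  # assumed icon size in pixels
    tap(loc[0] + icon_w // 2, loc[1] + icon_h // 2)  # tap the icon's center
```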
Referring to the corresponding figures, when the event is executed, the application drawer may be opened on the mobile device 10, and a screen including a plurality of application icons may be relayed to the host 20. The SSM 32 may scan the relayed screen, and the FCM 33 may identify the location of the icon of the Angry Birds™ application based on the results of the scanning. The TAF 30 may then send an event of selecting the icon of the Angry Birds™ application to the mobile device 10.
For example, the FCM 33 may request the ESM 34 to send an event corresponding to clicking the icon of the Angry Birds™ application to the mobile device 10. That is, the ESM 34 may send the event corresponding to clicking the icon of the Angry Birds™ application to the ERM 15. Accordingly, the ERM 15 may receive the event corresponding to clicking the icon of the Angry Birds™ application, and execute the event.
Referring to the corresponding figure, when the event is executed, the Angry Birds™ application may be executed on the mobile device 10, and the corresponding screen may be relayed to the screen of the host 20.
It is to be understood that the present inventive concept may be implemented in various forms of hardware, software, firmware, special purpose processors, or a combination thereof. That is, exemplary embodiments of the present inventive concept may be embodied directly in hardware, in a software module(s) executed by a processor, or in a combination of the two. In one embodiment, the present inventive concept may be implemented in software as an application program tangibly embodied on a non-transitory program storage device. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
Referring to the corresponding figure, the machine may be implemented on a computer platform having hardware such as one or more central processing units (CPUs), a random access memory (RAM), and input/output (I/O) interfaces.
It is to be further understood that, because some of the constituent system components and methods depicted in the accompanying figures may be implemented in software, the actual connections between the system components (or the processes) may differ depending upon the manner in which the present inventive concept is programmed. Given the teachings of the present inventive concept provided herein, one of ordinary skill in the related art will be able to contemplate these and similar implementations or configurations of the present inventive concept.
While the present inventive concept has been particularly shown and described with reference to the exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present inventive concept as defined by the following claims.