Automatically modifying a test for a program on multiple electronic devices

Information

  • Patent Grant
  • Patent Number
    9,658,933
  • Date Filed
    Friday, June 28, 2013
  • Date Issued
    Tuesday, May 23, 2017
Abstract
A system and method for configuring a test for a program is provided. The method, for example, may include receiving, by a processor, an identification of an electronic device, retrieving, by the processor, a configuration of the electronic device from a memory, and modifying, by the processor, at least one step of the test based upon the configuration of the electronic device.
Description
TECHNICAL FIELD

The following relates to program testing and more particularly to automatically customizing a test based upon a device to be tested.


BACKGROUND

Electronic devices, such as cellular phones, tablets, televisions and personal computers, come in a variety of sizes and shapes and run a variety of different operating systems. Cellular phones, for example, have varying screen sizes, screen length-to-width ratios, operating systems, web browsers and a variety of other factors which force developers to customize programs such that the programs properly function on each unique cellular phone. Testing the program on each electronic device can become cumbersome and time consuming, as custom tests generally have to be written for each electronic device.





DESCRIPTION OF THE DRAWING FIGURES

Exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and



FIG. 1 is a block diagram of a test environment, in accordance with an embodiment;



FIG. 2 is a flow diagram illustrating an exemplary method for operating a test environment, in accordance with an embodiment; and



FIG. 3 illustrates an exemplary multi-tenant application system in accordance with an embodiment.





DETAILED DESCRIPTION

According to various exemplary embodiments, systems and methods are provided which allow a developer to easily test programs on a multitude of different electronic devices and software platforms. Devices can vary in a variety of ways that require a developer to alter a program so that it functions properly on each electronic device. Accordingly, the system presented herein, for example, includes a configuration file for each electronic device. The configuration file includes details on how each device varies. The system further includes a processor which analyzes a test step and modifies the test step for the specific electronic device based upon the configuration file. Accordingly, while the device and program may change, the developer can use the system and method disclosed herein to test the program on any device without having to write a customized test for each device.



FIG. 1 is a block diagram of a test environment 100, in accordance with an embodiment. The test environment includes a mobile test environment provisioning system 110, hereinafter referred to as the MTEP system 110. The MTEP system 110 can be implemented using any suitable computer-based platform, system, or device. Accordingly, the MTEP system 110 includes a processor 120. The processor 120 may be a central processing unit (CPU), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller, a graphics processing unit (GPU), a physics processing unit (PPU), or any other type of logic device or any combination thereof. The processor 120 is coupled to a memory 130. The memory 130 may be any type of non-volatile memory, such as a hard drive or a computer-readable memory, such as a flash drive, a CD, a DVD, a Blu-ray disk or any other type of computer-readable memory.


The environment 100 further includes a configurable testing interface 140. In one embodiment, for example, the configurable testing interface 140 may be part of the MTEP system 110. In other embodiments, for example, the configurable testing interface 140 may be in a remote system hosted on a server (not illustrated). In this embodiment, for example, the MTEP system 110 and configurable testing interface 140 may communicate via a communication system 150 such as a local area network (LAN), a Wi-Fi connection, a cellular connection, a satellite connection or any other type of data connection system.


In one embodiment, for example, the configurable testing interface 140 includes an automated testing interface 142. The automated testing interface 142 may automate a test. In other words, the automated testing interface 142 may automatically interact with a program according to a predefined series of steps. The testing steps vary depending upon the program; however, testing steps could include navigating menus, testing software functions, testing software/hardware interactions, preparing software and/or hardware for a test with the relevant configuration, validating a software/hardware response or behavior, or the like. The configurable testing interface 140 may also capture the results of the test and create a report.
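
By way of illustration only, the following Java sketch shows one way such a predefined series of steps might be represented and executed, with result capture for a report. The class and method names (DeviceSession, TestReport, AutomatedTest, and so on) are hypothetical and do not appear in the patent.

```java
// Hypothetical sketch only; names do not come from the patent.
import java.util.ArrayList;
import java.util.List;

/** Handle for interacting with the device or emulator under test. */
interface DeviceSession {
    void sendInput(String input);
    String readOutput();
}

/** Collects per-step outcomes so the interface can create a report. */
class TestReport {
    final List<String> entries = new ArrayList<>();
    void record(String step, String outcome) { entries.add(step + ": " + outcome); }
}

/** One automated interaction with the program plus result capture. */
interface TestStep {
    String description();
    void execute(DeviceSession session) throws Exception;
    void captureResult(DeviceSession session, TestReport report);
}

/** Runs a predefined series of steps and captures the results. */
class AutomatedTest {
    private final List<TestStep> steps;

    AutomatedTest(List<TestStep> steps) { this.steps = steps; }

    TestReport run(DeviceSession session) {
        TestReport report = new TestReport();
        for (TestStep step : steps) {
            try {
                step.execute(session);
                step.captureResult(session, report);
            } catch (Exception e) {
                report.record(step.description(), "failed: " + e.getMessage());
            }
        }
        return report;
    }
}
```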


In one embodiment, for example, the configurable testing interface 140 may also include a browser automation system 144. In this embodiment, the automated testing interface 142 may utilize the browser automation system 144 to perform the test when the program to be tested operates in a browser, for example, if the program was written in HTML 5. The browser automation system 144 allows a test to interact with the program as if the test were a user, enabling, for example, simulated mouse, touchpad or touchscreen interface movements and inputs (e.g., left clicks, right clicks, scrolls, etc.).
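
The patent does not name a particular browser automation system; the sketch below uses Selenium WebDriver purely as one familiar example of user-like browser interaction, and the page URL and element identifier are hypothetical.

```java
// Illustrative only: Selenium WebDriver stands in for the browser automation
// system 144; the URL and element id below are hypothetical.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.interactions.Actions;

public class BrowserAutomationExample {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();            // drive a real browser
        try {
            driver.get("https://example.com/html5-app");  // load the HTML 5 program (hypothetical URL)
            WebElement menu = driver.findElement(By.id("menu"));  // hypothetical element id
            menu.click();                                 // simulated left click
            new Actions(driver)
                    .moveToElement(menu)                  // simulated pointer movement
                    .contextClick()                       // simulated right click
                    .perform();
        } finally {
            driver.quit();
        }
    }
}
```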


As discussed in further detail below, the MTEP system 110 sends instructions to the configurable testing interface 140 to test one or more local devices 160, local emulators 170 or remote devices or remote emulators 180.


The local devices 160 may be mobile devices or other electronic devices, including, but not limited to, cellular phones, tablet computers, laptop computers, desktop computers, televisions, electronic readers (e-readers), or any other type of electronic device having a web browser or internet connection. The one or more local devices 160 may be communicatively coupled to the MTEP system 110 and/or the configurable testing interface 140 via a local area network (LAN) or a Wi-Fi connection.


The local emulators 170 may emulate any electronic device including, but not limited to, cellular phones, tablet computers, laptop computers, desktop computers, televisions, electronic readers (e-readers), or any other type of electronic device having a web browser or internet connection. As with the local devices 160, the local emulators 170 may be communicatively coupled to the MTEP system 110 and/or the configurable testing interface 140 via a local area network (LAN) or a Wi-Fi connection.


The remote devices or remote emulators 180 may be actual or virtual devices including, but not limited to, cellular phones, tablet computers, laptop computers, desktop computers, televisions, electronic readers (e-readers), or any other type of electronic device having a web browser or internet connection. The remote devices or remote emulators 180 may be communicatively coupled to the MTEP system 110 and/or the configurable testing interface 140 via the communication system 150.


A configuration file for each of the local devices 160, local emulators 170 or remote devices or remote emulators 180 is stored in the memory 130. The configuration file may include, for example, a device location (i.e., local or cloud), a device type configuration (i.e., emulator or physical), a device operating system configuration (e.g., iOS, Android, Windows, BlackBerry), and an orientation configuration (i.e., landscape or portrait).
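
As an illustration, a configuration file of this kind might be represented as a simple properties file loaded into an object such as the following; the property keys and defaults are assumptions for illustration, not part of the patent.

```java
// Hypothetical representation of a per-device configuration file; the
// property keys and defaults are assumptions for illustration.
import java.io.FileReader;
import java.io.IOException;
import java.util.Properties;

class DeviceConfiguration {
    final String deviceLocation;   // "local" or "cloud"
    final String deviceType;       // "emulator" or "physical"
    final String operatingSystem;  // e.g., "iOS", "Android", "Windows", "BlackBerry"
    final String orientation;      // "landscape" or "portrait"

    private DeviceConfiguration(Properties p) {
        this.deviceLocation  = p.getProperty("device.location", "local");
        this.deviceType      = p.getProperty("device.type", "physical");
        this.operatingSystem = p.getProperty("device.os", "Android");
        this.orientation     = p.getProperty("device.orientation", "portrait");
    }

    /** Loads a configuration file, e.g., one stored in the memory 130. */
    static DeviceConfiguration load(String path) throws IOException {
        Properties p = new Properties();
        try (FileReader reader = new FileReader(path)) {
            p.load(reader);
        }
        return new DeviceConfiguration(p);
    }
}
```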


Each local device 160, local emulator 170 or remote device or remote emulator 180 includes at least one program. The program could be written in any computing language, including, but not limited to, Java, C, C++, C#, Flash, HTML 5, or any other computer language. HTML 5, for example, is a markup language for structuring and presenting content in a browser. Accordingly, applications or programs written in HTML 5 would be similar across a variety of mobile or non-mobile platforms, including, but not limited to, Apple iOS, Apple OS, Google Android, Google Chrome, Microsoft Windows Phone, Microsoft Windows, BlackBerry 10, Ubuntu Touch, Unix, MeeGo, Linux, or any other mobile or non-mobile platform. Furthermore, as programs written in HTML 5 run in a web browser environment, the programs would be similar across a variety of web browsers, including, but not limited to, Internet Explorer, Firefox, Chrome, Safari, Opera, or any other web browser. Typically, however, developers must write custom code for each operating system. In some instances, the developer may have to write custom code for each device even if the devices have the same operating system.


Typically, a test procedure for the program would also have to be customized for each electronic device. For example, electronic devices have different screen sizes, screen size ratios (i.e., length-to-width ratios), or orientations. Accordingly, a particular input interface (e.g., a menu, function button, or the like) in the program or application may be located at a different position on various electronic devices. Furthermore, different electronic devices may utilize different procedures for accessing functions, such as a camera, data connection, location based services, touch events or scanner, may have different privacy or security settings, may be operating under various driver or firmware versions, and may be in different operating modes, such as airplane mode, do not disturb mode, 3G or Wi-Fi mode, or the like. For example, some devices may require a user to give permission for the program to access certain functions, while other devices may not. As discussed in further detail below, the MTEP system 110 automatically customizes a test for an electronic device such that a developer only needs to write a single testing program, rather than writing a custom test program for every possible combination of electronic device, operating system, web browser, driver and firmware.
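
For example, a single tap step might be adjusted for a target device as in the hypothetical sketch below, where coordinates authored against a reference screen are rescaled to the device and a permission prompt is only expected on operating systems that present one; the reference dimensions, method names and operating-system check are illustrative assumptions.

```java
// Hypothetical illustration of adjusting one tap step for a target device.
class TapStepCustomizer {
    // Reference screen the test author wrote coordinates against (assumed).
    static final int REF_WIDTH = 768;
    static final int REF_HEIGHT = 1024;

    /** Rescales a tap location authored for the reference screen to the
     *  target device's screen size. */
    static int[] scaleToDevice(int refX, int refY, int deviceWidth, int deviceHeight) {
        int x = Math.round(refX * (deviceWidth / (float) REF_WIDTH));
        int y = Math.round(refY * (deviceHeight / (float) REF_HEIGHT));
        return new int[] { x, y };
    }

    /** Some devices prompt the user before granting camera access; the test
     *  only needs an extra "accept permission" step on those devices
     *  (the check below is an illustrative assumption). */
    static boolean expectCameraPermissionPrompt(String operatingSystem) {
        return "iOS".equals(operatingSystem) || "Android".equals(operatingSystem);
    }
}
```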



FIG. 2 is a flow diagram illustrating an exemplary method 200 for operating a test environment, in accordance with an embodiment. In one embodiment, for example, the instructions for operating the test environment may be stored in a computer-readable medium, such as the non-transitory memory 130 illustrated in FIG. 1. A processor, such as the processor 120 illustrated in FIG. 1, first receives a target device identification, a target environment identification and a test identification. (Step 210). The target device identification corresponds to a make and/or model of the electronic device, the target environment corresponds to a location of the target device or emulator as discussed above, and the test identification corresponds to the test to be run. For example, a user may input: iPad/localaddressX/TestY to indicate that the test Y should be customized and run on an iPad located locally at address X. One of ordinary skill in the art would recognize that a variety of input systems could be used to collect the target device identification, target environment identification and test identification information from a user. In another embodiment, for example, a user may merely input a device and optionally an operating system or firmware version. In this embodiment, for example, a list of the location of devices and their corresponding configurations may be stored in the memory 130. Accordingly, the processor, upon receipt of the device information, determines one or more locations of corresponding devices to route the test to.
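
A minimal sketch of parsing such an input string (e.g., "iPad/localaddressX/TestY") into its three parts is shown below; the delimiter, class name and field names are assumptions for illustration only.

```java
// Hypothetical parser for a "device/environment/test" input string such as
// "iPad/localaddressX/TestY"; the delimiter and field names are assumptions.
class TestRequest {
    final String targetDevice;       // make and/or model, e.g., "iPad"
    final String targetEnvironment;  // where the device or emulator is located
    final String testId;             // which test to customize and run

    private TestRequest(String device, String environment, String testId) {
        this.targetDevice = device;
        this.targetEnvironment = environment;
        this.testId = testId;
    }

    static TestRequest parse(String input) {
        String[] parts = input.split("/", 3);
        if (parts.length != 3) {
            throw new IllegalArgumentException(
                    "expected device/environment/test, got: " + input);
        }
        return new TestRequest(parts[0], parts[1], parts[2]);
    }
}
```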


The test identification corresponds to a test for the program written by the developer. In one embodiment, for example, the test may be stored in the memory 130. In other embodiments, for example, the test may be stored in a memory of the testing interface 140. The test itself might vary dramatically depending upon the program. However, in general, the test would include at least one testing step including an input corresponding with an interaction with the program and a procedure for capturing a response, whether expected or unexpected. Accordingly, if, for example, the program is a quick response (QR) code reader, the testing procedures may include navigating various menus or inputs, including an instruction to interact with a camera of the device.


The processor then retrieves a configuration file for the target device in the target environment. (Step 220). As discussed above, the configuration file includes the data needed by the processor to set up and customize the test for the given device in the given environment. The processor then generates the instructions to implement the identified test for the target device in the target environment. (Step 230). As discussed above, electronic devices may have different screen sizes, ratios or orientations, may utilize different procedures for accessing functions and may have different security settings. The processor, based upon the configuration file, modifies the identified test to compensate for the specific device. In one embodiment, for example, the test has a hook to the processor through an annotation. The annotation captures the change in the form of a configuration and flags it to the processor to set up the test for execution. For example, a test that needs to run on an iPad/iOS 6.0 located in the cloud will have a hook in the following format: @Listeners(TestAnnotationDriver.class) @MobileTestConfig(targetDevice=TargetDevice.IPAD_5_0, targetEnvironment=TargetEnvironment.SAUCELABS).
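
The patent shows the hook's usage but not its definition. The following Java sketch reconstructs what @MobileTestConfig and a listener such as TestAnnotationDriver might look like; apart from the IPAD_5_0 and SAUCELABS constants taken from the example above, the enum values and the reflection logic are assumptions.

```java
// Hypothetical reconstruction of the annotation hook; only IPAD_5_0 and
// SAUCELABS come from the example above, the rest is assumed.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

enum TargetDevice { IPAD_5_0 /* other supported devices would be listed here */ }
enum TargetEnvironment { LOCAL, SAUCELABS }

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface MobileTestConfig {
    TargetDevice targetDevice();
    TargetEnvironment targetEnvironment();
}

/** How a listener such as TestAnnotationDriver might read the hook and flag
 *  the configuration to the processor for test setup. */
class AnnotationReader {
    static void configure(Class<?> testClass) {
        MobileTestConfig config = testClass.getAnnotation(MobileTestConfig.class);
        if (config != null) {
            System.out.println("Set up test for " + config.targetDevice()
                    + " in " + config.targetEnvironment());
        }
    }
}
```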


Furthermore, as discussed above, the device to be tested may be an actual device or an emulator and may be communicatively located on a local network or an external network relative to the developer. Accordingly, the processor, based upon the internet protocol (IP) address (i.e., the location) of the device to be tested, customizes the test instructions such that the test instructions are directed towards the identified electronic device. In one embodiment, for example, the test may be stored in the memory 130. Accordingly, in this embodiment, the processor 120 may customize each test instruction, if necessary, before sending the test instruction to the testing interface 140. In another embodiment, for example, the testing interface 140 may transmit a test step to the processor 120. The processor 120 would then determine if the test step requires modification based upon the configuration of the identified electronic device.
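
As one illustration of this routing, the hypothetical helper below chooses an endpoint for a test instruction based on whether the identified device is local or remote; the gateway host and URI scheme are assumptions, not details from the patent.

```java
// Hypothetical routing helper; the gateway host and URI scheme are assumptions.
import java.net.URI;

class TestInstructionRouter {
    /** Builds the endpoint a customized test instruction is sent to, based on
     *  whether the identified device or emulator is local or remote. */
    static URI endpointFor(String deviceAddress, boolean isLocal) {
        // Local devices and emulators are reached directly over the LAN;
        // remote devices or emulators are reached via the communication system 150.
        String host = isLocal ? deviceAddress : "remote-gateway.example.com";
        return URI.create("http://" + host + "/test");
    }
}
```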


The processor then utilizes a configurable testing interface to test the target device. (Step 240). As discussed above, the configurable testing interface 140 may include an automated testing interface and/or a browser automation system to implement the test. In one embodiment, for example, the MTEP system 110 configures the test in real time. In this embodiment, for example, the processor, prior to sending instructions for a particular testing step to the configurable testing interface 140, will determine if the step requires customization for a particular electronic device. If the testing step requires customization for a particular electronic device, the processor modifies the base test procedure based upon the configuration file for the identified electronic device.
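
A minimal sketch of this real-time flow, with assumed interface names, is shown below: each base test step is checked and, only if it depends on the device, rewritten from the device's configuration before being handed to the testing interface.

```java
// Minimal sketch of per-step, real-time customization; interface names are assumed.
import java.util.List;

class RealTimeTestConfigurator {
    interface Step {
        boolean isDeviceDependent();
        Step customizedFor(String operatingSystem, String orientation);
    }

    interface TestingInterface {
        void execute(Step step);
    }

    /** Checks each base test step and, only if it depends on the device,
     *  rewrites it from the device's configuration before execution. */
    static void run(List<Step> baseTest, String operatingSystem, String orientation,
                    TestingInterface testingInterface) {
        for (Step step : baseTest) {
            Step toSend = step.isDeviceDependent()
                    ? step.customizedFor(operatingSystem, orientation)
                    : step;
            testingInterface.execute(toSend);
        }
    }
}
```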


Accordingly, the MTEP system 110 allows a developer to easily test a multitude of different electronic devices or emulators, located locally or remotely, having a variety of system specs, driver versions, firmware versions, operating systems and browsers, with a single test.


In one embodiment, for example, the MTEP system 110 may be implemented in a multi-tenant application system. FIG. 3 illustrates an exemplary multi-tenant application system 300 in accordance with an embodiment. The multi-tenant application system 300 suitably includes a server 302 that dynamically creates virtual applications 328A-B based upon data 332 from a common database 330 that is shared between multiple tenants. Data and services generated by the virtual applications 328A-B are provided via network 345 to any number of client devices 340A-B, as desired. Each virtual application 328A-B is suitably generated at run-time using a common platform 310 that securely provides access to data 332 in database 330 for each of the various tenants subscribing to system 300. The multi-tenant application system 300 may also include any number of content delivery networks (“CDNs”) 360A-B, as desired. The CDNs 360A-B may contain a copy of at least some of the data 332 which may be accessible via the network 345. The multi-tenant application system 300 may also employ any number of proxy servers 370A-B which may be used to direct traffic between the server 302 and the CDNs 360A-B.


A “tenant” generally refers to a group of users that shares access to common data within database 330. Tenants may represent customers, customer departments, business or legal organizations, and/or any other entities that maintain data for particular sets of users within system 300. Although multiple tenants may share access to a common server 302 and database 330, the particular data and services provided from server 302 to each tenant can be securely isolated from those provided to other tenants, as described more fully below. However, the applications 328A-B, which are generally written by the customer, may also share common application data in the database 330. The multi-tenant architecture allows different sets of users to share functionality without necessarily sharing each other's data 332. For example, generically written test steps for applications could be stored in the database 330, allowing customers to quickly build testing programs.


Database 330 is any sort of repository or other data storage system capable of storing and managing data 332 associated with any number of tenants. Database 330 may be implemented using conventional database server hardware. In various embodiments, database 330 shares processing hardware 304 with server 302. In other embodiments, database 330 is implemented using separate physical and/or virtual database server hardware that communicates with server 302 to perform the various functions described herein.


Server 302 is implemented using one or more actual and/or virtual computing systems that collectively provide a dynamic application platform 310 for generating virtual applications 328A-B. Server 302 operates conventional computing hardware 304, such as a processor 305, memory 306, input/output features 307 and the like. Processor 305 may be implemented using one or more of microprocessors, microcontrollers, processing cores and/or other computing resources spread across any number of distributed or integrated systems, including any number of “cloud-based” or other virtual systems. Memory 306 represents any non-transitory short or long term storage capable of storing programming instructions for execution on processor 305, including any sort of random access memory (RAM), read only memory (ROM), flash memory, magnetic or optical mass storage, and/or the like. Input/output features 307 represent conventional interfaces to networks (e.g., to network 345, or any other local area, wide area or other network), mass storage, display devices, data entry devices and/or the like. In a typical embodiment, application platform 310 gains access to processing resources, communications interfaces and other features of hardware 304 using any sort of conventional or proprietary operating system 308. As noted above, server 302 may be implemented using a cluster of actual and/or virtual servers operating in conjunction with each other, typically in association with conventional network communications, cluster management, load balancing and other features as appropriate.


The term “exemplary” is used herein to represent one example, instance or illustration that may have any number of alternates. Any implementation described herein as “exemplary” should not necessarily be construed as preferred or advantageous over other implementations.


Although several exemplary embodiments have been presented in the foregoing description, it should be appreciated that a vast number of alternate but equivalent variations exist, and the examples presented herein are not intended to limit the scope, applicability, or configuration of the invention in any way. To the contrary, various changes may be made in the function and arrangement of the various features described herein without departing from the scope of the claims and their legal equivalents.

Claims
  • 1. A method for configuring a test for a program, comprising: receiving, by a processor, an identification of an electronic device configured to operate the program; retrieving, by the processor, a configuration of the electronic device from a memory, the configuration including a screen size ratio, a device web browser and a device operating system; automatically modifying, by the processor, at least one step of the test based upon the screen size ratio, the device web browser and the device operating system of the electronic device.
  • 2. The method according to claim 1, further comprising: receiving, by the processor, a location of the electronic device; and automatically modifying, by the processor, the at least one step of the test based upon the location of the electronic device.
  • 3. The method according to claim 1, further comprising determining the at least one step of the test requires modification based upon the configuration of the electronic device.
  • 4. The method according to claim 1, further comprising: determining, by the processor, a location of the electronic device based upon the identification of the electronic device; and automatically modifying, by the processor, the at least one step of the test based upon the location of the electronic device.
  • 5. The method according to claim 1, wherein the electronic device is an emulator.
  • 6. A system for configuring a test for a program on a plurality of electronic devices, comprising: a memory configured to store a respective configuration for each of the plurality of electronic devices, the configuration including a screen size ratio, a device web browser and a device operating system associated with each of the plurality of electronic devices; a communication system; and a processor communicatively coupled to the memory and the communication system, the processor configured to: receive an identification for one of the plurality of electronic devices; retrieve a corresponding configuration of the one of the plurality of electronic devices from the memory; and automatically modify at least one step of the test based upon the screen size ratio, the device web browser and the device operating system of the one of the plurality of electronic devices.
  • 7. The system of claim 6, wherein the processor is further configured to retrieve the test from the memory.
  • 8. The system of claim 6, wherein the processor is further configured to retrieve at least one step of the test from the communication system.
  • 9. The system of claim 6, wherein the processor is further configured to: receive a location of the one of the plurality of electronic devices; and automatically modify the at least one step of the test based upon the location of the one of the plurality of electronic devices.
  • 10. The system of claim 6, wherein the processor is further configured to determine the at least one step of the test requires modification based upon the configuration of the one of the plurality of electronic devices.
  • 11. The system of claim 6, wherein the processor is further configured to: determine a location of the one of the plurality of electronic devices based upon the identification of the one of the plurality of electronic devices; and automatically modify the at least one step of the test based upon the location of the one of the plurality of electronic devices.
  • 12. The system of claim 6, wherein the program is a HTML 5 program and the test is performed by a browser automation system.
  • 13. The system of claim 6, wherein the processor is further configured to perform the test in an emulator for the one of the plurality of electronic devices.
  • 14. A computer-readable medium storing instructions which when executed by a processor cause the processor to: determine an identification of an electronic device configured to operate a program; retrieve a configuration of the electronic device from a memory, the configuration including a screen size ratio, a device web browser and a device operating system associated with the electronic device; and automatically modify at least one step of a test for the program based upon the screen size ratio, the device web browser and the device operating system of the electronic device.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims the benefit of U.S. provisional patent application serial number 61/667,013, filed Jul. 2, 2012, the entire content of which is incorporated by reference herein.

Related Publications (1)
Number Date Country
20140005974 A1 Jan 2014 US
Provisional Applications (1)
Number Date Country
61667013 Jul 2012 US