This disclosure is generally directed to a cellular field testing automation tool and improvements thereof. In one example, a method includes (i) initiating a cellular field testing tool that tests a condition of cellular network connectivity of a device under test, (ii) collecting logs of raw data from the cellular field testing tool based on the cellular field testing tool testing the condition of cellular network connectivity of the device under test, and (iii) applying autonomously, by the cellular field testing tool, an intelligent testing policy for a specific network condition test to the logs of raw data to output a binary decision of whether the specific network condition test passed or failed.
In some examples, the intelligent testing policy enhances a version of the cellular field testing tool that otherwise was limited to collecting raw data without analyzing that raw data.
In some examples, the raw data comprises an indication of a number of call drops or a number of call setup failures.
In some examples, outputting the binary decision of whether the specific network condition test passed or failed is based at least in part on the number of call drops or the number of call setup failures.
In some examples, the intelligent testing policy specifies criteria for determining whether the specific network condition test passed or failed.
In some examples, the criteria are provided by a cellular network carrier that uses the cellular field testing tool to test performance of a cellular service network that the cellular network carrier operates.
In some examples, the criteria specify that at least three failed runs of the specific network condition test result in the binary decision indicating that the specific network condition test failed.
In some examples, the criteria specify that two failed runs of the specific network condition test in combination with a successful run of the specific network condition test result in the binary decision indicating that the specific network condition test passed.
In some examples, a failure to satisfy preconditions for executing the specific network condition test results in the binary decision indicating that the specific network condition test failed.
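By way of a non-limiting illustration, the following sketch shows one way such carrier-provided criteria could be encoded against run results parsed from the logs of raw data; the apply_intelligent_policy helper, the RunResult structure, and the handling of cases not enumerated above are assumptions for illustration only and do not limit the embodiments described herein.

# Minimal illustrative sketch (hypothetical names) of applying an intelligent
# testing policy to parsed run results for a specific network condition test.
from dataclasses import dataclass
from typing import List

@dataclass
class RunResult:
    passed: bool                  # outcome of a single run of the specific test
    call_drops: int = 0           # raw-data counters parsed from the logs,
    call_setup_failures: int = 0  # which the policy could additionally weigh

def apply_intelligent_policy(runs: List[RunResult],
                             preconditions_satisfied: bool) -> bool:
    """Return the binary decision (True = passed, False = failed)."""
    if not preconditions_satisfied:     # unmet preconditions -> failing decision
        return False
    failed = sum(1 for r in runs if not r.passed)
    succeeded = len(runs) - failed
    if failed >= 3:                     # at least three failed runs -> fail
        return False
    if failed == 2 and succeeded >= 1:  # two failures plus a success -> pass
        return True
    return failed == 0                  # remaining cases (simplified assumption)

# Example: two failed runs plus one successful run yields a passing decision.
print(apply_intelligent_policy(
    [RunResult(False), RunResult(False), RunResult(True)],
    preconditions_satisfied=True))      # True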
In one example, a computer-readable medium encodes instructions that, when executed by at least one physical processor of a computing device, cause the computing device to perform operations comprising (i) initiating a cellular field testing tool that tests a condition of cellular network connectivity of a device under test, (ii) collecting logs of raw data from the cellular field testing tool based on the cellular field testing tool testing the condition of cellular network connectivity of the device under test, and (iii) applying autonomously, by the cellular field testing tool, an intelligent testing policy for a specific network condition test to the logs of raw data to output a binary decision of whether the specific network condition test passed or failed.
Another method may include (i) initiating a cellular field testing tool that tests a condition of cellular network connectivity of a device under test, (ii) detecting an indication that an initial attempted execution of a specific network condition test has failed, and (iii) reattempting, by recovery logic in response to detecting the indication that the initial attempted execution of the specific network condition test has failed, the same specific network condition test.
In some examples, the method further includes outputting a binary decision of whether the specific network condition test passed or failed based on results of the initial attempted execution and the reattempting of the specific network condition test.
In some examples, the specific network condition test is scheduled to be performed prior to a second and distinct network condition test within a queue of network condition tests.
In some examples, reattempting, by recovery logic in response to detecting the indication that the initial attempted execution of the specific network condition test has failed, the same specific network condition test is performed rather than proceeding directly to the second and distinct network condition test within the queue of network condition tests.
In some examples, detecting the indication that the initial attempted execution of the specific network condition test has failed comprises detecting that an initial call executed by the cellular field testing tool dropped during a predetermined window of time assigned to the specific network condition test.
In some examples, detecting that the initial call executed by the cellular field testing tool dropped during the predetermined window of time triggers the cellular field testing tool to perform a second call.
In some examples, the cellular field testing tool performs the second call in an attempt to avoid dropping the call for the predetermined window of time.
In some examples, a successful execution of the second call for the predetermined window of time results in a binary decision that the specific network condition test passed.
In some examples, a failed execution of the second call for the predetermined window of time results in a binary decision that the specific network condition test failed.
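By way of a non-limiting illustration, the following sketch shows one way this second-call recovery could be expressed; the run_long_call_test and call_and_hold helpers, as well as the example window duration, are hypothetical and provided only for illustration.

# Illustrative sketch of the second-call recovery: if the initial call drops
# during the predetermined window assigned to the test, a second call is
# attempted for the same window, and the test passes only if that call holds.
from typing import Callable

def run_long_call_test(call_and_hold: Callable[[float], bool],
                       window_seconds: float = 180.0) -> bool:
    """call_and_hold(window) returns True if the call stays up for the window."""
    if call_and_hold(window_seconds):     # initial call held for the full window
        return True
    return call_and_hold(window_seconds)  # second call decides pass or fail

# Example with a stubbed call function that drops once, then succeeds.
attempts = iter([False, True])
print(run_long_call_test(lambda window: next(attempts)))  # True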
In another example, a non-transitory computer-readable medium encodes instructions that, when executed by at least one physical processor of a computing device, cause the computing device to perform operations comprising (i) initiating a cellular field testing tool that tests a condition of cellular network connectivity of a device under test, (ii) detecting an indication that an initial attempted execution of a specific network condition test has failed, and (iii) reattempting, by recovery logic in response to detecting the indication that the initial attempted execution of the specific network condition test has failed, the same specific network condition test.
Non-limiting and non-exhaustive embodiments are described with reference to the following drawings. In the drawings, like reference numerals refer to like parts throughout the various figures unless otherwise specified.
For a better understanding of the present invention, reference will be made to the following Detailed Description, which is to be read in association with the accompanying drawings.
The following description, along with the accompanying drawings, sets forth certain specific details in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that the disclosed embodiments may be practiced in various combinations, without one or more of these specific details, or with other methods, components, devices, materials, etc. In other instances, well-known structures or components that are associated with the environment of the present disclosure, including but not limited to the communication systems and networks, have not been shown or described in order to avoid unnecessarily obscuring descriptions of the embodiments. Additionally, the various embodiments may be methods, systems, media, or devices. Accordingly, the various embodiments may be entirely hardware embodiments, entirely software embodiments, or embodiments combining software and hardware aspects.
Throughout the specification, claims, and drawings, the following terms take the meaning explicitly associated herein, unless the context clearly dictates otherwise. The term “herein” refers to the specification, claims, and drawings associated with the current application. The phrases “in one embodiment,” “in another embodiment,” “in various embodiments,” “in some embodiments,” “in other embodiments,” and other variations thereof refer to one or more features, structures, functions, limitations, or characteristics of the present disclosure, and are not limited to the same or different embodiments unless the context clearly dictates otherwise. As used herein, the term “or” is an inclusive “or” operator, and is equivalent to the phrases “A or B, or both” or “A or B or C, or any combination thereof,” and lists with additional elements are similarly treated. The term “based on” is not exclusive and allows for being based on additional features, functions, aspects, or limitations not described, unless the context clearly dictates otherwise. In addition, throughout the specification, the meaning of “a,” “an,” and “the” include singular and plural references.
As used herein, the term “logs of raw data” generally refers to passive measurement data output from a cellular field testing tool and/or radiofrequency drive testing tool. In general, these logs of raw data can be passive in the sense that the cellular field testing tool does not necessarily apply any evaluation or judgment to these measurements (e.g., to all, almost all, or the predominant majority of such measurements), but rather leaves further evaluation, judgment, and/or commentary or analysis to the operator of the tool. As used herein, the term “intelligent testing policy” can refer to any policy that takes as input passive measurements from the cellular field testing tool and, when applied, generates one or more evaluations, gradings, and/or judgments to thereby arrive at one or more conclusions regarding how good or bad one or more of the measurements are, as discussed in more detail below. In other words, the policy can be “intelligent” in the sense that it enhances passive measurement data with analysis of the degree to which the passively measured data is desirable, good, appropriate, consistent with specifications, etc.
As used herein, the term “cellular field testing tool” generally refers to a tool that helps to test, when a device under test is connected to a cellular base station and/or a cellular network, one or more attributes of performance and/or cellular network connectivity provided to the device under test. In other words, the cellular field testing tool generally tests how well the device under test performs (or how well the network performs) when connected and configured in accordance with a particular configuration at a particular location. Cellular network carriers may be requested to, or required to, satisfy one or more specifications when smartphones and/or other items of user equipment are connected to cellular networks. To help ensure that the cellular network carriers satisfy these particular specifications, the cellular field testing tool can be used to connect to a device under test and then check or verify that the device under test is actually achieving cellular network connectivity that satisfies one or more corresponding performance metrics, which may number in the dozens or even hundreds. In some examples, a cellular field testing tool may correspond to (or include) a radio frequency drive testing tool, as that term is used by those having skill in the art.
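By way of a non-limiting illustration, the following sketch shows one way that measured values could be checked against carrier-specified performance metrics; the metric names, thresholds, and the check_measurements helper are hypothetical and are provided for illustration only.

# Hypothetical sketch of verifying measured connectivity values against
# carrier-specified performance metrics (names and thresholds are illustrative).
SPEC = {  # metric name -> (comparison, threshold)
    "rsrp_dbm": (lambda value, threshold: value >= threshold, -110.0),
    "dl_throughput_mbps": (lambda value, threshold: value >= threshold, 25.0),
    "call_setup_time_s": (lambda value, threshold: value <= threshold, 6.0),
}

def check_measurements(measured: dict) -> dict:
    """Return a per-metric pass/fail map for the collected measurements."""
    return {name: compare(measured[name], threshold)
            for name, (compare, threshold) in SPEC.items()
            if name in measured}

print(check_measurements({"rsrp_dbm": -95.2, "call_setup_time_s": 4.1}))
# {'rsrp_dbm': True, 'call_setup_time_s': True}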
Despite the above, some cellular field testing tools can suffer from one or more deficiencies or sub-optimizations, and these tools may, therefore, benefit from one or more improvements, including improvements that automate one or more procedures that assist a user with operating the tool. These improved cellular field testing tools can enable employees, contractors, and/or administrators of the cellular network carriers to appropriately operate these tools even if those operators lack a degree of experience, sophistication, and/or detailed education regarding the performance and operation of the tools. In other words, automated improvements for the cellular field testing tools can enable less sophisticated operators to operate the tools in a more streamlined and/or user-friendly manner. Consequently, these improvements can furthermore reduce the burden on the carriers of training and/or educating these operators, while further increasing a potential pool of candidate operators for carrying out these testing procedures, as discussed in more detail below.
Similarly, as used herein, the term “precondition” can generally refer to one or more conditions that must be satisfied prior to the starting of a specific and corresponding cellular field testing tool test. Generally speaking, these preconditions refer to contextual preconditions that help to establish that the cellular field testing tool, when operating, will perform successfully and obtain results that are valid and useful (see the corresponding discussion below).
As used herein, the term “set” can generally refer to a collection of at least one precondition, unless indicated otherwise. Generally speaking, such cellular field testing tools may benefit from checking or verifying a multitude of preconditions, as discussed in more detail below.
Returning to the flow diagram for method 400, the staged checking of the set of preconditions may proceed as follows.
At step 402, method 400 may begin. At decision step 404, method 400 may perform a first check of whether the set of preconditions is satisfied. If the answer is yes at decision step 404, then method 400 may proceed to step 416, at which point method 400 may enable the user to begin a specific test, as discussed in more detail below. Alternatively, if the answer is no at decision step 404, then method 400 may proceed to step 406, at which point method 400 may cycle airplane mode on and off on the specific device that is failing the preconditions (e.g., the device under test and/or the reference device).
From step 406, method 400 may proceed to decision step 408, which may correspond to the second stage of a staggered series of stages for testing whether the overall set of preconditions has been satisfied. In particular, at decision step 408, method 400 may check for the second time whether the set of preconditions has been satisfied. If the answer is no at decision step 408, then method 400 may proceed to step 410, at which point method 400 may power cycle the device that is failing the preconditions. Alternatively, if the answer is yes at decision step 408, then method 400 may proceed to step 416, at which point method 400 may enable the user to begin a specific test.
Lastly, as a third stage of method 400, at decision step 412, method 400 may again check whether the set of preconditions has been satisfied. If the answer is yes at decision step 412, then method 400 may proceed to step 416 again, at which point method 400 may enable the user to begin a specific test. Alternatively, if the answer is no at decision step 412, then method 400 may proceed to step 414, at which point method 400 may raise an audio and/or visual alarm to the user (see also the related discussion below).
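As a non-limiting illustration of the staged flow described above, the following sketch encodes the check-remediate-recheck sequence of method 400; the helper callables (preconditions_ok, cycle_airplane_mode, power_cycle_device, and raise_alarm) are hypothetical placeholders for the corresponding device-control operations.

# Illustrative sketch of the staged precondition flow of method 400; the four
# callables are hypothetical placeholders for the device-control operations.
from typing import Callable

def staged_precondition_check(preconditions_ok: Callable[[], bool],
                              cycle_airplane_mode: Callable[[], None],
                              power_cycle_device: Callable[[], None],
                              raise_alarm: Callable[[], None]) -> bool:
    """Return True if the user may begin the specific test (step 416)."""
    if preconditions_ok():        # decision step 404: first check
        return True
    cycle_airplane_mode()         # step 406: first remedial action
    if preconditions_ok():        # decision step 408: second check
        return True
    power_cycle_device()          # step 410: second remedial action
    if preconditions_ok():        # decision step 412: third check
        return True
    raise_alarm()                 # step 414: alert the user
    return False

# Example: preconditions that are satisfied immediately enable the test.
print(staged_precondition_check(lambda: True, lambda: None,
                                lambda: None, lambda: None))  # True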
Nevertheless, as previously discussed above in connection with method 400, graphical user interface element 1302, and/or any suitable substitute input mechanism within the computing arts, may be disabled if the set of preconditions has not been satisfied. Thus, in various examples, graphical user interface element 1302 may be displayed in a “grayed out” manner such that, although the user can read a dimmer or grayer version of the “Start” text, attempting to toggle or select graphical user interface element 1302 may not result in any corresponding functionality. In other words, when not enabled, graphical user interface element 1302 may simply correspond to essentially frozen pixels that remain the same regardless of whether the user attempts to toggle them or not. Those having skill in the art will readily understand that any other suitable mechanism for disabling an input mechanism or graphical user interface button may be used to achieve essentially the same purpose of preventing the user from beginning a specific test procedure prior to the preconditions all being satisfied. Moreover, as soon as the preconditions are satisfied, perhaps after one or more stages of performing a series of remedial actions (see the discussion of method 400 above), graphical user interface element 1302 may be enabled such that the user can begin the specific test.
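The following minimal sketch illustrates this gating behavior; the StartControl class is a hypothetical stand-in for graphical user interface element 1302 and is not tied to any particular user interface framework.

# Hypothetical sketch of gating a "Start" control on precondition status; the
# StartControl class stands in for graphical user interface element 1302.
class StartControl:
    def __init__(self) -> None:
        self.enabled = False            # "grayed out" until preconditions pass

    def refresh(self, preconditions_satisfied: bool) -> None:
        self.enabled = preconditions_satisfied

    def click(self) -> str:
        # When disabled, the control ignores input ("frozen pixels").
        return "test started" if self.enabled else "ignored"

start = StartControl()
print(start.click())                    # "ignored" while preconditions unmet
start.refresh(preconditions_satisfied=True)
print(start.click())                    # "test started"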
As further discussed above, in some scenarios, even after performing a series of remedial actions (see the three remedial actions discussed above in connection with method 400), the set of preconditions may still not be satisfied, in which case the user can be alerted so that manual intervention may be performed.
Returning to the inventive concept reflected in the flow diagram for method 100A, the logs of raw data collected by the cellular field testing tool may be stored in files having any of a variety of formats.
Illustrative examples of log file format extensions may include one or more of .drm, .scn, .udm, .cal, .sc1, .sc2, .dlf, .sbs, .mbl, .fmt, .asc, .dtn, .fsn, .ftn, .cap, .pcap, .csv, .txt, .log, etc. Nevertheless, the mere use of such a file format extension does not necessarily indicate that the corresponding file constitutes one or more logs of raw data, as used in method 100A. Rather, the use of such a file extension with a file containing measurement output from a cellular field testing tool and/or radiofrequency drive testing tool, and consistent with the figures described herein, is a factor that can be indicative that the file constitutes a log of raw data.
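As a non-limiting illustration, the following sketch flags candidate log files by extension; the candidate_log_files helper is hypothetical, and, as noted above, the extension is only one indicative factor rather than a dispositive one.

# Hypothetical sketch of flagging candidate log files by extension.
from pathlib import Path

LOG_EXTENSIONS = {".drm", ".scn", ".udm", ".cal", ".sc1", ".sc2", ".dlf", ".sbs",
                  ".mbl", ".fmt", ".asc", ".dtn", ".fsn", ".ftn", ".cap", ".pcap",
                  ".csv", ".txt", ".log"}

def candidate_log_files(directory: str) -> list:
    """Return files whose extensions suggest they may be logs of raw data."""
    return [path for path in Path(directory).iterdir()
            if path.is_file() and path.suffix.lower() in LOG_EXTENSIONS]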
In some examples, the intelligent testing policy of method 100A enhances a version of the cellular field testing tool that otherwise was limited to collecting raw data without analyzing that raw data. Generally speaking, the inventive concept reflected within method 100A may solve a problem relating to the fact that related cellular field testing tools and/or radiofrequency drive testing tools are generally passive collectors of measurement information. In other words, these tools generally collect a comprehensive multitude of different items of measurement information as part of their testing procedures and then report these items of measurement information without any significant or meaningful analysis, evaluation, judgment, and/or commentary. This shortcoming or deficiency thereby leaves the task of significant or meaningful analysis to one or more human users, which introduces a variety of different complications, deficiencies, and/or shortcomings, including human error, limited analysis speed, limited accuracy, limited precision, the cost of human labor, degrading performance over time, and/or the tendency of such human analysts to become overwhelmed by the sheer quantity of measurement information. To help address one or more of the deficiencies or sub-optimizations that are outlined above, method 100A reflects an inventive concept whereby intelligent automation may be enabled, consistent with the intelligent testing policy discussed above, to analyze the logs of raw data autonomously.
In some examples, the raw data may include an indication of a number of call drops or a number of call setup failures. Moreover, in these examples, outputting the binary decision of whether the specific network condition test passed or failed can be based at least in part on the number of call drops or the number of call setup failures.
If the decision is yes at decision step 1606, then the specific network condition test can be marked as a failure at step 1612. Otherwise, method 1600 can proceed to decision step 1608, at which point it can be evaluated whether a number of failed runs equals a second variable value, such as two. In other words, the criteria specify that two failed runs of the specific network condition test in combination with a successful run of the specific network condition test result in the binary decision indicating that the specific network condition test passed (e.g., conditionally passed). If the decision is yes at decision step 1608, then at step 1614 the specific network condition test can be marked as a conditional pass. As used herein, the term “conditional pass” can refer to a decision that a specific network condition test passed, while also being supplemented with one or more items of metadata indicating that the passing of the specific network condition test was performed in a conditional sense or context. Moreover, although specific values (e.g., three and two failed runs) are used in this illustrative example, the first and second variable values may be set to any other suitable thresholds.
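As a non-limiting illustration, the following sketch encodes the marking logic described above, including a conditional pass carrying explanatory metadata; the mark_test helper name, the return structure, and the treatment of cases not explicitly described above are assumptions for illustration only.

# Illustrative sketch of the marking logic of method 1600; a conditional pass
# is a passing decision supplemented with explanatory metadata.
def mark_test(failed_runs: int, successful_runs: int) -> dict:
    if failed_runs >= 3:                           # decision step 1606 -> step 1612
        return {"decision": "fail"}
    if failed_runs == 2 and successful_runs >= 1:  # decision step 1608 -> step 1614
        return {"decision": "pass",
                "metadata": {"conditional": True, "failed_runs": failed_runs}}
    return {"decision": "pass" if failed_runs == 0 else "fail"}  # simplified

print(mark_test(failed_runs=2, successful_runs=1))
# {'decision': 'pass', 'metadata': {'conditional': True, 'failed_runs': 2}}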
As further discussed above, method 100B may involve (i) initiating a cellular field testing tool that tests a condition of cellular network connectivity of a device under test, (ii) detecting an indication that an initial attempted execution of a specific network condition test has failed, and (iii) reattempting, by recovery logic in response to detecting the indication that the initial attempted execution of the specific network condition test has failed, the same specific network condition test. Consistent with method 100B, diagram 1700 illustrates an example environment in which such recovery logic can operate.
Consistent with diagram 1700, the user of the laptop can also receive a binary decision of whether the specific network condition test passed or failed based on results of the initial attempted execution and the reattempting of the specific network condition test. For example, as shown in diagram 1700, the binary decision can be presented to the user at the laptop once the reattempted execution of the specific network condition test has completed.
In some examples, the specific network condition test is scheduled to be performed prior to a second and distinct network condition test within a queue of specific network condition tests.
By way of illustrative example, as shown in diagram 1800, the specific network condition test “MO VoNR” is scheduled to be performed after the specific network condition test “Voice CALL TEST,” and the specific network condition test “LONG CALL TEST” is scheduled to be performed after the specific network condition test “MO VoNR.”
In some examples, reattempting, by recovery logic in response to detecting the indication that the initial attempted execution of the specific network condition test has failed, the same specific network condition test is performed rather than proceeding directly to the second and distinct network condition test within the queue of network condition tests. Thus, in the example of diagram 1800, if an initial attempted execution of one specific network condition test, such as “Voice CALL TEST,” fails, then the recovery logic can reattempt execution of the same specific network condition test, rather than proceeding directly to the next specific network condition test within the queue (i.e., rather than proceeding to the specific network condition test named “MO VoNR”).
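As a non-limiting illustration, the following sketch runs a queue of tests and reattempts a failed test before advancing to the next, distinct test; the run_queue helper and the max_attempts parameter are hypothetical and shown only to clarify the ordering behavior.

# Illustrative sketch of recovery logic that reattempts the same test in the
# queue before proceeding to the next, distinct test.
from typing import Callable, Dict, List, Tuple

def run_queue(tests: List[Tuple[str, Callable[[], bool]]],
              max_attempts: int = 2) -> Dict[str, bool]:
    results: Dict[str, bool] = {}
    for name, execute in tests:           # e.g., "Voice CALL TEST", "MO VoNR"
        passed = False
        for _ in range(max_attempts):     # reattempt the same test on failure
            if execute():
                passed = True
                break
        results[name] = passed            # only now advance to the next test
    return results

# Example: the first test fails once, is reattempted, and then passes.
flaky = iter([False, True])
print(run_queue([("Voice CALL TEST", lambda: next(flaky)),
                 ("MO VoNR", lambda: True)]))
# {'Voice CALL TEST': True, 'MO VoNR': True}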
In further examples, detecting the indication that the initial attempted execution of the specific network condition test has failed comprises detecting that an initial call executed by the cellular field testing tool dropped during a predetermined window of time assigned to the specific network condition test. In these examples, detecting that the initial call executed by the cellular field testing tool dropped during the predetermined window of time can trigger the cellular field testing tool to perform a second call. Moreover, in these examples, the cellular field testing tool can perform the second call in an attempt to avoid dropping the call for the predetermined window of time. A successful execution of the second call for the predetermined window of time can result in a binary decision that the specific network condition test passed, whereas a failed execution of the second call for the predetermined window of time can result in a binary decision that the specific network condition test failed.
In particular, an example host computer system(s) 1901 is shown. For example, such computer system(s) 1901 may execute a scripting application or other software application, as further discussed above, and/or perform one or more of the other methods described herein. In some embodiments, one or more special-purpose computing systems may be used to implement the functionality described herein. Accordingly, various embodiments described herein may be implemented in software, hardware, firmware, or in some combination thereof. Host computer system(s) 1901 may include memory 1902, one or more central processing units (CPUs) 1914, I/O interfaces 1918, other computer-readable media 1920, and network connections 1922.
Memory 1902 may include one or more various types of non-volatile and/or volatile storage technologies. Examples of memory 1902 may include, but are not limited to, flash memory, hard disk drives, optical drives, solid-state drives, various types of random access memory (RAM), various types of read-only memory (ROM), neural networks, other computer-readable storage media (also referred to as processor-readable storage media), or the like, or any combination thereof. Memory 1902 may be utilized to store information, including computer-readable instructions that are utilized by CPU 1914 to perform actions, including those of embodiments described herein.
Memory 1902 may have stored thereon control module(s) 1904. The control module(s) 1904 may be configured to implement and/or perform some or all of the functions of the systems or components described herein. Memory 1902 may also store other programs and data 1910, which may include rules, databases, application programming interfaces (APIs), software containers, nodes, pods, clusters, node groups, control planes, software defined data centers (SDDCs), microservices, virtualized environments, software platforms, cloud computing service software, network management software, network orchestrator software, network functions (NF), artificial intelligence (AI) or machine learning (ML) programs or models to perform the functionality described herein, user interfaces, operating systems, other network management functions, other NFs, etc.
Network connections 1922 are configured to communicate with other computing devices to facilitate the functionality described herein. In various embodiments, the network connections 1922 include transmitters and receivers (not illustrated), cellular telecommunication network equipment and interfaces, and/or other computer network equipment and interfaces to send and receive data as described herein, such as to send and receive instructions, commands and data to implement the processes described herein. I/O interfaces 1918 may include a video interface, other data input or output interfaces, or the like. Other computer-readable media 1920 may include other types of stationary or removable computer-readable media, such as removable flash drives, external hard drives, or the like.
The various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.