This application is related to the following U.S. patent applications, all of which are incorporated herein by reference in their entirety:
Embodiments of the present invention relate to the field of device testing.
Numerous electronic technologies such as digital computers, video equipment, and telephone systems have facilitated increased productivity and reduced costs in processing information in most areas of business, science, and entertainment. Increasingly, the components used in these activities interact with a network (e.g., the internet, the cloud, etc.). The number of electronic devices used in these activities is growing rapidly, with new versions and new types of devices with diverse capabilities being introduced continuously. Thorough testing of the devices under many different scenarios is important to ensure the devices will function correctly. Providing proper testing environments is often critical to achieving accurate test results. However, when the devices interact with very large networks beyond the control of the tester, it is difficult to ensure accurate test results.
Traditional attempts at testing devices that communicate with large networks often involve trying to simulate the large communication network. This typically requires significant resources. The traditional approaches are typically implemented in a large stationary facility or room full of costly equipment attempting to simulate the large communication network. In addition, providing radio frequency interference mitigation for the large facilities is also typically very expensive and involves numerous different individual test devices in a large shielded room (e.g., oscilloscopes, voltmeters, etc.). These large facilities often require significant manual interaction and supervision to accurately test a device. Each different type of device under test often involves a complete reset and reconfiguration of the large facility. It is also usually inconvenient and disruptive to ongoing field operations to ship products to a single facility for testing. Traditional attempts to automate some aspects of the testing are typically limited. Conventional approaches typically require significant manual support for various activities such as configuring the test environment, equipment maintenance, test case delivery, device profile delivery, test data collection, data analytics and reporting, and consulting. These factors contribute significantly to the cost of traditional device testing.
In one embodiment, a test system comprises: a network access point simulation component, a local control component, and a reference component. The network access point simulation component is configured to simulate communication network access point operations comprising test interactions with user equipment. The number of devices under test included in the user equipment and the number of distinct network access points that are coincidentally simulated are variable. The local control component is configured to direct the network access point simulation component and to control the test interactions with the user equipment. The local control component comprises a test executive operable to direct simulation of communication network operations and the test interactions in accordance with information received from remote control components. The reference component is operable to communicatively couple with the network access point simulation component in a manner similar to the user equipment and to validate the test interactions.
In one embodiment, the network access point simulation component, local control component and reference component are portable. The reference component can be configured to simulate at least a portion of the functionality of the user equipment. The reference component is a trusted component with reliable communication characteristics and features. The reference component is also operable to enable calibration of the network access point simulation component and local control component. The reference component enables validation and calibration of the network access point simulation component operations and local control component operations in a controlled local test environment, including validation and calibration of communications with devices under test included in the user equipment. The reference component is operable to communicatively couple with the remote control components. In one exemplary implementation, the test system further comprises a test box communicatively coupled to the network access point simulation component. The test box comprises material operable to shield the contents of the test box from electromagnetic radiation interference, and the contents of the test box further comprise the reference component.
In one embodiment, a test method comprises: receiving test configuration information; automatically configuring a test network simulation component; automatically configuring a user equipment test control component; and verifying configuration of the test network simulation component and the user equipment test control component. The test network simulation component is operable to simulate test network components, comprising test network communication components, based on the test network configuration information. The user equipment test control component is operable to control communications with user equipment in accordance with the user equipment control configuration information.
In one exemplary implementation, the test method further comprises calibrating the test network simulation component and the user equipment test control component. The verifying can be performed locally. The verifying simulates at least a portion of the functionality of the user equipment. The type of test network component that is verified and calibrated can vary. The types of test network components that are configured can be selected from the group comprising: a small cell, an evolved packet core (EPC) component, an evolved node B (eNodeB) component, an Internet Protocol Multimedia Subsystem (IMS) component, and application servers.
The accompanying drawings, which are incorporated in and form a part of this specification, are included for exemplary illustration of the principles of the present invention and not intended to limit the present invention to the particular implementations illustrated therein. The drawings are not to scale unless otherwise specifically indicated.
Reference will now be made in detail to the preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with the preferred embodiments, it will be understood that they are not intended to limit the invention to these embodiments. On the contrary, the invention is intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the invention as defined by the appended claims. Furthermore, in the following detailed description of the present invention, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be obvious to one ordinarily skilled in the art that the present invention may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure aspects of the current invention.
Efficient and effective flexible test systems and methods are presented. In one embodiment, a test system is readily adaptable to a variety of configurations. The configurations can be automatically implemented locally and can be based on a large reservoir or database of test information stored and managed remotely. The test systems can be automatically configured to simulate network communication interactions that correspond to various different implementations (e.g., small cell operations, eNodeB operations, evolved packet core (EPC) operations, etc.). The test systems can be configured to operate in a variety of implementations (e.g., various different types of devices under test, a single device under test, a plurality of devices under test, a single network access point is simulated, a plurality of network access points are simulated, etc.). The test systems are portable and can be conveniently deployed in local environments.
The local test systems and methods facilitate easily implemented, convenient local testing of various user equipment. The local test systems and methods can be portable and easy to use, unlike traditional test systems. Conventional test approaches typically have a number of limitations: traditional testing approaches usually involve very cumbersome and complicated test equipment and configuration procedures that consume significant resources to implement and maintain. Even though traditional approaches consume significant resources, the testing capabilities of the traditional testing approaches are also usually limited. For example, configurations of UE to eNodeB and EPCs (e.g., one to one, one to multiple, multiple to multiple, etc.) are typically limited or not possible in traditional approaches. A number of traditional test systems and methods are also typically directed to limited types of devices under test. Local test systems and methods are easily adaptable to and configurable for different UE devices under test. Traditional approaches do not typically even attempt this flexibility and scalability due to the cost of the traditional resources and daunting traditional configuration issues. In a local test system and method, the local test system components have reasonable costs to implement and the automated configuration can be substantially effortless from a user's perspective.
In one embodiment, the local test user equipment interface 120 is configured to simulate various communication characteristics and features (e.g., communications in accordance with a communication infrastructure component, protocol, network, architecture, etc.). The local test user equipment interface 120 can include communications mechanisms compatible with various different types of communication links for communicating with the user equipment 130. The communication links can include wireless communication links (e.g., cellular, WiFi, small cell, etc.), wired communication links (e.g., coaxial radio frequency (RF) link, Ethernet, universal serial bus (USB), etc.), or combinations of different types of communication links.
It is also appreciated that user equipment (UE) can include a variety of different devices under test. The devices under test may provide end users with many different capabilities (e.g., cell phones, computers, tablets, laptops, devices in the Internet-of-things (IoT), etc.). The user equipment can include the capability to collect and exchange data among themselves and with other devices over a network. The user equipment can communicate over networks through a wired or wireless medium or communication link using different types of network protocols, such as but not limited to the 3rd Generation Partnership Project (3GPP) Long-Term Evolution (LTE) standard.
Test system 100 is compatible with simulating various communication network environments or architectures for communicating with the user equipment 130. In one exemplary implementation, the local test user equipment interface 120 is a network access point simulation component configured to simulate network access point operations. The local control component 110 can simulate a network communication core. The components of the test system can be automatically configured. The configuration of the test environment topology is also flexible. The user equipment can comprise a single device under test or a plurality of devices under test. The number of devices under test and distinct interfaces or network access points that are coincidentally simulated is variable. A single network access point can be simulated or a plurality of network access points can be simulated.
It is appreciated that a plurality of simulated network components and network interfaces or access points can be implemented coincidentally. In one embodiment, the plurality of simulated network components and network interfaces or access points can be implemented substantially simultaneously or in parallel. It is also appreciated that a plurality of UE devices under test can be implemented coincidentally. In one embodiment, the plurality of UE devices under test can be exercised substantially simultaneously or in parallel.
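As a rough, non-limiting illustration (all class and function names are hypothetical and not part of the disclosure), a minimal Python sketch of exercising a variable number of simulated access points and devices under test in parallel:

```python
# Minimal sketch (hypothetical names) of coincidentally exercising a variable
# number of simulated network access points and devices under test in parallel.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass


@dataclass
class SimulatedAccessPoint:
    name: str
    technology: str  # e.g. "eNodeB", "small cell", "WiFi"

    def attach(self, device_id: str) -> str:
        # A real simulator would perform the attach signaling here.
        return f"{device_id} attached to {self.name} ({self.technology})"


def run_coincident_test(access_points, device_ids):
    """Exercise every device against every simulated access point in parallel."""
    with ThreadPoolExecutor() as pool:
        futures = [
            pool.submit(ap.attach, dev)
            for ap in access_points
            for dev in device_ids
        ]
        return [f.result() for f in futures]


if __name__ == "__main__":
    aps = [SimulatedAccessPoint("ap-1", "eNodeB"),
           SimulatedAccessPoint("ap-2", "small cell")]
    print(run_coincident_test(aps, ["ue-1", "ue-2", "ue-3"]))
```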
In one embodiment, local control component 110 includes an interface component 190 configured to communicate management information. Interface component 190 can be configured to communicate management information from a variety of sources and received via a variety of mechanisms. The mechanism for delivery of management information can be an external network communication connection to a component external to the local test environment, an internal communication connection to another internal test environment management component, and so on.
In one embodiment, local control component 110 can communicate with a remote component via a real network in the remote environment 101. In one embodiment, the real network (e.g., the Web, the internet, the Cloud, etc.) is a “real” network for communicating information in a normal mode as opposed to being part of a “simulated” network used for testing. The remote component can be a server that provides various testing related information (e.g., device profile test information, test management information, etc.).
While the above management information communications and remote management environment interactions are described as flowing to the local test environment, it is appreciated the local test environment can forward information in the opposite direction. In one embodiment, the local test environment can forward information externally through similar communication mechanisms (e.g., via a communication network, via physical transportation of portable storage devices, etc.). The information can be communicated to a remote management environment.
In one embodiment, configuration of the local control component 110 and local test user equipment interface 120 is automated. The configuration can be based upon information received from the remote management environment 101. The automated configuration can include automated configuration of various aspects of a local test system (e.g., software, firmware, hardware, etc.).
The automated configuration can include automated configuration of local test systems described in Ser. No. 15/236,326 entitled "Local Portable Test Systems and Methods" by Dinesh Doshi et al., which is incorporated herein by reference. The remote management environment 101 can be similar to remote management environments described in Ser. No. 15/236,292 entitled "Cloud-Based Services For Management Of Cell-Based Test Systems" by Dinesh Doshi et al., which is incorporated herein by reference. In one exemplary implementation, the configuration of local control component 110 and local test user equipment interface 120 is performed with little or no local manual interaction.
In block 210, test configuration information is received. The test configuration information can include user equipment control configuration information. The test configuration information can be received from a remote management environment. The test configuration information can include device profile information. The device profile has the information or intelligence to control a given type of user equipment being tested. A device profile may be a standard application program interface (API). The API can be modified or adapted by an original equipment manufacturer (OEM) for a particular piece of UE. In one embodiment, a device profile is or includes a test script that adapts generalized software (test code) associated with a component and executed by the computer system to the specific type, make, model, and/or features of the user equipment being tested. In one exemplary implementation, the generalized software is associated with an EPC or IMS at a generic level (e.g., characteristics, features, etc., common to multiple components of the same type) and the test script adapts the generalized EPC or IMS software to an EPC config file indicating how a specific vendor or communication service provider wants an EPC or IMS configured.
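The following minimal Python sketch is illustrative only; the profile fields, control commands, and merge logic are hypothetical rather than an actual device profile format. It shows the general idea of a device profile adapting a generic EPC configuration to a specific UE make and model:

```python
# Minimal sketch (hypothetical structure) of a device profile that adapts
# generic EPC test settings to a specific UE make/model.
import json

GENERIC_EPC_CONFIG = {
    "apn": "internet",          # generic access point name
    "ip_pool": "10.0.0.0/24",   # generic UE address pool
    "mtu": 1500,
}

DEVICE_PROFILE = {
    "make": "ExampleOEM",
    "model": "ExamplePhone-X",
    # Vendor-specific overrides applied to the generic configuration.
    "epc_overrides": {"apn": "oem.example.apn", "mtu": 1428},
    # UE control commands used by the test executive (illustrative only).
    "control": {"reboot": "AT+CFUN=1,1", "query_id": "AT+CGSN"},
}


def apply_profile(generic_config: dict, profile: dict) -> dict:
    """Merge the profile's EPC overrides into the generic configuration."""
    return {**generic_config, **profile.get("epc_overrides", {})}


if __name__ == "__main__":
    print(json.dumps(apply_profile(GENERIC_EPC_CONFIG, DEVICE_PROFILE), indent=2))
```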
In block 220, a test network simulation component operable to simulate test network components, including test network communication components, is configured based on the test network configuration information. The configuration of test network simulation components can be performed locally.
The types of test network components that are configured can vary. The types of test network components that are configured can include: a small cell, an evolved packet core (EPC) component, an evolved node B (eNodeB) component, an Internet Protocol Multimedia Subsystem (IMS) component, and application servers. The number of test network simulation components that are configured can vary. In one exemplary implementation, the configuration includes configuring a local control component. The local control component can be similar to local control component 110.
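A minimal, hypothetical Python sketch of selecting and configuring the requested component types from test configuration information is shown below; the component names and configuration flow are assumptions for illustration only:

```python
# Minimal sketch (hypothetical names) of configuring the simulated network
# components requested in the test configuration information.
SUPPORTED_COMPONENTS = {"small_cell", "epc", "enodeb", "ims", "app_server"}


def configure_test_network(test_config: dict) -> list:
    """Validate and 'configure' each requested simulated component."""
    configured = []
    for component in test_config.get("components", []):
        if component not in SUPPORTED_COMPONENTS:
            raise ValueError(f"unsupported test network component: {component}")
        # A real system would install/launch the corresponding simulator here.
        configured.append(f"{component}: configured")
    return configured


if __name__ == "__main__":
    print(configure_test_network({"components": ["enodeb", "epc", "ims"]}))
```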
In block 230, a user equipment test control component operable to control communications with user equipment is configured in accordance with the test configuration information. The number of devices under test and the type of devices under test in the user equipment can vary. The configuring of the user equipment test control components can be performed locally. In one exemplary implementation, the configuration includes configuring a local test user equipment interface component. The test user equipment interface component can be similar to test user equipment interface component 120. In one embodiment, the test user equipment interface component can at least in part be implemented on a local control component.
In one embodiment, a local control component is implemented on a computer system. The computer system can be portable. The local control component can include various test associated modules implemented on the computer system.
It is appreciated the test system is compatible with simulating various network environments for communicating with user equipment. The simulated network access point can simulate evolved node B (eNodeB) component operations, small cell operations, evolved packet core (EPC) operations, and so on. The configuration of the test network topology is also flexible. The user equipment can comprise a single device under test or a plurality of devices under test. A single network access point can be simulated or a plurality of network access points can be simulated, including network access point topologies described in Ser. No. 15/236,326 entitled "Local Portable Test Systems and Methods" by Dinesh Doshi et al., which is incorporated herein by reference.
In one embodiment, a small cell is a network access node that utilizes relatively low power radio communications with a limited range. In one exemplary implementation, the range is between approximately 10 meters and up to approximately 2 kilometers. A small cell can be a femtocell, a picocell, a microcell and so on. The small cell can include a wide range of interfaces (e.g., GSM, LTE interfaces including eNodeB, other 3GPP interfaces, CDMA 2000, W-CDMA, Wi-Fi, TD-SCDMA, etc.).
In one embodiment, local control component 410 directs network interface simulation component 417 and controls communication with user equipment 430 via network interface simulator 417. In one exemplary implementation, local control component 410 directs configuration of network interface simulation component 417. The local control component 410 can also direct configuration of itself. The configurations can be based on information received from a remote management environment. The configuration of both local control component 410 and network interface simulation component 417 can be substantially or completely automated.
In one embodiment, the local control component 410 and network interface simulation component 417 are configured to simulate a communication network 420. The simulated communication network 420 can include network core component 430 functions and simulated network interfaces 441 and 442. The network interface simulation component 417 can simulate communications for interacting with the user equipment in accordance with characteristics and features of the simulated network interfaces 441 and 442. The simulated network core component functions can correspond to public data network (PDN) component functions. The local control component 410 can also simulate interactions with various services 472 and network administrative operations 471. Network interfaces 441 and 442 can include a variety of different types of interfaces with the user equipment 450. In one embodiment, network interface 441 is a cellular wireless interface and network interface 442 is a land line interface.
It is appreciated that the local flexible test system and method approach can enable testing configuration installation and maintenance that facilitates convenient and effective testing of user equipment. In one exemplary implementation, local flexible test systems and methods can handle configurations and re-configurations associated with ongoing evolutions and revisions to communication network technology.
In one embodiment, automated configuration is utilized to handle evolution of a Universal Mobile Telecommunications System (UMTS) network communication architecture to a Long Term Evolution (LTE) network communication architecture. It is appreciated that this is not a trivial task. It is complicated and complex to deal with and implement changes from a Node B network access point interface in the Universal Terrestrial Radio Access (UTRA) of a UMTS architecture to an Evolved UMTS Terrestrial Radio Access (E-UTRA) Node B or eNodeB in an LTE architecture. The reconfiguration can include changing the UTRA protocols of Wideband Code Division Multiple Access (WCDMA) or Time Division-Synchronous Code Division Multiple Access (TD-SCDMA) on Uu interfaces to the E-UTRA protocols of Orthogonal Frequency Division Multiple Access (OFDMA) for downlinks and Single Carrier Frequency Division Multiple Access (SC-FDMA) for uplinks on LTE-Uu interfaces. Even understanding what the vast and complex terminology means at a high level of description, let alone the intricate and sophisticated details of how such network communication systems are implemented, is very difficult to say the least. Traditional systems typically required highly trained and specialized users to configure the testing. Automated configuration in the local flexible test system and method approach enables efficient and convenient testing proliferation to a vast number of users.
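For illustration, a minimal Python sketch (hypothetical parameter names, not an actual configuration format) of overriding a UMTS-style access point profile with LTE-style parameters during an automated reconfiguration:

```python
# Minimal sketch (hypothetical parameter names) of automatically reconfiguring
# a simulated access point from a UMTS-style profile to an LTE-style profile.
UMTS_PROFILE = {
    "access_node": "NodeB",
    "air_interface": "Uu",
    "downlink_access": "WCDMA",
    "uplink_access": "WCDMA",
}

LTE_OVERRIDES = {
    "access_node": "eNodeB",
    "air_interface": "LTE-Uu",
    "downlink_access": "OFDMA",
    "uplink_access": "SC-FDMA",
}


def reconfigure_to_lte(profile: dict) -> dict:
    """Return a new profile presenting an E-UTRA (LTE) access point."""
    return {**profile, **LTE_OVERRIDES}


if __name__ == "__main__":
    print(reconfigure_to_lte(UMTS_PROFILE))
```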
The network access point simulation component 541 simulates an eNodeB access point. The simulated eNodeB access point can be compatible with 3rd Generation Partnership Project (3GPP) accesses (e.g., GPRS, UMTS, EDGE, HSPA, LTE, LTE advanced, etc.). The eNodeB network access point can be a wireless communication network access point (e.g., similar to a cellular communication system base station, a small cell, etc.). The access point 542 simulates a non-eNodeB access point (e.g., non-3GPP access technology, WiFi, etc.).
The local control component 510 simulates LTE network core operations. The simulated LTE network core operations can include simulation of operations typically associated with a Mobility Management Entity (MME) 532, a Serving Gateway (S-GW) 532, a Home Subscriber Server (HSS) 533, a PDN Gateway (P-GW) 534 and 535, an AAA component 537 and an Evolved Packet Data Gateway (ePDG) 538. The simulated network core can include simulated interactions with various services 571 on PDN(s) (e.g., IMS, Internet, etc.) and administrative services 572 including an Access Network Discovery and Selection Function (ANDSF).
The UE control module 620 directs simulation of various UE functions including a Hayes command set Attention (AT) modem control function 621, an Operating System Android Debug Bridge (OS ADB) and USB control function 622, a physical robotic control function 623 and a Universal Integrated Circuit Card (UICC) function 624. The eNodeB control module includes an LTE PHY function 631, a layer 2 Media Access Control (MAC)/Radio Link Control (RLC)/Packet Data Convergence Protocol (PDCP) function 632, a Radio Resource Control (RRC) function 633, a Non Access Stratum (NAS) EMME/ESM/USI function 634, an Internet Protocol (IP) function 635, a User Datagram Protocol (UDP)/Real-time Transport Protocol (RTP) function 637, and a Transmission Control Protocol (TCP) function 638. The eNodeB control module also simulates various other communication functions (e.g., WiFi, L2, L3, L4/5/6, etc.).
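For illustration only, a minimal Python sketch (hypothetical framing; the real layers perform ciphering, segmentation, scheduling, coding, and so on) of the downlink ordering of simplified simulated protocol layers directed by the eNodeB control module:

```python
# Minimal sketch (hypothetical framing) of chaining simplified protocol-layer
# handlers in downlink order (IP payload -> PDCP -> RLC -> MAC -> PHY).
def pdcp(payload: bytes) -> bytes:
    return b"PDCP|" + payload          # header compression/ciphering would go here


def rlc(payload: bytes) -> bytes:
    return b"RLC|" + payload           # segmentation/ARQ would go here


def mac(payload: bytes) -> bytes:
    return b"MAC|" + payload           # scheduling/multiplexing would go here


def phy(payload: bytes) -> bytes:
    return b"PHY|" + payload           # coding/modulation would go here


def send_downlink(ip_packet: bytes) -> bytes:
    """Pass an IP packet down the simplified simulated stack."""
    frame = ip_packet
    for layer in (pdcp, rlc, mac, phy):
        frame = layer(frame)
    return frame


if __name__ == "__main__":
    print(send_downlink(b"IP|hello-ue"))
```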
EPC control module 640 directs simulation of various EPC functions including HSS function 642, SGW function 643, PGW function 644, PCRF function 645 and ePDG function 671. The EPC control module 640 can also direct simulation of various server functions 650 including Evolved Multimedia Broadcast Services (eMBMS), BMSC, content functions 551, Over The Air Device Management (OTA-DM) functions 652, IP Fader functions 653, IP Multimedia Subsystem (IMS) functions 654, Domain Name Service (DNS) functions 655, File Transfer Protocol (FTP)/Hyper Text Transfer Protocol (HTTP) functions 656, Streaming functions 657 and Subscriber Identification Module Over The Air management (SIM OTA) functions 658. The remote cloud services 690 can include test case library 691, UE library 672, Test User Manager 694, Reporting and Data Analytics 695 and SME Engineering 697. Working together, the EPC control module 640 and eNodeB control module 630 can direct simulation of UICC function 673.
It is appreciated the same physical local controller and network access simulator can be automatically reconfigured to simulate a different network architecture.
In one embodiment, operations of a local test system are validated. The validation can include checking interactions between the local test components and a trusted reference component that simulates user equipment. The reference component can also be used for calibrating the local test components. In one exemplary implementation, the configuration of the local test system is validated.
In one embodiment, the network access point simulation component 1017 is communicatively coupled to reference component 1050. Reference component 1050 is operable to validate results of the local control component's automatic configuration of the test system components. In one exemplary implementation, the reference component 1050 validates that the simulated network communications are operating correctly. The network access point simulation component, local control component and reference component are portable. The reference component simulates user equipment communications. The validation includes calibration of the test system components.
In one exemplary implementation, verification and calibration control component 1071 includes processing component 1072, memory 1071, and reference signal generator 1075. Processing component 1072 generates the verification and calibration directions and can optionally perform analysis of the verification and calibration results. Memory 1071 stores instructions and information for processing component 1072. Reference signal generator 1075 generates reference signals with particular characteristics (e.g., particular frequency, voltage, etc.). Wireless connection interface 1080 communicates with external components wirelessly. In one exemplary implementation, wireless connection interface 1080 includes antenna 1085, transceiver 1081, and signal processing component 1082.
In one embodiment, the reference component can be an integral part of a local test system component. In one exemplary implementation, the reference component is integrated in the local test control component. The reference component can be integrated in a network access interface simulation component.
It is appreciated that a reference component can be configured to verify and calibrate various different characteristics and features of a local test system. The reference component can verify and calibrate characteristics of a physical layer, protocol layer, and data layer. The validation of the physical layer can include checking communication signal characteristics (e.g., signal strength, frequency, waveform shape, proper RF bandwidth, channel, MIMO correlations, differences in uplink/downlink, etc.). The reference component can verify and calibrate protocol layer activities including simulated communication network component operations (e.g., EPC operations, server operations, sequencing and scheduling of component attachment, security conformity, etc.). Verification and calibration can include data layer operations (e.g., IP, RMF, data throughput, etc.). In one embodiment, the reference component can be used to verify the integrity of the testing control in a local environment (e.g., checking whether a test box is shielding against environmental electrical interference on RF signals, the humidity of the test environment, etc.).
It is appreciated that the verification and calibration can be iterative and progressive. In one embodiment, a particular verification and calibration process is performed iteratively. In one exemplary implementation, signal strength is checked and, if it is weak, a calibration change is made to increase the strength, and then the signal is checked again, and so on, until the signal strength is verified or validated as correct. In one embodiment, a verification and calibration is performed progressively. In one exemplary implementation, the signal characteristics are verified and calibrated and, if resolved satisfactorily, then simulated component verification and calibration are performed. If there is still an issue, verification of the integrity of the testing condition control can be performed.
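A minimal Python sketch of such an iterative check-and-adjust loop, assuming hypothetical measurement and adjustment hooks and illustrative thresholds:

```python
# Minimal sketch (hypothetical measurement/adjustment hooks) of iterative
# signal-strength verification and calibration as described above.
def iterative_signal_calibration(measure, adjust, target_dbm=-85.0,
                                 tolerance_db=2.0, max_iterations=10) -> bool:
    """Measure signal strength; if weak, apply a calibration offset and retry."""
    for _ in range(max_iterations):
        measured = measure()               # e.g. reading from the reference component
        error = target_dbm - measured
        if abs(error) <= tolerance_db:
            return True                    # verified within tolerance
        adjust(error)                      # apply compensation and re-check
    return False                           # could not verify after max iterations


if __name__ == "__main__":
    # Toy stand-ins for the reference component and simulator controls.
    state = {"offset": 0.0}
    measure = lambda: -95.0 + state["offset"]
    adjust = lambda err: state.__setitem__("offset", state["offset"] + err)
    print("calibrated:", iterative_signal_calibration(measure, adjust))
```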
In one embodiment, interactions between a remote management system and a reference component include updating the reference component information and configuration information. The interactions can be communicated with or without the local test system in the communication path (e.g., the reference component can communicatively couple directly to a remote network or can go through the local test system components). In one exemplary implementation, if there is a change to UE (e.g., a new or updated version of the UE is introduced by an original equipment manufacturer (OEM), correction of a bug in the UE, etc.), and a reference component is meant to simulate that UE, then the remote management component can initiate a corresponding appropriate change to the reference component.
In block 1110, test configuration information is received. The test configuration information can include user equipment control configuration information.
In block 1120, a test network simulation component operable to simulate test network components including test network communication components based on the test network configuration information is automatically configured. The type of the test network component that is configured varies. The type of the test network components that are configured is selected from the group comprising: a small cell, an evolved packet core (EPC) component, an evolved node B (eNodeB) component, an Internet Protocol Multimedia Subsystem (IMS) component and application servers. The number and type of devices under test in the user equipment varies.
In block 1130, a user equipment test control component operable to control communications with user equipment in accordance with the user equipment control configuration information is automatically configured.
In block 1140, configuration and operations of the test network simulation component and the user equipment test control component are verified. The verification can include calibrating the test network simulation component and the user equipment test control component. The number and type of the test network simulation components that are configured varies. The verification is performed locally. In one embodiment, a verification process is performed.
The validation process can be triggered by a remote management system. In one embodiment, the remote management system performs a variety of different analytics that may indicate a validation process is appropriate. In one exemplary implementation, information from local test system interactions with a reference device is reported back to the remote management system. The information from UE testing can also be reported back to the remote management system.
Based on analytics performed by the remote management system, new or additional validation operations can be triggered. In one exemplary implementation, the remote management system collects information from various different sources regarding UE testing (e.g., from OEMs, from other local test systems, etc.) and, if a particular local test system's UE test results are outside a norm or threshold based on the remote management system information and analytics, the remote management system can trigger a validation process for the particular local test system.
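A minimal Python sketch, assuming hypothetical data shapes and an illustrative deviation threshold, of flagging a local test system whose UE test results fall outside a fleet norm:

```python
# Minimal sketch (hypothetical data shapes) of a remote-management-side check
# that triggers a validation process when a local test system's results
# deviate from a norm derived from results collected from other sources.
from statistics import mean, stdev


def should_trigger_validation(local_results, fleet_results, z_threshold=3.0) -> bool:
    """Flag a local test system whose average metric deviates from the fleet norm."""
    fleet_mean = mean(fleet_results)
    fleet_sd = stdev(fleet_results) or 1e-9   # avoid division by zero
    z_score = abs(mean(local_results) - fleet_mean) / fleet_sd
    return z_score > z_threshold


if __name__ == "__main__":
    fleet = [48.0, 50.5, 49.2, 51.0, 50.1, 49.8]   # e.g. throughput results (Mbps)
    outlier_system = [20.1, 22.4, 19.8]
    print("trigger validation:", should_trigger_validation(outlier_system, fleet))
```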
In block 1151, verification interactions between a reference device and a local test system are performed. The verification interactions are directed to verifying operations of the local test system. The verification interactions can be performed in response to an automatic triggering event. The verification interactions can include verifying communication signal characteristics, simulated component operations (e.g., simulated communication network components, other simulated components, etc.) and so on.
In block 1152, results of the verification interactions are reported. The results of the verification interactions can include indications of acceptability and problems with the operations being verified. The results can be reported to the local test system. In one exemplary implementation, the results are reported to a remote management system.
In block 1153, the local test system is calibrated. The calibration can be based upon the verification results. In one embodiment, the calibration is directed to correcting (e.g., adjusting signal power, frequency, signal shape, etc.) issues in the verification results. In one exemplary implementation, the calibration is performed in accordance with information received from a remote management system.
In one embodiment, the automated local test system configuration ensures that the user is able to easily set up and operate the local test system. The configuration automatically installs pre-requisite software, a test executive, test cases, a UE library and reporting components. Various simulated communication network components are automatically installed and configured (e.g., eNodeB, EPC, IMS server and other application servers for a given configuration, etc.). Software components can be automatically initialized. Validation and calibration can be performed, including using test case routines, to verify successful operation of the local test system. Again, it is appreciated that automated configuration of local test systems is user friendly and convenient, unlike traditional configuration approaches that typically involve significant manual interaction.
In one embodiment, a local test system and method automated configuration call flow includes an initialization process, retrieving information from a remote management system, performing a verification/calibration process, and so on. In one embodiment, an initialization process includes activating the local control component, receiving information from a remote management system, and installing and launching local control component modules based on the received information. Retrieving information from a remote management system can include: signing on/registering an account with the remote management system, downloading pre-requisites and configuration wizard information, installing virtual box applications (e.g., eNodeB, EPC, and other pre-requisites, etc.), automatically configuring the username/password, and setting the host and confirming the host status IP address. Installing and launching includes the QuiNS and Test Executive modules and simulated network components (e.g., EPC, network core components, etc.). An IMS and other application servers can be launched and verified. The eNodeB is powered up and QuiNS connects to the eNodeB and verifies successful connectivity.
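A minimal Python sketch of the overall ordering of such a call flow, with hypothetical step names standing in for the remote management, installation, and verification steps:

```python
# Minimal sketch (hypothetical step names) of an automated configuration call
# flow: initialize, retrieve remote information, install and launch, verify.
def initialize() -> str:
    # Stand-in for activating the local control component.
    return "local control component activated"


def retrieve_remote_info() -> dict:
    # Stand-in for signing on to the remote management system and downloading
    # pre-requisites and configuration wizard information.
    return {"account": "registered", "prerequisites": ["eNodeB", "EPC", "IMS"]}


def install_and_launch(info: dict) -> list:
    # Stand-in for installing and launching the downloaded modules.
    return [f"{item} installed and launched" for item in info["prerequisites"]]


def verify_connectivity() -> str:
    # Stand-in for powering up the simulated eNodeB and verifying connectivity.
    return "eNodeB connectivity verified"


def automated_configuration_call_flow() -> list:
    """Run the configuration steps in order and collect a simple log."""
    log = [initialize()]
    info = retrieve_remote_info()
    log.extend(install_and_launch(info))
    log.append(verify_connectivity())
    return log


if __name__ == "__main__":
    for entry in automated_configuration_call_flow():
        print(entry)
```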
In one embodiment, a local test system and method automated configuration includes a validation and self calibration process. The validation and self calibration process can include communicative coupling of a reference component to a local test system environment. In one embodiment, the reference component is similar to reference component 1050. The validation and calibration process can be initiated or started automatically or manually. The local test system initializes itself, including initialization of servers (e.g., IMS, FTP, etc.). The local test system can include a local test control component and a local test user equipment interface component. In one exemplary implementation, the local test system retrieves device profile information for a reference component or reference UE. The local test system reboots the reference component and monitors the network to verify successful reference component or device attachment. The IMS is monitored for successful registration. The local test system verifies several operations or activities. The operations and activities can include verification of successful mobile originated and mobile terminated SMS operations, successful data throughput operations, and various RF signal strengths. The verification can be repeated at multiple signal strength levels. Verification results can be recorded locally for each level. In one exemplary implementation, RF signal strength losses can be reported for each RF strength level. The results can also be reported externally or remotely. In one embodiment, the results can be reported to a remote management environment via a variety of mechanisms. Calibration can be performed based on the verification results. The calibration can include application of proper compensations or offsets to correct issues identified during verification.
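A minimal Python sketch, assuming a hypothetical measurement hook and illustrative levels, of repeating verification at multiple RF signal strength levels and recording the loss at each level:

```python
# Minimal sketch (hypothetical measurement hook) of repeating verification at
# multiple RF signal strength levels and recording the loss at each level.
def sweep_signal_levels(set_level, measure, levels_dbm=(-70, -85, -100, -110)):
    """Verify at each configured level and record the measured loss per level."""
    results = {}
    for level in levels_dbm:
        set_level(level)              # configure the simulated access point output
        measured = measure()          # reading reported via the reference component
        results[level] = {"measured_dbm": measured, "loss_db": level - measured}
    return results


if __name__ == "__main__":
    # Toy stand-ins: a fixed 3 dB path loss between simulator and reference component.
    current = {"level": None}
    set_level = lambda dbm: current.__setitem__("level", dbm)
    measure = lambda: current["level"] - 3.0
    for level, record in sweep_signal_levels(set_level, measure).items():
        print(level, record)
```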
While the local testing system and method operations are automated, it is appreciated local testing systems and methods can be readily adapted to or implement varying degrees of manual interaction. In one embodiment, a UE test or local test system validation can be triggered or initiated by manual inputs and the remaining associated test or validation operations performed automatically. In one exemplary implementation, a local test system includes a local user interface. The local user interface can include a presentation of testing information (e.g., local test system configuration information, UE device under test information, testing result information, result information associated with verification/calibration of the local test system, etc.). A reference component can include a user interface for conveying verification/calibration related information and receiving user input.
A remote management system can include a remote user interface. The remote user interface can include a presentation of testing information (e.g., local test system configuration information, UE device under test information, testing result information, result information associated with verification/calibration of the local test system, etc.). In one embodiment, automated aspects of the remote management system, the remote user interface, or combinations of both can remotely monitor or take remote control of local test system and method operations.
In a second test scenario, the local control component directs the simulated base station to simulate an IP connection loss. The UE device reacts to the IP connection loss and sends an indication to the local control component. The local control component monitors the UE device behavior.
In a third test scenario, the local control component directs the simulated base station to simulate a SMS wakeup. The UE device reacts to the SMS wakeup and connects to the IoT server. The local control component monitors the UE device behavior.
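A minimal Python sketch, assuming hypothetical interfaces, of scripting such a scenario in which the simulated base station injects an event and the UE device's reaction is monitored:

```python
# Minimal sketch (hypothetical interfaces) of scripting the test scenarios
# described above: the local control component directs the simulated base
# station to inject an event and then monitors the UE device's reaction.
import time


def run_scenario(base_station, ue_monitor, event: str, expected: str,
                 timeout_s: float = 30.0) -> bool:
    """Inject an event (e.g. 'ip_connection_loss', 'sms_wakeup') and wait for
    the expected UE behavior to be observed."""
    base_station.inject(event)
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if expected in ue_monitor.observed_behaviors():
            return True
        time.sleep(0.5)
    return False


if __name__ == "__main__":
    class FakeBaseStation:          # toy stand-in for the simulated base station
        def inject(self, event):
            print(f"simulated base station: injected {event}")

    class FakeUEMonitor:            # toy stand-in for UE behavior monitoring
        def observed_behaviors(self):
            return ["ip_loss_indication_sent", "connected_to_iot_server"]

    ok = run_scenario(FakeBaseStation(), FakeUEMonitor(),
                      "ip_connection_loss", "ip_loss_indication_sent")
    print("scenario passed:", ok)
```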
While users may have varying degrees of understanding of normal communication networks, the automated configuration capabilities and features of the local test systems and methods enable the test system and method configurations to be performed by users with little or no manual interfacing or understanding of how the test systems and methods themselves work. Unlike traditional systems that typically require very sophisticated users with a thorough understanding of the intricate internal workings of the numerous components in a traditional complex test system itself (in addition to the vast different types of complicated network communication architectures and protocols involved in the testing), local test systems and methods facilitate easy configuration from a user standpoint.
Many of the described examples and embodiments of the local flexible test systems and methods are described in terms of single complex communication architectures in order not to obfuscate the invention. It is appreciated that some embodiments of the local flexible test systems and methods can be readily expanded to handle much more complicated and complex testing communication architectures and environments. In one embodiment, multiple communication core architectures can be simulated. In one exemplary implementation, configuration of the local flexible test systems can be expanded to test network communications as user equipment travels from an EPC core network to a GSM network.
Automated testing is flexibly scalable to large numbers of different devices and can be accomplished more quickly, more systematically, and at less expense than, for example, manual testing. This in turn can increase test coverage, scalability and reliability while reducing time-to-market and the cost to both manufacturers and consumers.
Some portions of the detailed descriptions are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a computer memory. These descriptions and representations are the means generally used by those skilled in data processing arts to effectively convey the substance of their work to others skilled in the art. A procedure, logic block, process, etc., is here, and generally, conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps include physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic, optical, or quantum signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present application, discussions utilizing terms such as “processing”, “computing”, “calculating”, “determining”, “displaying” or the like, refer to the action and processes of a computer system, or similar processing device (e.g., an electrical, optical or quantum computing device) that manipulates and transforms data represented as physical (e.g., electronic) quantities. The terms refer to actions and processes of the processing devices that manipulate or transform physical quantities within a computer system's component (e.g., registers, memories, other such information storage, transmission or display devices, etc.) into other data similarly represented as physical quantities within other components.
The foregoing descriptions of specific embodiments of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as suited to the particular use contemplated. It is intended that the scope of the invention be defined by the Claims appended hereto and their equivalents. The listing of steps within method claims does not imply any particular order of performing the steps, unless explicitly stated in the claim.