Set top boxes under test

Information

  • Patent Grant
  • 10291959
  • Patent Number
    10,291,959
  • Date Filed
    Thursday, July 6, 2017
  • Date Issued
    Tuesday, May 14, 2019
Abstract
A system for testing multiple set top boxes independently and simultaneously using different types of device probes is disclosed. The system includes real-time, bi-directional/asynchronous communication and interaction between system components.
Description
TECHNICAL FIELD

The present invention is directed to a system for testing devices.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a high-level system architecture for testing devices, according to certain embodiments.



FIG. 2 illustrates some of the testing components and the interaction between the testing components, according to certain embodiments.



FIG. 3 illustrates a sample architecture that includes the testing components, according to certain embodiments.



FIG. 4 illustrates a set top box under test, according to certain embodiments.





DETAILED DESCRIPTION

Methods, systems, user interfaces, and other aspects of the invention are described. Reference will be made to certain embodiments of the invention, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with the embodiments, it will be understood that it is not intended to limit the invention to these particular embodiments alone. On the contrary, the invention is intended to cover alternatives, modifications and equivalents that are within the spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.


Moreover, in the following description, numerous specific details are set forth to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the invention may be practiced without these particular details. In other instances, methods, procedures, components, and networks that are well known to those of ordinary skill in the art are not described in detail to avoid obscuring aspects of the present invention.


According to certain embodiments, an innovative system can test a set of devices simultaneously. Further, such a testing system is capable of testing disparate devices simultaneously.


According to certain embodiments, such a testing system provides a separate set of interfaces for each device under test in the set of devices. Further, such a system is designed to be adaptive: it can be extended to test new devices, with corresponding new testing interfaces, without fundamentally changing the core architecture of the testing system. As a non-limiting example, the testing system includes a core testing subsystem with a user interface and asynchronous communication among the system components such that new types of devices and new tests can be added and executed in a seamless fashion.
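
Although the patent does not describe an implementation, the extensibility described above can be pictured as a small registry that maps device types to their test interfaces, so that a new device type is added by registering a new entry rather than by changing the core dispatch logic. The following Python sketch is illustrative only; all class and function names are assumptions, not taken from the patent.

    # Hypothetical sketch: new device types register their own test interfaces
    # without any change to the core dispatch logic. All names are illustrative.
    from typing import Callable, Dict

    DEVICE_REGISTRY: Dict[str, Callable[[], "DeviceTestInterface"]] = {}

    def register_device_type(device_type: str):
        """Register a test-interface factory for a device type."""
        def decorator(factory):
            DEVICE_REGISTRY[device_type] = factory
            return factory
        return decorator

    class DeviceTestInterface:
        def run_tests(self) -> dict:
            raise NotImplementedError

    @register_device_type("set_top_box")
    class SetTopBoxTests(DeviceTestInterface):
        def run_tests(self) -> dict:
            # HDMI, audio/video, IR and CATV coax checks would be invoked here.
            return {"hdmi": "pass", "ir": "pass"}

    @register_device_type("cable_modem")
    class CableModemTests(DeviceTestInterface):
        def run_tests(self) -> dict:
            # LAN/WAN/MoCA connectivity and speed checks would be invoked here.
            return {"lan": "pass", "wan": "pass"}

    def test_device(device_type: str) -> dict:
        """Core dispatch: look up the registered interface and run it."""
        return DEVICE_REGISTRY[device_type]().run_tests()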


According to certain embodiments, the user interface can communicate through web sockets with the universal tester. Such communication is in real-time, bi-directional and asynchronous so that the user can control and monitor the testing of multiple devices simultaneously and independently of each other using the same universal tester.


According to certain embodiments, the testing system is capable of testing a set of similar types of devices or a set of disparate devices.


According to certain embodiments, a testing solution system can be a three-layer implementation. The number of layers may vary from implementation to implementation. FIG. 1 illustrates a high-level system architecture for testing devices, according to certain embodiments. FIG. 1 shows a test bench browser interface 102 that is in communication with a web-socket 104, which is, in turn, in communication with a core testing processor 106. According to certain embodiments, the communication between the test bench browser 102, web-socket 104 and core testing processor 106 can be TCP/IP communication. As a non-limiting example, the web browser is used as a user interface that communicates through web-sockets with the core testing processor. As a non-limiting example, communication may be in the form of JSON messages using the TCP/IP protocol, according to certain embodiments. JSON (JavaScript Object Notation) is a lightweight format for transmitting data between the server and web applications.
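
As a purely illustrative sketch (the patent does not define a message schema), a JSON status message exchanged between the core testing processor and the browser over a web socket might resemble the following; all field names are assumptions.

    import json

    # Hypothetical shape of a status message pushed from the core testing processor
    # to the browser dashboard over a web socket; field names are illustrative only.
    status_update = {
        "slot": 3,
        "dut_serial": "SN-0012345",
        "test": "LAN Ethernet to WAN Ethernet Speed Test",
        "progress_pct": 62,
        "result": None,   # filled in with "pass"/"fail" when the test completes
    }

    payload = json.dumps(status_update)   # what travels over the web socket (TCP/IP)
    echoed = json.loads(payload)          # what the browser-side dashboard would parse
    assert echoed["slot"] == 3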



FIG. 2 illustrates some of the testing components and the interaction between the testing components, according to certain embodiments. FIG. 2 shows a user interface 202, web-sockets 204, a core testing processor 206, a database 208, test configuration modules 210, test environmental modules 212, a plurality of probes (214, 216, 218) to connect the devices under test (DUT) to the core testing processor 206, and a speed test module 220, according to certain embodiments. Speed testing is used for evaluating the performance and accessibility of the Wifi and other media network connections of the device under test. FIG. 2 shows, as non-limiting examples, a Wifi probe 214, an Ethernet local area network (LAN) probe 216 and a MoCA probe 218. More generally, according to certain embodiments, various probes can be included, such as a wireless local area network (WLAN) probe, an Ethernet wide area network (WAN) probe, a multimedia over coax alliance (MoCA) WAN probe, a MoCA LAN probe and a wireless probe via antenna. According to certain embodiments, servers and other components in the testing system may be distributed over a plurality of computers.


According to certain embodiments, core testing processor 206 loads and reads files from test configuration modules 210 and test environmental modules 212 to initialize various components of the testing system. When the system is ready to begin testing after the initialization process, the system notifies a user that is using the testing system to test one or more devices (DUTs) of the readiness of the testing system. The user installs each device (DUT) of the set of DUTs that are to be tested on the test bench, and the serial number of each DUT is scanned. The core testing processor 206 receives the serial number information of each DUT and, using the serial number, retrieves further information associated with each DUT from database 208, according to certain embodiments. The core testing processor 206 dynamically loads test configuration information 210 and test environment information 212 based on device information such as make, model, etc., of a given DUT. After the test configuration and test environment information are loaded, the core testing processor 206 begins executing the various tests corresponding to each DUT so that the set of DUTs can be tested simultaneously. Each test may correspond to underlying testing modules associated with the Wifi, LAN, WAN or MoCA interfaces of the DUT, and such modules can be executed locally, remotely or at the device.
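
A minimal sketch of this serial-number-driven lookup, assuming a SQL table of DUT records and a per-make/model file-naming convention (both are assumptions for illustration; the patent specifies neither), could look like this:

    import sqlite3

    # Illustrative only: schema, sample row, and file-naming convention are assumptions.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE dut (serial TEXT PRIMARY KEY, make TEXT, model TEXT)")
    db.execute("INSERT INTO dut VALUES ('SN-0012345', 'AcmeCo', 'STB-9000')")

    def load_test_plan(scanned_serial: str) -> dict:
        """Map a scanned serial number to the configuration files for its make/model."""
        row = db.execute(
            "SELECT make, model FROM dut WHERE serial = ?", (scanned_serial,)
        ).fetchone()
        if row is None:
            raise LookupError(f"Unknown DUT serial {scanned_serial!r}")
        make, model = row
        return {
            "test_config": f"config/{make}_{model}_tests.xml",
            "environment": f"config/{make}_{model}_env.xml",
        }

    print(load_test_plan("SN-0012345"))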


According to certain embodiments, the test configuration information identifies the test modules and corresponding testing scripts that are to be executed by the core testing processor 206 at run time. The core testing processor 206 also provides the test results and other feedback information to the user via the browser user interface 202 and web sockets 204. Further, the user can send user input and requests to the system through the browser user interface 202 and web sockets 204.


According to certain embodiments, core testing processor 206 determines the success or failure of a given test based on the test configuration parameters and output results of the testing. Further, upon failure of a given test, core testing processor 206 may continue further testing or halt test execution based on test configuration parameters, according to certain embodiments.
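
A hedged sketch of how such configuration-driven pass/fail evaluation and halt-on-failure behavior might be expressed (the parameter names and thresholds are assumptions, not taken from the patent):

    # Illustrative only: configuration keys and thresholds are assumptions.
    def evaluate_step(measured_mbps: float, step_config: dict) -> str:
        """Compare a measured result against the configured threshold."""
        return "pass" if measured_mbps >= step_config["min_mbps"] else "fail"

    def run_sequence(steps, config):
        """Run steps in order; optionally halt after a failure, per configuration."""
        for name, measured in steps:
            outcome = evaluate_step(measured, config[name])
            yield name, outcome
            if outcome == "fail" and config[name].get("halt_on_failure", False):
                break   # stop further testing of this DUT

    config = {
        "lan_speed": {"min_mbps": 700, "halt_on_failure": False},
        "moca_rate": {"min_mbps": 180, "halt_on_failure": True},
    }
    results = list(run_sequence([("lan_speed", 940.0), ("moca_rate", 150.0)], config))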


Upon completion of the relevant tests, a success message can be sent to the user via the browser user interface 202 and web sockets 204. Even though the DUTs in the set of DUTs are tested simultaneously, the user does not have to wait until testing of all the DUTs in the set has completed before installing other devices that need testing. Further, the testing of the devices need not be started at the same time. As soon as testing is completed for a given DUT, the tested DUT may be uninstalled from its slot in the test bench and a new DUT can be installed in its slot so that testing can begin for the newly installed device.


According to certain embodiments, the test results can be stored locally and/or pushed to the cloud so that the results can be viewed remotely from any location. Further, the test results can be aggregated. According to certain embodiments, aggregated data includes data combined from several data measurements. Summary reports can be generated from such aggregated data. Non-limiting examples of summary reports include charts and graphs that display information on all the DUTs or at least a subset of the DUTs. Thus, the summary reports generated from the aggregated data can provide an overview of the testing information and characteristics of the DUTs. The aggregated data can reveal trends and other related information associated with the DUTs. Further, the aggregated data can include user-level data, access account activity, etc. According to certain embodiments, the testing system includes a billing system to charge for the testing services for each device.
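
As one simple illustration of aggregation (the record layout is an assumption), per-DUT results could be combined into a per-model summary from which summary reports, charts and graphs are generated:

    from collections import Counter

    # Illustrative per-DUT result records; the fields are assumptions.
    results = [
        {"model": "STB-9000", "outcome": "pass"},
        {"model": "STB-9000", "outcome": "fail"},
        {"model": "CM-200", "outcome": "pass"},
    ]

    # Aggregate pass/fail counts per model; a summary report or chart could be
    # generated from these counts.
    summary = Counter((r["model"], r["outcome"]) for r in results)
    for (model, outcome), count in sorted(summary.items()):
        print(f"{model:10s} {outcome:4s} {count}")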



FIG. 3 illustrates a sample architecture that includes the testing components of a universal tester, according to certain embodiments. FIG. 3 shows a browser user interface or operator dashboard 302, a test controller 304, a universal tester 306 and a device under test (DUT) 308. There may be multiple devices under testing simultaneously but only one device under test is shown for convenience in FIG. 3.


According to certain embodiments, browser user interface or operator dashboard 302 may include information 310 associated with each device under test. The information 310 can include DUT serial number 311 and testing progress information 312. Browser user interface or operator dashboard 302 may also include user command function buttons 314 and drop down menus (not shown in FIG. 3). According to certain embodiments, the user can configure slot details (e.g., port numbers, IP address for the slot, etc.) and testing preferences such as push to cloud, export to billing, etc.


According to certain embodiments, test controller 304 may include a universal tester webserver 316 that is in communication (e.g., TCP/IP) with a universal tester database 318. A billing process within the controller (not shown in FIG. 3) may be in communication with a billing service or application (not shown in FIG. 3). As a non-limiting example, database 318 can be a SQL database. Database 318 can store information associated with each slot in the test bench. As non-limiting examples, database 318 can store for each slot, test details, test history, test logs, DUT information (e.g., DUT serial number, model name, etc), testing preferences/configuration, user interface details/preferences/configuration, billing information, cloud push information, MSO/customer information (media subscriber organization), OEM (original equipment manufacturer) information, slot information, user information, and any persistent data needed by the universal device testing system for running tests.


According to certain embodiments, universal tester 306 may include web sockets 320 that are in communication (e.g., TCP/IP) with browser user interface or operator dashboard 302 and core testing processor 324. According to certain embodiments, core testing processor 324 is in communication with test controller 304 (e.g., TCP/IP) and in communication (e.g., Telnet/SSH secure shell) with probes/containers (328, 330, . . . , 332, 334). Core testing processor 324 is also in communication with configuration modules 322 (e.g., testing and environment configuration). Non-limiting examples of probes include Wifi probe 328, LAN probe 330, MoCA probe 332 and WAN probe 334. There may be other types of probes including MoCA WAN probe, MoCA LAN probe and other types of wireless probes besides Wifi probes depending on the characteristics of the device being tested.


According to certain embodiments, Wifi probe 328, LAN probe 330, MoCA probe 332 and WAN probe 334 communicate with the respective device under test through the relevant ports on the device such as Wifi port 336, LAN port 338, MoCA port 340 and WAN port 342. Core testing processor 324 executes the relevant configured tests for the respective DUT. Status and test results can be sent to the user's dashboard (using JSON format messages as a non-limiting example) via the web-sockets.


Non-limiting examples of devices under test (DUTs) include set top boxes, cable modems, embedded multimedia terminal adapters, and wireless routers including broadband wireless routers for the home or for commercial networks.



FIG. 4 illustrates a testing architecture for a set top box under test, according to certain embodiments. As previously explained, multiple similar or disparate devices can be tested simultaneously and independently of each other using the same universal tester. Thus, multiple set top boxes can be tested simultaneously and independently of each other using the same universal tester, along with other types of devices using the same universal tester. For purposes of simplicity only one set top box is shown in FIG. 4. FIG. 4 shows a universal tester 404 and set top box 402, which is the device under test for this specific case. Universal tester 404 includes a plurality of virtualization containers (probes) for communicating with corresponding interfaces of set top box 402. For example, the core testing processor of the universal tester (as described herein) uses the HDMI (high definition multimedia interface) probe/container 406b to test the HDMI interface 406a of set top box 402. Similarly, audio/video probe/container 408b can be used to test the audio/video interface 408a of set top box 402. Another audio/video probe/container 410b can be used to test the Coax TV Output interface 410a of set top box 402. IR (infra red) probe/container 412b can be used to test the IR interface 412a of set top box 402. CATV coax probe/container 416b can be used to test the Coax interface 416a of set top box 402. The associated core testing processor executes the relevant configured tests for the set top box 402. Status and test results can be sent to the user's dashboard (using JSON format messages as a non-limiting example) via the web-sockets.


According to certain embodiments, when executing a specific test for a given DUT, the core testing processor loads and reads test configuration information (for example from an XML structure) and identifies the relevant test script that needs to be executed. Inputs needed for executing the relevant test script are retrieved and supplied to that script. The following is a non-limiting sample script outline; a short sketch of how one such step might be scripted appears after the outline.

  • Create DUT object & Environment Object
  • Verify Serial Number
  • Verify Warranty
  • Check Report Server
  • Check DUT Staging


Checks for DUT Serial number in Database or Webservice

  • Get DUT Readiness Information


Checks Webservice for test readiness status of DUT in the test process

  • Configure Container Environment
  • Clear Environment Temp Files
  • Analyze DUT for Factory Reset


Checks ability to login to DUT


Asks operator to manually Factory Reset if unable to login

  • Confirm Factory Reset (if needed)


Waits for operator to confirm that DUT was factory reset and booted up properly

  • Check Ethernet LAN connections to DUT


Ping connections: Eth LAN 1, 2, 3, 4


Fails if any ping to these connections fail

  • Detect DUT


Checks connection to DUT through socket connection

  • Reset Password


Operator scans password which is stored temporarily for use in the remainder of test until finished

  • Login to GUI


Done through web-scraping

  • Get DUT Information and compare values


Information retrieved through web-scraping

  • Enable Telnet


Enables telnet on DUT through web-scraping

  • Factory Reset


Factory resets DUT through telnet command

  • Enable Telnet after Factory Reset


Enables telnet on DUT through web-scraping

  • Confirm Power, WAN Ethernet, and Internet LEDs
  • Confirm all LAN Ethernet LEDs
  • Confirm WiFi LED
  • Configure Wireless Network


Through telnet commands


Sets N Mode


Enables Privacy


Sets WPA (Wi-Fi Protected Access)


Removes WEP (Wired Equivalent Privacy)


Assigns WiFi Channel to DUT (channel different by slot)


[Channel 1: slots 1, 4, 7, 10, 13, 16]


[Channel 6: slots 2, 5, 8, 11, 14]


[Channel 11: slots 3, 6, 9, 12, 15]


Verifies changes through GUI


Disables WiFi once done through telnet

  • Check Firmware Version and Upgrade Firmware (if needed)


Firmware version: 40.21.18

  • Cage Closed Confirmation Check


Asks Operator to Close Door on Cage

  • Connect Wireless Card


Waits on shared Resource Server (located on TC) for Resource L2 (Layer 2) Lock

    • Lock waiting timeout: 600 sec
    • All L2 Locks are able to run in parallel but not when any L3 (Layer 3) Lock is running


Obtains Lock


Enables WiFi through telnet


Set WiFi Card

    • Total Retries allowed: 6 (2 sets of 3 retries)


Ping WiFi from DUT


L2 ARP Test on WiFi: must receive 10/10 ARP packets

    • Total Retries allowed: 6 (2 sets of 3 retries)


If either Set WiFi Card or L2 ARP Test Fail after its 3 retries, Ask Operator to Check Antennas


Performs one more retry in full (set of 3 retries each for Set WiFi Card and L2 ARP Wifi Test) after Check Antennas


Disables WiFi through telnet


Releases Lock

  • Wireless to LAN Ethernet Speed Test


Waits on shared Resource Server (located on TC) for Resource L3 Lock

    • Lock waiting timeout: 1800 sec
    • L3 Locks must be run one at a time and when no L2 Lock is running


Obtains Lock


Enables WiFi through telnet


Connects WiFi Card


Iperf3 Speed Test, 5 seconds for UDP Speed Test, 7 seconds for TCP Speed Test, Sending 200 Mbps Bandwidth


Bandwidth must be greater than 60 Mbps on TCP (Reverse) or 70 Mbps on UDP (Forward)

    • If Fail after 2 retries, ask operator to Check Antennas
    • Retries up to 2 times more if still Fail
    • Therefore, Total Retries allowed: 4 (2 sets of 2 retries)


Runs sudo iwlist wlan0 scan and returns all Wireless Signals seen

    • Results parsed to print all visible SSIDs and their matching Signal levels


Disables WiFi through telnet


Releases Lock

  • Confirm WPS LED
  • Confirm LAN Coax LED
  • Confirm USB 1+2 LEDs
  • Configure WAN MoCA
  • Confirm WAN Coax LED
  • Ping WAN MoCA
  • L2 Test on LAN Ethernet


Arp Test from Eth LAN 1 to Eth LAN 2, 3, 4


Must receive 10/10 on all LAN connections

  • LAN Ethernet to LAN Ethernet Speed Test


From Eth LAN 1 to Eth LAN 2, 3, 4


Iperf3 Speed Test, 5 seconds Reverse and Forward, Sending 1200 Mbps Bandwidth


Bandwidth must be greater than 700 Mbps


Total Retries allowed: 2

  • Check WAN and LAN MoCA Data Rates


Rx and Tx Data rates for both WAN and LAN MoCA retrieved through telnet


All Rates must be greater than 180 Mbps

  • LAN Ethernet to WAN MoCA FTP Speed Test


From Eth LAN 1 to WAN MoCA


Iperf3 Speed Test, 5 seconds Reverse and Forward, Sending 1200 Mbps Bandwidth


Bandwidth must be greater than 60 Mbps


Total Retries allowed: 2

  • LAN MoCA to LAN Ethernet FTP Speed Test


From Eth LAN 1 to LAN MoCA


Iperf3 Speed Test, 5 seconds Reverse and Forward, Sending 240 Mbps Bandwidth


Bandwidth must be greater than 60 Mbps


Total Retries allowed: 2

  • LAN MoCA to WAN MoCA FTP Speed Test


From LAN MoCA to WAN MoCA


Iperf3 Speed Test, 5 seconds Reverse and Forward, Sending 240 Mbps Bandwidth


Bandwidth must be greater than 60 Mbps


Total Retries allowed: 2

  • Enable WAN Ethernet


Through telnet command

  • LAN Ethernet to WAN Ethernet FTP Speed Test


From Eth LAN 1 to Eth WAN


Iperf3 Speed Test, 5 seconds Reverse and Forward, Sending 1200 Mbps Bandwidth


Bandwidth must be greater than 700 Mbps


Total Retries allowed: 2

  • Clear Persistent Logs
  • Final Factory Restore
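
The sketch below illustrates how one step of the outline above, such as the wireless-to-LAN iperf3 speed test, might be scripted. It assumes the iperf3 client is installed and an iperf3 server is reachable on the probe side; the JSON field paths follow iperf3's -J output and the 60 Mbps threshold mirrors the figure in the outline, but none of this is prescribed by the patent.

    import json
    import subprocess

    def wifi_tcp_speed_test(server_ip: str, seconds: int = 7, min_mbps: float = 60.0) -> bool:
        """Run a reverse-direction TCP iperf3 test and compare against the threshold."""
        cmd = ["iperf3", "-c", server_ip, "-t", str(seconds), "-R", "-J"]
        out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
        report = json.loads(out)
        # For TCP runs, iperf3's JSON report carries the received throughput here.
        bits_per_second = report["end"]["sum_received"]["bits_per_second"]
        return bits_per_second / 1e6 >= min_mbps

    # Example use (hypothetical probe-side server address):
    # passed = wifi_tcp_speed_test("192.168.1.10")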


According to certain embodiments, the core testing processor uses a reflection and command design pattern to invoke the relevant configured script(s) corresponding to each DUT being tested. For example, in the command design pattern, one or more of the following are encapsulated in a command object: the receiver object, the method name, and the arguments. According to certain embodiments, the core testing processor uses the Python “reflection” capability to execute the relevant test scripts for a given DUT. The core testing processor is agnostic of the inner workings of the relevant test scripts for a given DUT.
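
A minimal sketch of this idea, assuming a configuration entry that names a module, a callable, and its arguments (all names below are illustrative, not from the patent), follows:

    import importlib

    class TestCommand:
        """Command object: encapsulates what to call and with which arguments."""
        def __init__(self, module_name: str, func_name: str, kwargs: dict):
            self.module_name = module_name
            self.func_name = func_name
            self.kwargs = kwargs

        def execute(self):
            module = importlib.import_module(self.module_name)   # reflection: load by name
            func = getattr(module, self.func_name)               # reflection: resolve by name
            return func(**self.kwargs)                           # deferred invocation

    # e.g. built from a hypothetical configuration entry such as
    # <test module="tests.wifi" function="l2_arp_test" packets="10"/>
    command = TestCommand("tests.wifi", "l2_arp_test", {"packets": 10})
    # result = command.execute()   # the core processor stays agnostic of the script internals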


According to certain embodiments, lightweight software containers (virtualization containers) are used to abstract the connection of probes to the different DUT interfaces in order to avoid conflicts. Non-limiting examples of virtualization containers are Linux containers. As a non-limiting example, a Linux container is an operating-system-level virtualization environment for running multiple isolated Linux systems (containers) on a single Linux control host. In other words, lightweight virtualization containers are used to ensure isolation across servers. By using virtualization containers, resources can be isolated, services restricted, and processes provisioned to have an almost completely private view of the operating system, with their own process ID space, file system structure, and network interfaces. Multiple virtualization containers share the same kernel, but each virtualization container can be constrained to use only a defined amount of resources such as CPU, memory and I/O. The relevant test script might need to connect to the DUT interfaces directly or through the virtualization containers to execute the tests. The core testing processor receives the test results from running the relevant test scripts. The core testing processor can further process and interpret such results and can also send the results to the user's browser via web sockets. According to certain embodiments, the respective core testing processors are in communication (e.g., Telnet/SSH secure shell) with the virtualization containers (there may be multiple virtualization containers). The virtualization containers (probes) are in communication with corresponding DUT interfaces using Telnet/SSH/TCP/UDP/HTTP/HTTPS etc., as non-limiting examples.
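
As a non-authoritative sketch of how a test script might run a command inside one of these probe containers, the following assumes LXC's lxc-attach utility is available and uses a made-up container name; the patent does not mandate any particular container tooling.

    import subprocess

    def run_in_probe(container: str, command: list[str]) -> str:
        """Execute a command inside a named probe container and return its output."""
        result = subprocess.run(
            ["lxc-attach", "--name", container, "--"] + command,
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    # e.g. ping the DUT's LAN interface from inside an isolated probe container
    # (container name and address are hypothetical):
    # output = run_in_probe("lan-probe-slot3", ["ping", "-c", "4", "192.168.1.254"])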


According to certain embodiments, a system for testing a plurality of devices comprises: a universal tester; at least one test controller; a plurality of sets of testing probes; and a plurality of web sockets; wherein:


the plurality of devices includes a plurality of set top boxes;


the universal tester is enabled for communication with a platform independent user interface through the plurality of web sockets;


the plurality of sets of testing probes comprising:

    • at least one HDMI probe for testing a corresponding HDMI interface of a set top box of the plurality of set top boxes;
    • at least one audio video probe for testing a corresponding audio video interface of the set top box of the plurality of set top boxes;
    • at least one audio video probe for testing a corresponding coax TV output interface of the set top box of the plurality of set top boxes;
    • at least one IR probe for testing a corresponding IR interface of the set top box of the plurality of set top boxes;
    • at least one CATV coax probe for testing a corresponding coax interface of the set top box of the plurality of set top boxes; and


the plurality of web sockets enable real-time bi-directional and asynchronous communication between the platform independent user interface and the universal tester for simultaneously testing the plurality of devices under test by the universal tester.


According to certain embodiments, the real-time bi-directional and asynchronous communication of the plurality of web sockets enables a user to control the testing of the plurality of devices simultaneously but asynchronously and independently of each other using the universal tester.


According to certain embodiments, the plurality of devices installed in the universal tester for purposes of simultaneous testing comprise a set of disparate devices.


According to certain embodiments, the plurality of devices installed in the universal tester for purposes of simultaneous testing comprise a set of similar devices.


According to certain embodiments, the testing system is adaptable to augmenting the test controller, the plurality of web sockets, the user interface and the plurality of sets of testing probes to accommodate testing of new types of devices.


In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A test system for simultaneously testing a plurality of devices, comprising: a universal tester having a plurality of sets of testing probes, each set of testing probes of the plurality of sets of testing probes configured to communicate with corresponding ports of a respective device under test (DUT) of a plurality of devices under test (DUTs) when the DUT is connected to the universal tester for testing; an operator dashboard for simultaneously displaying an information window associated with each DUT of the plurality of DUTs; and a plurality of web sockets, each web socket of the plurality of web sockets associated with a DUT of the plurality of DUTs and configured to enable real-time bi-directional and asynchronous communication between the corresponding information window on the operator dashboard and the universal tester for simultaneously testing the plurality of DUTs and displaying the information associated with each DUT of the plurality of DUTs.
  • 2. The test system of claim 1, wherein each set of testing probes of the plurality of sets of testing probes comprises at least one of a local area network (LAN) probe, a multimedia over coax alliance (MoCA) probe, and an Ethernet wide area network (WAN) probe.
  • 3. The test system of claim 1, further comprising a plurality of test configuration modules operatively connected to the universal tester, the plurality of test configuration modules containing test configuration information for a plurality of different makes and models of DUTs.
  • 4. The test system of claim 1, further comprising a plurality of test environmental modules operatively connected to the universal tester, the plurality of test environmental modules containing test environment information for a plurality of different makes and models of DUTs.
  • 5. The test system of claim 1, further comprising at least one speed test module operatively connected to the universal tester, the at least one speed test module configured to evaluate media network connections and accessibility of each device under test (DUT).
  • 6. The test system of claim 1, further comprising a core testing processor configured to read serial number information associated with each DUT, and based on the serial number read from each DUT, selectively retrieve make and model information associated with each read DUT, the make and model information stored in a database operatively connected to the core testing processor.
  • 7. The test system of claim 6, wherein the core testing processor is further configured to determine success or failure of a test of each DUT based on test configuration parameters and output results of testing each DUT.
  • 8. The test system of claim 6, wherein the core testing processor is further configured to halt test execution based on test configuration parameters.
  • 9. The test system of claim 6, wherein the core testing processor is further configured to send a success message to a user via at least one of the plurality of web sockets.
  • 10. The test system of claim 9, further comprising a web server operatively connected to the core testing processor, the web server configured to transmit the success message to a user.
  • 11. The test system of claim 1, wherein the displayed information associated with each DUT comprises serial number information communicated asynchronously to the operator dashboard via the web socket associated with each DUT.
  • 12. The test system of claim 1, wherein the displayed information associated with each DUT comprises testing progress information communicated asynchronously to the operator dashboard via the web socket associated with each DUT.
  • 13. The test system of claim 1, wherein the operator dashboard displays drop down menus for selecting commands sent to, and displaying information received from, each DUT via the web socket associated with each DUT.
  • 14. The test system of claim 1, wherein the operator dashboard displays user command function buttons for selecting commands sent to each DUT via the web socket associated with each DUT.
  • 15. The test system of claim 13, wherein the user command function buttons allow a user to configure testing preferences.
  • 16. The test system of claim 1, wherein the real-time bidirectional and asynchronous communication between the operator dashboard and the universal tester employs web sockets using JSON format messages.
  • 17. A test system for simultaneously testing a plurality of devices, comprising: a universal tester having a plurality of sets of testing probes, each set of testing probes of the plurality of sets of testing probes configured to communicate with corresponding ports of a respective device under test (DUT) of a plurality of devices under test (DUTs) when the DUT is connected to the tester for testing; a core testing processor configured to read serial number information associated with each DUT, and based on the serial number read from each DUT, selectively retrieve make and model information associated with each read DUT, the make and model information stored in a database operatively connected to the core testing processor; a plurality of test configuration modules operatively connected to the universal tester, the plurality of test configuration modules containing test configuration information for a plurality of different makes and models of DUTs; an operator dashboard for simultaneously displaying an information window associated with each DUT of the plurality of DUTs during the testing of each DUT; and a plurality of web sockets, each web socket of the plurality of web sockets associated with a DUT of the plurality of DUTs and configured to enable real-time bi-directional and asynchronous communication between the corresponding information window on the operator dashboard and the universal tester for simultaneously testing the plurality of DUTs and displaying the information associated with each DUT, wherein the operator dashboard displays menus for selecting commands sent to, and displaying information received from, each DUT via the web socket associated with each DUT.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of International Patent Application No. PCT/US16/53768, filed Sep. 26, 2016, which is a continuation-in-part of U.S. patent application Ser. No. 14/866,630, filed Sep. 25, 2015; a continuation-in-part of U.S. patent application Ser. No. 14/866,720, filed Sep. 25, 2015; a continuation-in-part of U.S. patent application Ser. No. 14/866,752, filed Sep. 25, 2015, a continuation-in-part of U.S. patent application Ser. No. 14/866,780, filed Sep. 25, 2015, now U.S. Pat. No. 9,491,454 issued on Nov. 8, 2016; a continuation-in-part of U.S. patent application Ser. No. 14/948,143, filed Nov. 20, 2015; a continuation-in-part of U.S. patent application Ser. No. 14/948,925, filed Nov. 23, 2015; and a continuation-in-part of U.S. patent application Ser. No. 14/987,538, filed Jan. 4, 2016. Applicant claims priority to each of those applications. The disclosure of only U.S. patent application Ser. No. 14/866,780 is hereby incorporated by reference herein, in its entirety.

US Referenced Citations (160)
Number Name Date Kind
5005197 Parsons Apr 1991 A
5897609 Choi et al. Apr 1999 A
5910977 Torregrossa Jun 1999 A
5917808 Koshbab Jun 1999 A
6088582 Canora et al. Jul 2000 A
6308496 Lee Oct 2001 B1
6367032 Kasahara Apr 2002 B1
6662135 Burns Dec 2003 B1
6671160 Hayden Dec 2003 B2
6826512 Dara-Abrams Nov 2004 B2
6859043 Ewing Feb 2005 B2
7068757 Burnett Jun 2006 B1
7254755 De Obaldia et al. Aug 2007 B2
7664317 Sowerby Feb 2010 B1
7809517 Zuckerman Oct 2010 B1
8121028 Schlesener Feb 2012 B1
8209732 Le Jun 2012 B2
8229344 Petersen Jul 2012 B1
8324909 Oakes Dec 2012 B2
8418000 Salame Apr 2013 B1
8418219 Parsons Apr 2013 B1
8515015 Maffre Aug 2013 B2
8689071 Valakh Apr 2014 B2
8806400 Bhawmik Aug 2014 B1
9013307 Hussain Apr 2015 B2
9270983 Hare, Jr. Feb 2016 B1
9316714 Rada Apr 2016 B2
9319908 Nickel Apr 2016 B2
9372228 Nickel Jun 2016 B2
9402601 Berger Aug 2016 B1
9490920 Parte Nov 2016 B2
9491454 Kumar Nov 2016 B1
9571211 Partee Feb 2017 B2
9602556 Cham Mar 2017 B1
9609063 Zhu et al. Mar 2017 B2
9810735 Kumar et al. Nov 2017 B2
9838295 Kumar et al. Dec 2017 B2
9900113 Kumar et al. Feb 2018 B2
9900116 Kumar et al. Feb 2018 B2
9960989 Kumar et al. May 2018 B2
9992084 Kumar et al. Jun 2018 B2
10116397 Kumar et al. Oct 2018 B2
10122611 Kumar et al. Nov 2018 B2
10158553 Tiwari et al. Dec 2018 B2
10230617 Kumar et al. Mar 2019 B2
20020070725 Hilliges Jun 2002 A1
20020077786 Vogel et al. Jun 2002 A1
20030005380 Nguyen Jan 2003 A1
20030184035 Yu Oct 2003 A1
20040010584 Peterson Jan 2004 A1
20040016708 Rafferty Jan 2004 A1
20040160226 Ewing Aug 2004 A1
20040189281 Le et al. Sep 2004 A1
20040203726 Wei Oct 2004 A1
20050041642 Robinson Feb 2005 A1
20050053008 Griesing Mar 2005 A1
20050102488 Bullis May 2005 A1
20050193294 Hildebrant Sep 2005 A1
20050249196 Ansari et al. Nov 2005 A1
20050286466 Tagg Dec 2005 A1
20060015785 Chun Jan 2006 A1
20060271322 Haggerty Nov 2006 A1
20070097659 Behrens May 2007 A1
20070220380 Ohanyan Sep 2007 A1
20080026748 Alexander et al. Jan 2008 A1
20080117907 Hein May 2008 A1
20080144293 Aksamit Jun 2008 A1
20080159737 Noble et al. Jul 2008 A1
20080168520 Vanderhoff Jul 2008 A1
20080274712 Rofougaran Nov 2008 A1
20080315898 Cannon Dec 2008 A1
20090059933 Huang Mar 2009 A1
20090089854 Le Apr 2009 A1
20090213738 Volpe et al. Aug 2009 A1
20090282446 Breed Nov 2009 A1
20090282455 Bell Nov 2009 A1
20090289020 Wurmhoringer Nov 2009 A1
20100132000 Straub May 2010 A1
20100138823 Thornley Jun 2010 A1
20100246416 Sinha Sep 2010 A1
20100281107 Fallows et al. Nov 2010 A1
20110001833 Grinkemeyer Jan 2011 A1
20110006794 Sellathamby Jan 2011 A1
20110012632 Merrow Jan 2011 A1
20110035676 Tischer Feb 2011 A1
20110072306 Racey Mar 2011 A1
20110090075 Armitage et al. Apr 2011 A1
20110099424 Rivera Trevino Apr 2011 A1
20110107074 Chan et al. May 2011 A1
20110116419 Cholas May 2011 A1
20110149720 Phuah et al. Jun 2011 A1
20110222549 Connelly Sep 2011 A1
20110267782 Petrick Nov 2011 A1
20110306306 Reed Dec 2011 A1
20120140081 Clements Jan 2012 A1
20120122406 Gregg et al. May 2012 A1
20120163227 Kannan Jun 2012 A1
20120198084 Keskitalo Aug 2012 A1
20120198442 Kashyap Aug 2012 A1
20120213259 Renken et al. Aug 2012 A1
20120220240 Rothschild Aug 2012 A1
20120275784 Soto Nov 2012 A1
20120278826 Jones Nov 2012 A1
20120306895 Faulkner et al. Dec 2012 A1
20130033279 Sozanski Feb 2013 A1
20130049794 Humphrey Feb 2013 A1
20130076217 Thompson Mar 2013 A1
20130093447 Nickel Apr 2013 A1
20130104158 Partee Apr 2013 A1
20130160064 Van Rozen Jun 2013 A1
20130167123 Dura Jun 2013 A1
20130257468 Mlinarsky Oct 2013 A1
20130305091 Stan et al. Nov 2013 A1
20140047322 Kim Feb 2014 A1
20140091874 Cook et al. Apr 2014 A1
20140115580 Kellerman Apr 2014 A1
20140123200 Park May 2014 A1
20140126387 Gintis May 2014 A1
20140156819 Cavgalar Jun 2014 A1
20140187172 Partee Jul 2014 A1
20140187173 Partee Jul 2014 A1
20140207404 Fritzsche Jul 2014 A1
20140256373 Hernandez Sep 2014 A1
20140266930 Huynh Sep 2014 A1
20140269386 Chu Sep 2014 A1
20140269871 Huynh Sep 2014 A1
20140282783 Totten Sep 2014 A1
20140370821 Guterman Dec 2014 A1
20150024720 Efrati Jan 2015 A1
20150093987 Ouyang Apr 2015 A1
20150109941 Zhang Apr 2015 A1
20150151669 Meisner Jun 2015 A1
20150180743 Jana et al. Jun 2015 A1
20150226716 Nelson Aug 2015 A1
20150237010 Roskind Aug 2015 A1
20150253357 Olgaard Sep 2015 A1
20150288589 Radford et al. Oct 2015 A1
20150369851 Even Dec 2015 A1
20160080241 Rocha De Maria Mar 2016 A1
20160102951 Cole Apr 2016 A1
20160191364 Ajitomi Jun 2016 A1
20160381818 Mills Dec 2016 A1
20170048519 Friel Feb 2017 A1
20170089981 Kumar Mar 2017 A1
20170093682 Kumar Mar 2017 A1
20170093683 Kumar Mar 2017 A1
20170126536 Kumar May 2017 A1
20170126537 Kumar May 2017 A1
20170126539 Tiwari May 2017 A1
20170149635 Kumar May 2017 A1
20170149645 Kumar May 2017 A1
20170195071 Kumar Jul 2017 A1
20170250762 Kumar et al. Aug 2017 A1
20170288791 Kumar et al. Oct 2017 A1
20170288993 Kumar et al. Oct 2017 A1
20170289012 Tiwari et al. Oct 2017 A1
20180024193 Kumar et al. Jan 2018 A1
20180076908 Kumar et al. Mar 2018 A1
20180077046 Kumar et al. Mar 2018 A1
20180351846 Kumar et al. Dec 2018 A1
Foreign Referenced Citations (7)
Number Date Country
202261360 May 2012 CN
2001013604 Feb 2001 WO
2013169728 Nov 2013 WO
2014035462 Mar 2014 WO
2014065843 May 2014 WO
2017053961 Mar 2017 WO
2017074872 May 2017 WO
Non-Patent Literature Citations (94)
Entry
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/866,630, filed Sep. 25, 2015, dated Aug. 9, 2017, 24 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/866,720, filed Sep. 25, 2015, dated Aug. 28, 2017, 11 pgs.
Kumar, Samant; Response to Rule 312 Communication for U.S. Appl. No. 14/866,720, filed Sep. 25, 2015, dated Jul. 26, 2017, 2 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/987,538, filed Jan. 4, 2016, dated Jan. 21, 2017, 18 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/929,180, filed Oct. 30, 2015, dated Aug. 22, 2017, 32 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/929,220, filed Oct. 30, 2015, dated Aug. 24, 2017, 31 pgs.
Businesswire; Article entitled: “GENBAND and CTDI Settle Legal Dispute”, located at <http://www.businesswire.com/news/home/20140321005528/en/GENBAND-CTDI-Settle-Legal-Dispute>, Mar. 21, 2014, 1 pg.
CED Magazine; Article entitled: “Cable Connects in Atlanta”, located at <https://www.cedmagazine.com/article/2006/04/cable-connects-atlanta>, Apr. 30, 2006, 21 pgs.
Consumer Electronics Net; Article entitled: “Teleplan Enhances Test Solution Portfolio with Titan”, located at <http://www.consumerelectronicsnet.com/article/Teleplan-Enhances-Test-Solution-Portfolio-With-Titan-4673561>, published on Nov. 1, 2016, 3 pgs.
Digital Producer; Article entitled: “S3 Group Unveils Exclusive Partnership in North America With First US StormTest(TM) Decision Line Customer”, located at <http://www.digitalproducer.com/article/S3-Group-Unveils-Exclusive-Partnership-in-North-America-With-First-US-StormTest(TM)-Decision-Line-Customer--1668213>, Sep. 8, 2011, 3 pgs.
Electronic Design; Article entitled: “Testing of MPEG-2 Set-Top Boxes Must be Fast, Thorough”, located at <http://www.electronicdesign.com/print/839>, published Nov. 18, 2001, 9 pgs.
Euromedia; Article entitled: “Automated TV Client testing: Swisscom partners with S3 Group to deliver the ultimate IPTV experience”, located at <http://advanced-television.com/wp-content/uploads/2012/10/s3.pdf>, earliest known pub. date—May 30, 2013, 2 pgs.
Exact Ventures; Report entitled: North American Telecommunications Equipment Repair Market, located at http://www.fortsol.com/wp-content/uploads/2016/08/Exact-Ventures-NA-Repair-Market-Report.pdf>, earliest known publication date Aug. 1, 2016, 12 pgs.
Promptlink Communications; Article entitled: “Promptlink Communications Officially Launches Sep-Top Box Testing Platform”, located at <https://www.promptlink.com/company/assets/media/2014-05-20.pdf>, published on May 20, 2014, 2 pgs.
Promptlink; Article entitled: “Cable Modem Test Platform”, located at <https://www.promptlink.com/products/cmtp.html>, earliest known publication date Aug. 11, 2016, 10 pgs.
Promptlink; Article entitled: “Set-Top Box Test Platform”, located at <http://promptlink.com/products/stbtp.html>, earliest known publication date Aug. 11, 2016, 7 pgs.
S3 Group; Document entitled: “White Paper: The Importance of Automated Testing in Set-Top Box Integration”, earliest known publication date Jun. 17, 2014, 11 pgs.
Teleplan; Article entitled: “Screening & Testing”, located at <https://www.teleplan.com/innovative-services/screening-testing/>, earliest known publication date Mar. 21, 2015, 7 pgs.
TVtechnology; Article entitled: “S3 Group's StormTest”, located at <http://www.tvtechnology.com/expertise/0003/s3-groups-stormtest/256690>, published May 1, 2012, 2 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/948,925, filed Nov. 23, 2015, dated Sep. 20, 2017, 15 pgs.
Kumar, Samant; Supplemental Notice of Allowance for U.S. Appl. No. 14/948,925, filed Nov. 23, 2015, dated Oct. 5, 2017, 2 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 15/057,085, filed Feb. 29, 2016, dated Sep. 29, 2017, 28 pgs.
Kumar, Samant; Issue Notification for U.S. Appl. No. 14/866,780, filed Sep. 25, 2015, dated Oct. 19, 2016, 1 pg.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/866,780, filed Sep. 25, 2015, dated Jul. 19, 2016, 8 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/866,720, filed Sep. 25, 2015, dated Jan. 23, 2017, 17 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/866,720, filed Sep. 25, 2015, dated Jun. 29, 2017, 26 pgs.
Kumar, Samant; Ex-Parte Quayle Office Action for U.S. Appl. No. 14/948,925, filed Nov. 23, 2015, mailed Jun. 20, 2017, 29 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/987,538, filed Jan. 4, 2016, dated Mar. 23, 2017, 12 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 15/057,085, filed Feb. 29, 2016, dated Apr. 7, 2017, 15 pgs.
Kumar, Samant; International Search Report and Written Opinion for PCT/US16/53768, filed Sep. 26, 2016, dated Feb. 3, 2017, 17 pgs.
Kumar, Samant; International Search Report and Written Opinion for PCT/US2016/058507, filed Oct. 24, 2016, dated Jan. 3, 2017, 12 pgs.
Nordman, Bruce, “Testing Products with Network Connectivity,” Jun. 21, 2011 [retrieved online at http://citeseerx.is1. psu.edu/viewdoc/download?doi=10.1.1.695.772&rep=rep1&type=pdf on Feb. 6, 2017], 20 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 15/624,961, filed Feb. 29, 2016, dated Jul. 19, 2017, 7 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/866,630, filed Sep. 25, 2015, dated Dec. 20, 2017, 19 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/866,752, filed Sep. 25, 2015, dated Nov. 7, 2017, 26 pgs.
Kumar, Samant; Issue Notification for U.S. Appl. No. 14/866,720, filed Sep. 25, 2015, dated Oct. 18, 2017, 1 pg.
Kumar, Samant; Issue Notification for U.S. Appl. No. 14/948,925, filed Nov. 23, 2015, dated Nov. 16, 2017, 1 pg.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/948,143, filed Nov. 20, 2015, dated Dec. 28, 2017, 39 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/987,538, filed Jan. 4, 2016, dated Dec. 4, 2017, 20 pgs.
Kumar, Samant; Corrected Notice of Allowance for U.S. Appl. No. 15/057,085, filed Feb. 29, 2016, dated Oct. 31, 2017, 6 pgs.
Tiwari, Rajeev; Non-Final Office Action for U.S. Appl. No. 15/348,920, filed Nov. 10, 2016, dated Nov. 20, 2017, 53 pgs.
Tiwari, Rajeev; Non-Final Office Action for U.S. Appl. No. 15/642,967, filed Jun. 16, 2017, dated Nov. 7, 2017, 52 pgs.
Kumar, Samant; Supplemental Notice of Allowance for U.S. Appl. No. 14/866,630, filed Sep. 25, 2015, dated Jan. 31, 2018, 9 pgs.
Kumar, Samant; Supplemental Notice of Allowance for U.S. Appl. No. 14/866,630, filed Sep. 25, 2015, dated Mar. 30, 2018, 6 pgs.
Kumar, Samant; Certificate of Correction for U.S. Appl. No. 14/866,720, filed Sep. 25, 2015, dated Feb. 13, 2018, 1 pg.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/948,143, filed Nov. 20, 2015, dated Jan. 25, 2018, 13 pgs.
Kumar, Samant; Corrected Notice of Allowance for U.S. Appl. No. 14/987,538, filed Jan. 4, 2016, dated Jan. 10, 2018, 8 pgs.
Kumar, Samant; Issue Notification for U.S. Appl. No. 14/987,538, filed Jan. 4, 2016, dated Jan. 31, 2018, 1 pg.
Kumar, Samant; Response to Amendment under Rule 312 for U.S. Appl. No. 14/987,538, filed Jan. 4, 2016, dated Jan. 17, 2018, 2 pgs.
Kumar, Samant; Notice of Non-Compliant Amendment for U.S. Appl. No. 15/624,961, filed Jun. 16, 2017, dated Jan. 10, 2018, 5 pgs.
Kumar, Samant; Issue Notification for U.S. Appl. No. 15/057,085, filed Feb. 29, 2016, dated Jan. 31, 2018, 1 pg.
Kumar, Samant; Issue Notification for U.S. Appl. No. 14/866,630, filed Sep. 25, 2015, dated Apr. 11, 2018, 1 pg.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/866,752, filed Sep. 25, 2015, dated May 17, 2018, 16 pgs.
Kumar, Samant; Issue Notification for U.S. Appl. No. 14/948,143, filed Nov. 20, 2015, dated May 16, 2018, 1 pg.
Kumar, Samant; Supplemental Notice of Allowance for U.S. Appl. No. 14/948,143, dated Nov. 20, 2015, mailed May 7, 2018, 7 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 15/624,961, filed Jun. 16, 2017, dated May 22, 2018, 44 pgs.
Kumar, Samant; Final Office Action for U.S. Appl. No. 14/929,180, filed Oct. 30, 2015, dated May 8, 2018, 35 pgs.
Kumar, Samant; Final Office Action for U.S. Appl. No. 14/929,220, filed Oct. 30, 2015, dated May 10, 2018, 38 pgs.
Tiwari, Rajeev; Final Office Action for U.S. Appl. No. 15/348,920, filed Nov. 10, 2016, dated Apr. 30, 2018, 34 pgs.
Tiwari, Rajeev; Final Office Action for U.S. Appl. No. 15/624,967, filed Jun. 16, 2017, dated May 8, 2018, 40 pgs.
Kumar, Samant; International Preliminary Report on Patentability for PCT Application No. PCT/US2016/053768, dated Sep. 26, 2016, mailed Apr. 5, 2018, 13 pgs.
Kumar, Samant; International Preliminary Report on Patentability for PCT Application No. PCT/US2016/058507, filed Oct. 24, 2016, dated May 11, 2018, 12 pgs.
Kumar, Samant; Corrected Notice of Allowability for U.S. Appl. No. 14/866,752, filed Sep. 25, 2015, dated Jul. 10, 2018, 5 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 15/624,950, filed Jun. 16, 2017, dated Jul. 9, 2018, 50 pgs.
Kumar, Samant; Corrected Notice of Allowance for U.S. Appl. No. 15/624,961, filed Jun. 16, 2017, dated Mar. 28, 2018, 7 pgs.
Kumar, Samant; Advisory Action for U.S. Appl. No. 14/929,180, filed Oct. 30, 2015, dated Jul. 27, 2018, 9 pgs.
Kumar, Samant; Advisory Action for U.S. Appl. No. 14/929,220, filed Oct. 30, 2015, dated Jul. 27, 2018, 8 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 15/818,803, filed Nov. 21, 2017, dated Jul. 25, 2018, 46 pgs.
Tiwari, Rajeev; Advisory Action for U.S. Appl. No. 15/348,920, filed Nov. 10, 2016, dated Jul. 17, 2018, 8 pgs.
Tiwari, Rajeev; Non-Final Office Action for U.S. Appl. No. 15/348,920, filed Nov. 10, 2016, dated Mar. 24, 2018, 10 pgs.
Tiwari, Rajeev; Advisory Action for U.S. Appl. No. 15/624,967, filed Jun. 16, 2017, dated Jul. 17, 2018, 7 pgs.
Tiwari, Rajeev; Notice of Allowance for U.S. Appl. No. 15/624,967, filed Jun. 16, 2017, dated Aug. 28, 2018, 6 pgs.
Kumar, Samant; Corrected Notice of Allowance for U.S. Appl. No. 14/866,752, filed Sep. 25, 2015, dated Oct. 4, 2018, 5 pgs.
Kumar, Samant; Issue Notification for U.S. Appl. No. 14/866,752, filed Sep. 25, 2015, dated Oct. 17, 2018, 1 pg.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 15/813,838, filed Nov. 15, 2017, dated Oct. 2, 2018, 52 pgs.
Kumar, Samant; Corrected Notice of Allowance for U.S. Appl. No. 15/624,961, filed Jun. 16, 2017, dated Sep. 28, 2018, 9 pgs.
Kumar, Samant; Issue Notification for U.S. Appl. No. 15/624,961, filed Jun. 16, 2017, dated Oct. 10, 2018, 1 pg.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/929,180, filed Oct. 30, 2015, dated Oct. 1, 2018, 15 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 14/929,220, filed Oct. 30, 2015, dated Oct. 1, 2018, 13 pgs.
Kumar, Samant; Final Office Action for U.S. Appl. No. 15/818,803, filed Nov. 21, 2017, dated Nov, 26, 2018, 20 pgs.
Tiwari, Rajeev; Notice of Allowance for U.S. Appl. No. 15/348,920, filed Nov. 10, 2016, dated Dec. 4, 2018, 11 pgs.
Tiwari, Rajeev; Issue Notification for U.S. Appl. No. 15/624,967, filed Jun. 16, 2017, dated Nov. 29, 2018, 1 pg.
Tiwari, Rajeev; Supplemental Notice of Allowance for U.S. Appl. No. 15/624,967, filed Jun. 16, 2017, dated Sep. 19, 2018, 7 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 16/103,546, filed Aug. 14, 2018, dated Jan. 28, 2019, 36 pgs.
Kumar, Samant; Non-Final Office Action for U.S. Appl. No. 15/722,235, filed Oct. 2, 2017, dated Jan. 8, 2019, 62 pgs.
Kumar, Samant; Corrected Notice of Allowance for U.S. Appl. No. 15/813,838, filed Nov. 15, 2017, dated Feb. 12, 2019, 6 pgs.
Kumar, Samant; Issue Notification for U.S. Appl. No. 15/813,838, filed Nov. 15, 2017, dated Jan. 9, 2019, 1 pg.
Kumar, Samant; Issue Notification for U.S. Appl. No. 15/813,838, filed Nov. 15, 2017, dated Feb. 20, 2019, 1 pg.
Kumar, Samant; Final Office Action for U.S. Appl. No. 15/624,950, filed Jun. 16, 2017, dated Dec. 20, 2018, 33 pgs.
Kumar, Samant; Notice of Allowance for U.S. Appl. No. 14/929,220, filed Oct. 30, 2015, dated Feb. 19, 2019, 24 pgs.
Kumar, Samant; Advisory Action for U.S. Appl. No. 15/818,803, filed Nov. 21, 2017, dated Feb. 5, 2019, 13 pgs.
Tiwari, Rajeev; Corrected Notice of Allowance for U.S. Appl. No. 15/348,920, filed Nov. 10, 2016, dated Feb. 26, 2019, 8 pgs.
Kumar, Samant; Corrected Notice of Allowance for U.S. Appl. No. 16/103,546, filed Mar. 25, 2019, dated Mar. 25, 2019, 7 pgs.
Tiwari, Rajeev; Corrected Notice of Allowance for U.S. Appl. No. 15/348,920, filed Nov. 10, 2016, dated Mar. 27, 2019, 13 pgs.
Related Publications (1)
Number Date Country
20170302994 A1 Oct 2017 US
Continuations (1)
Number Date Country
Parent PCT/US2016/053768 Sep 2016 US
Child 15642915 US
Continuation in Parts (7)
Number Date Country
Parent 14866630 Sep 2015 US
Child PCT/US2016/053768 US
Parent 14866720 Sep 2015 US
Child 14866630 US
Parent 14866752 Sep 2015 US
Child 14866720 US
Parent 14866780 Sep 2015 US
Child 14866752 US
Parent 14948143 Nov 2015 US
Child 14866780 US
Parent 14948925 Nov 2015 US
Child 14948143 US
Parent 14987538 Jan 2016 US
Child 14948925 US