The present disclosure relates generally to communication networks, and more particularly, to network optimization using linear programming.
Many networking problems can be categorized as mathematical optimization problems involving an objective, such as maximizing or minimizing a function of a set of variables, subject to a series of constraints. These optimization problems may be modeled using linear programming (LP) formulations and solved using LP solvers; however, there are a number of difficulties in using conventional LP solvers. For example, desired objectives or constraints may not be supported by a system built on top of the LP solvers, and solving the LP problem may be time consuming for large data sets or stringent constraints.
Corresponding reference characters indicate corresponding parts throughout the several views of the drawings.
In one embodiment, a method generally comprises identifying, at a network device, metrics associated with constraints of an optimization problem, receiving values for the metrics from a stream reasoner, obtaining an initial solution of the optimization problem from a linear programming solver based on the values of the metrics, and instructing the linear programming solver to calculate a new solution to the optimization problem when the stream reasoner indicates that the constraints of the optimization problem are violated.
In another embodiment, an apparatus generally comprises a processor configured to identify metrics associated with constraints of an optimization problem, process values of the metrics received from a stream reasoner, obtain an initial solution of the optimization problem from a linear programming solver based on the values of the metrics, and instruct the linear programming solver to calculate a new solution to the optimization problem when the stream reasoner indicates that the constraints of the optimization problem are violated. The apparatus further comprises memory for storing the metrics and the constraints of the optimization problem.
The following description is presented to enable one of ordinary skill in the art to make and use the embodiments. Descriptions of specific embodiments and applications are provided only as examples, and various modifications will be readily apparent to those skilled in the art. The general principles described herein may be applied to other applications without departing from the scope of the embodiments. Thus, the embodiments are not to be limited to those shown, but are to be accorded the widest scope consistent with the principles and features described herein. For purposes of clarity, details relating to technical material that is known in the technical fields related to the embodiments have not been described in detail.
Optimization problems may be modeled using Linear Programming (LP) formulations and solved using domain-independent LP solvers. Examples include assignment of flows to links or paths, placement of ACLs (Access Control Lists) on network elements, placement of virtual machines (VMs)/foglets in fog nodes, placement of virtual network functions, assignment of wireless clients to access points, and many others.
Conventional systems built on top of these LP solvers typically hard-code the objective and constraints that the user can specify. From the user's perspective, if the objective or constraints that they want to use are not supported by the system, the user would need to write new code. Furthermore, solving the LP problem may require large processing resources and be time consuming for large data sets or stringent constraints. As such, it is important to be able to determine automatically whether the operating conditions of the compute/network infrastructure are still within bounds specified by the placement or assignment constraints, or whether a new placement or assignment calculation needs to be triggered.
The embodiments described herein provide for automatic triggering of an LP solver when constraints of the LP formulation are violated, through the use of semantic stream reasoning. This allows for the automatic determination of when the LP solver needs to be triggered to find a new optimal solution, which is important given the cost associated with solving highly constrained LP formulations. Certain embodiments may also provide the flexibility to support any objective or constraints, without the need for writing new code, by modeling the optimization problem in a network ontology.
Referring now to the drawings, and first to
The network shown in the example of
The optimization engine 10 and semantic reasoner 12 may operate at a controller, server, appliance, or any other network element or general purpose computing device located in the network or in a cloud or fog. The optimization device 10 and stream reasoner 12 may operate at separate network devices or be co-located at the same network device. Also, one or more of the components of the optimization device 10 may be located on another network device or distributed in the network.
The optimization device 10 and semantic reasoner 12 may also utilize ontology information from a network ontology file 19, which may be maintained, for example, in an ontology server or other network device or database. The ontology 19 formally represents knowledge as a hierarchy of concepts within a domain (e.g., network) using a shared vocabulary to denote types, properties, and interrelationships of concepts. In particular, the ontology 19 may comprise an explicit representation of a shared conceptualization of the network, providing a formal structural framework for organizing knowledge related to the network as a hierarchy of inter-related concepts. The shared conceptualization may include conceptual frameworks for modeling domain knowledge (e.g., knowledge related to the network, concepts specific to protocols for communication among devices and applications within the network, etc.) and agreements about representation of particular domain theories. The network ontology may be encoded in any suitable knowledge representation language, such as Web Ontology Language (OWL).
In one embodiment, the semantic reasoner 12 is a stream reasoner configured to infer logical consequences from a set of asserted facts or axioms. The semantic reasoner 12 may comprise, for example, a semantic mapper or pre-processing components operable to populate a knowledge database with data extracted from network data according to the network ontology 19. The semantic reasoner 12 may further comprise a reasoning engine configured to perform machine reasoning according to a semantic model, for example, using policies and rules from a policy database, and generate actions or reports appropriate for controlling and managing the network 14.
The optimization device 10 is configured to identify metrics based on the ontology 19 and instruct the stream reasoner 12 to monitor data streams to provide temporal readings of the metrics. The current values of the metrics, obtained from the semantic reasoner 12, are input to the LP solver 16 to obtain an initial solution of an optimization problem. The LP solver 16 may be any component or module (e.g., code, software, logic) operable to optimize a linear function subject to linear equality and linear inequality constraints. The LP solver 16 may use any type of programming or modeling language or software environment.
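As a minimal illustrative sketch (not part of the disclosure; SciPy's `linprog` routine is used here only as one example of such a domain-independent solver module), a linear objective can be optimized subject to linear inequality constraints as follows:

```python
# Illustrative sketch: minimizing a linear cost subject to linear
# inequality constraints with a generic LP solver (SciPy's linprog).
from scipy.optimize import linprog

# Objective: minimize 1*x0 + 2*x1
c = [1, 2]
# Inequality constraints in the form A_ub @ x <= b_ub
A_ub = [[-1, -1],   # x0 + x1 >= 3, rewritten as -x0 - x1 <= -3
        [1, 0]]     # x0 <= 2
b_ub = [-3, 2]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x)  # optimal values of the decision variables, here [2, 1]
```

The solver favors x0 (the cheaper variable), pushing it to its bound of 2 and making up the remainder of the x0 + x1 >= 3 constraint with x1 = 1.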
As the state or condition of the network 14 or compute infrastructure changes, the metrics may vary. As described in detail below, the LP solver trigger 18 is operable to automatically trigger the LP solver 16 to calculate a new optimal solution when the constraints of the LP formulation are being violated (i.e., no longer being met) through use of the semantic stream reasoner 12. In one embodiment, the optimization device 10 generates one or more filters 17 for installation at the semantic reasoner 12 for use in identifying when the optimization constraints are no longer being met. This allows the LP solver 16 to only run when the current placement/assignment within the infrastructure is no longer optimal. The LP solver trigger 18 may be any suitable mechanism or module (e.g., code, software, program) operable to trigger the LP solver 16 to calculate a new optimal solution based on input from the stream reasoner 12.
In one example, the stream reasoner 12 comprises a C-SPARQL (Continuous SPARQL (SPARQL Protocol and RDF (Resource Description Framework) Query Language)) engine and the filters 17 are constructed using SPARQL FILTER primitives.
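As a hypothetical illustration (the query shape, stream IRI, prefix, and property names below are assumptions for the sketch, not taken from the disclosure), an upper-bound constraint could be rendered into a C-SPARQL-style continuous query whose FILTER clause matches only when the bound is violated:

```python
# Hypothetical sketch: rendering an upper-bound constraint into a
# C-SPARQL-style continuous query string. The stream IRI, prefix,
# window parameters, and property names are illustrative only.
def build_violation_query(metric_property: str, upper_bound: float) -> str:
    return (
        "REGISTER QUERY ConstraintCheck AS "
        "PREFIX net: <http://example.org/network#> "
        "SELECT ?node ?value "
        "FROM STREAM <http://example.org/metrics> [RANGE 10s STEP 1s] "
        f"WHERE {{ ?node net:{metric_property} ?value . "
        f"FILTER (?value > {upper_bound}) }}"
    )

query = build_violation_query("ramUsedMB", 8)
print(query)
```

A query of this shape returns results only for readings that exceed the bound, so an empty result window corresponds to the constraint still being satisfied.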
One or more components shown in
Constraints of the optimization problem may be associated with resources (e.g., memory (RAM, TCAM, etc.), latency, bandwidth) or any other network or device limitations.
The optimization results may be provided to one or more network management devices, controllers, service nodes, or other systems or devices for use in assigning flows to links or paths, placing ACLs on network elements, placing VMs/foglets in fog nodes, placing virtual network functions, placing VMs, containers, or applications in a network, assigning wireless clients to access points, or solving any other network optimization problem. An example using the embodiments to place foglets at fog nodes is described below with respect to
In addition to the functions described above, the optimization engine 10 may also be responsible for programming the LP solver 16 based on the optimization model captured in the ontology 19. The embodiments thus provide the flexibility to support any objective or constraints, without the need to write new code, by modeling the optimization problem in the ontology 19.
It is to be understood that the network shown in
Memory 24 may be a volatile memory or non-volatile storage, which stores various applications, operating systems, modules, and data for execution and use by the processor 22. Memory 24 may include, for example, one or more databases (e.g., network knowledge database, policies database, etc.) or any other data structure configured for storing policies, constraints, objectives, metrics, network data (e.g., topology, resources, capabilities, ontology), or other information. One or more optimization components 28 (e.g., code, logic, software, firmware, etc.) may also be stored in memory 24. The network device 20 may include any number of memory components.
Logic may be encoded in one or more tangible media for execution by the processor 22. The processor 22 may be configured to implement one or more of the functions described herein. For example, the processor 22 may execute code stored in a computer-readable medium such as memory 24 to perform the process described below with respect to
The network interface 26 may comprise any number of interfaces (e.g., linecards, ports) for receiving data or transmitting data to other devices. The network interface 26 may include, for example, an Ethernet interface for connection to a computer or network. The network interface 26 may be configured to transmit or receive data using a variety of different communication protocols. The interface 26 may include mechanical, electrical, and signaling circuitry for communicating data over physical links coupled to the network.
It is to be understood that the network device 20 shown in
As shown in
Any positive results identified by the stream reasoner 12 are fed to the optimization engine 10. For example, as shown in the logic of block 30, if the metric values monitored by the stream reasoner 12 fall outside the current constraint bounds (e.g., upper bound, lower bound, equality, or any combination thereof), an alert is raised, causing the LP trigger 18 to request that the LP solver 16 calculate a new optimal solution. The embodiments thus schedule the LP solver 16 to run only when the current placement/assignment within the infrastructure is no longer optimal.
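The trigger logic described above can be sketched in a few lines (a minimal illustration only; the function names, constraint representation, and data shapes are assumptions, not taken from the disclosure):

```python
# Minimal sketch of the LP-solver trigger: the solver is re-run only
# when fresh metric readings violate the current constraint bounds.
def check_constraints(metrics, constraints):
    """Return True if every metric is still within its (lower, upper) bounds."""
    return all(lower <= metrics[name] <= upper
               for name, (lower, upper) in constraints.items())

def on_metric_update(metrics, constraints, solve_lp, current_solution):
    """Invoked with fresh metric readings from the stream reasoner."""
    if check_constraints(metrics, constraints):
        return current_solution      # placement still within bounds; no re-solve
    return solve_lp(metrics)         # alert: trigger a new optimal calculation

# Example: RAM in use on a node must stay between 0 and 8 MB.
constraints = {"nodeA_ram_used": (0.0, 8.0)}
sol = on_metric_update({"nodeA_ram_used": 9.5}, constraints,
                       solve_lp=lambda m: "new placement",
                       current_solution="old placement")
print(sol)  # the 9.5 MB reading violates the bound, so a re-solve is triggered
```

In this sketch the costly solver call is reached only on a violation; in-bounds readings return the existing solution unchanged.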
It is to be understood that the process shown in
As shown in
The following example illustrates a simplified foglet placement problem in which there are constraints on memory size. In this example, Foglet 1 requires 3 MB of RAM, Foglet 2 requires 2 MB, and Foglet 3 requires 7 MB. The network includes two fog nodes 56 (fog node A and fog node B) with initial available memory of 8 MB and 5 MB, respectively. With pi,N equal to 1 if Foglet i is placed on fog node N and 0 otherwise, the memory related constraints for the foglet placement problem may be formulated as follows:
3×p1,A + 2×p2,A + 7×p3,A <= 8 (Node A RAM Constraint)
3×p1,B + 2×p2,B + 7×p3,B <= 5 (Node B RAM Constraint)
While the above constraints are shown here in numeric formulae, it is to be understood that the embodiments may capture these constraints in an ontology (e.g., using RDF/OWL constructs).
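To make the arithmetic concrete, the two RAM constraints can be checked in a short sketch (variable names are illustrative). With Foglets 1 and 2 placed on Node B and Foglet 3 on Node A, Node A uses 7 of 8 MB and Node B uses 5 of 5 MB, so both bounds hold:

```python
# Sketch: checking the Node A and Node B RAM constraints for a given
# placement. p[(i, n)] is 1 if Foglet i is placed on fog node n, else 0.
ram_required = {1: 3, 2: 2, 3: 7}     # MB required per foglet
ram_available = {"A": 8, "B": 5}      # MB initially available per fog node

# One feasible placement: Foglets 1 and 2 on Node B, Foglet 3 on Node A.
p = {(1, "B"): 1, (2, "B"): 1, (3, "A"): 1}

def ram_used(node):
    """Left-hand side of the node's RAM constraint for placement p."""
    return sum(ram_required[i] for (i, n), placed in p.items()
               if n == node and placed)

for node, cap in ram_available.items():
    print(node, ram_used(node), "<=", cap, ram_used(node) <= cap)
```

The printed comparisons mirror the two inequality constraints above, which is exactly the check that the stream reasoner repeats as the metric values drift.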
In one example, the initial solution assigns Foglet 1 and Foglet 2 to Node B, and Foglet 3 to Node A. The optimization engine 10 may then render these two constraints into the proper programming of the stream reasoner 12 (
As the memory demands of the foglets and the memory available on the fog nodes 56 change over time, the stream reasoner 12 may use the above query to determine whether the constraints of the placement are still within the required bounds, or whether a new placement needs to be calculated. When the query returns a match, the optimization engine 10 provides the updated set of metrics to the LP solver 16 and re-programs the LP solver to calculate the new placement solution.
While the above example shows a simple constraint along a single dimension (memory), it is to be understood that the embodiments described herein may be applied to complex multifaceted constraints that map to the composition of multiple data streams being fed to the stream reasoner 12.
Although the method and apparatus have been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations made without departing from the scope of the embodiments. Accordingly, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.