The present invention relates generally to the field of computing, and more specifically, to time series data visualization.
Generally, a time series is a series of data points indexed (or listed or graphed) in time order. Most commonly, a time series includes snapshots of a sequence taken at successive, equally spaced points in time. As such, time series data analysis may be useful for observing how a given data point, asset, security, or economic variable changes over time. Examples of time series data may include sensor data associated with an internet of things (IoT) device, daily weather data, and the daily closing value of the Dow Jones Industrial Average. Multivariate data visualization is a type of time series data visualization in which multiple data dimensions or attributes may be analyzed and represented. As such, a multivariate time series has more than one time-dependent variable, and each variable may depend not only on its own past values but also on the other variables. This dependency is used for forecasting future values. Thus, multivariate analysis may involve determining potential relationships, patterns, and correlations among these variables/attributes. Multivariate time series data analysis is a popular tool due to its widespread application to fields ranging from health and medicine, market and online retail, weather, security, and infrastructure to biology. The increase in utilizing multivariate time series data can be attributed to improvements in sensor and data collection infrastructure and to the growing use of multivariate time series data in classification, forecasting, and anomaly detection applications.
A method for generating and displaying an embedding of multivariate time series data in an embedded space is provided. The method may include optimizing a generative deep learning neural network model by adding a regularization term to train the generative deep learning neural network model in an optimization process, wherein the regularization term maintains a temporal relationship between data points associated with the multivariate time series data when representing the data points of the multivariate time series data in an embedded space. The method may further include, in response to receiving the multivariate time series data as input, applying the optimized generative deep learning neural network model to the input, and representing and displaying the multivariate time series data in the embedded space such that the temporal relationship between the data points from the input is captured by and presented in the representation, wherein the representation comprises an embedding of the multivariate time series data in the embedded space.
A computer system for generating and displaying an embedding of multivariate time series data in an embedded space is provided. The computer system may include one or more processors, one or more computer-readable memories, one or more computer-readable tangible storage devices, and program instructions stored on at least one of the one or more storage devices for execution by at least one of the one or more processors via at least one of the one or more memories, whereby the computer system is capable of performing a method. The method may include optimizing a generative deep learning neural network model by adding a regularization term to train the generative deep learning neural network model in an optimization process, wherein the regularization term maintains a temporal relationship between data points associated with the multivariate time series data when representing the data points of the multivariate time series data in an embedded space. The method may further include, in response to receiving the multivariate time series data as input, applying the optimized generative deep learning neural network model to the input, and representing and displaying the multivariate time series data in the embedded space such that the temporal relationship between the data points from the input is captured by and presented in the representation, wherein the representation comprises an embedding of the multivariate time series data in the embedded space.
A computer program product for generating and displaying an embedding of multivariate time series data in an embedded space is provided. The computer program product may include one or more computer-readable storage devices and program instructions stored on at least one of the one or more tangible storage devices, the program instructions executable by a processor. The computer program product may include program instructions to optimize a generative deep learning neural network model by adding a regularization term to train the generative deep learning neural network model, wherein the regularization term maintains a temporal relationship between data points associated with the multivariate time series data when reconstructing the data points of the multivariate time series data in an embedded space. The computer program product may further include program instructions to, in response to receiving the multivariate time series data as input, apply the optimized generative deep learning neural network model to the input, and represent and display the multivariate time series data in the embedded space such that the temporal relationship between the data points from the input is captured by and presented in the representation, wherein the representation comprises an embedding of the multivariate time series data in the embedded space.
These and other objects, features and advantages of the present invention will become apparent from the following detailed description of illustrative embodiments thereof, which is to be read in connection with the accompanying drawings. The various features of the drawings are not to scale as the illustrations are for clarity in facilitating one skilled in the art in understanding the invention in conjunction with the detailed description. In the drawings:
Detailed embodiments of the claimed structures and methods are disclosed herein; however, it is to be understood that the disclosed embodiments are merely illustrative of the claimed structures and methods, which may be embodied in various forms. This invention may be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. In the description, details of well-known features and techniques may be omitted to avoid unnecessarily obscuring the presented embodiments.
As previously described, embodiments of the present invention relate generally to the field of computing, and more particularly, to time series data visualization. Specifically, the following described exemplary embodiments provide a system, method and program product for generating and displaying an embedding of multivariate time series data in an embedded space. More specifically, the present invention has the capacity to improve the technical field associated with time series data visualization by optimizing a generative deep learning neural network model by adding a regularization term to the generative deep learning neural network model such that the regularization term maintains a temporal relationship between data points associated with the multivariate time series data when reconstructing the data points of the multivariate time series data in an embedded space. Then, in turn, the optimized generative deep learning neural network model may be used to effectively parse and visualize different patterns and anomalies in the multivariate time series data.
Specifically, and as previously described with respect to multivariate time series data, multivariate time series data analysis is a popular tool due to its widespread application to fields ranging from healthcare and retail to weather, security, and infrastructure. However, the wide use of multivariate time series data has also brought challenges in parsing long sequences, in raw form, to identify patterns and create labeled data for different modeling approaches. Generally, typical approaches to analyzing such data involve transforming the data into a representation that is compact (of reduced dimensionality) and that efficiently preserves the information. For most time series data, a data value at a given time point, by itself, does not carry as much useful information as it does when analyzed in the context of its neighboring time values. Summarizing time windows may also help reduce data noise for useful information extraction. Naive approaches to summarizing overlapping windows of contiguous data involve symmetric functions, such as average, sum, min, and max. Principal Component Analysis (PCA) is also widely used to summarize the relation between variables in stationary data with limited time series. In addition, transforming the time series data to the frequency domain, via the Fast Fourier Transform, is another solution. However, it is not trivial to extend these kinds of approaches to a multivariate time series data analysis problem. Since 2012, deep neural networks have achieved considerable success in the artificial intelligence community, especially in natural language understanding and computer vision. The representations learned via deep neural networks have demonstrated tremendous success in various tasks such as classification, generation, detection, and translation.
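The naive window summarization described above (applying symmetric functions such as average, min, and max to overlapping windows) may be sketched as follows. This is a non-limiting illustrative sketch; the array shapes, window length, stride, and the helper name summarize_windows are hypothetical:

```python
import numpy as np

# Hypothetical multivariate time series: 100 time steps, 3 variables
rng = np.random.default_rng(0)
series = rng.normal(size=(100, 3))

def summarize_windows(data, window, stride):
    """Summarize overlapping windows with simple symmetric functions."""
    starts = range(0, data.shape[0] - window + 1, stride)
    rows = []
    for s in starts:
        w = data[s:s + window]  # one (window, n_vars) block of contiguous data
        rows.append(np.concatenate([w.mean(axis=0), w.min(axis=0), w.max(axis=0)]))
    return np.asarray(rows)

summary = summarize_windows(series, window=10, stride=5)
print(summary.shape)  # (19, 9): one row per overlapping window, 3 stats x 3 variables
```

Note that each summary row discards the ordering of values inside the window, which illustrates why such symmetric functions can lose temporal structure that a learned representation might retain.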
Deep neural networks can be categorized into discriminative and generative models. Compared to discriminative learning models, which rely heavily on labeled data, generative models are more flexible in terms of label quality. This has made generative models popular in image generation, reinforcement learning, and semi-supervised learning tasks. Due to their powerful learning capability, deep neural networks demonstrate great potential for non-linear dimensionality reduction of multivariate data. Among these different generative models, the Variational Autoencoder (VAE) has become a prevailing methodology for learning rich representations of, and modeling, time series data. Specifically, the VAE is a class of deep neural network generative models that has been used to learn a concise representation of input data while minimizing data loss when reconstructing the input data in an embedded space. Thus, the VAE may be used as the deep neural network basis for summarizing contiguous blocks of multivariate time series data by learning a compact latent representation of the input multivariate time series data. However, directly applying the VAE to multivariate time series data may not capture the temporal characteristics of contiguous, time-sequential data blocks. Specifically, when analyzing the input multivariate time series data, the VAE does not account for whether two data points are adjacent in a time sequence; in turn, the reconstructions (representations) and embeddings of data points in an embedding space (i.e., a space of reduced dimensionality) produced by the VAE may be unrelated even when the corresponding data points occur in consecutive time sequence in the input data. Consequently, when using the VAE to summarize contiguous blocks of the multivariate time series data, samples of the multivariate time series data that belong to nearby time windows may be displayed discretely, or separately, in the embedding space and lose the continuity that was observed in the input data.
Therefore, it becomes difficult to visualize this embedded multivariate time series data and look for patterns in the data, as temporal continuity may be lost in the embedded space.
As such, it may be advantageous, among other things, to provide a method, computer system, and computer program product for generating and displaying a smooth representation of an embedding of multivariate time series data using a variational autoencoder (VAE) along with a manifold regularization term. More specifically, the method, computer system, and computer program product may optimize the variational autoencoder (VAE) by introducing a manifold regularization term in the VAE in order to generate the smooth representation, whereby generating the smooth representation of the embedding includes preserving and displaying the temporal relationship/connection between data points associated with the multivariate time series data in an embedded space, which, in turn, can be used to effectively parse and visualize different patterns and anomalies in the data. Therefore, the manifold regularization term may be used to visually and temporally correlate time series data in an embedded space and help detect changes in the time series data.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Referring now to
According to at least one implementation, the present embodiment may also include a database 116, which may be running on server 112. The communication network 110 may include various types of communication networks, such as a wide area network (WAN), local area network (LAN), a telecommunication network, a wireless network, a public switched network and/or a satellite network. It may be appreciated that
The client computer 102 may communicate with server computer 112 via the communications network 110. The communications network 110 may include connections, such as wire, wireless communication links, or fiber optic cables. As will be discussed with reference to
According to the present embodiment, a program, such as a manifold regularization program 108A and 108B, may run on the client computer 102 and/or on the server computer 112 via a communications network 110. The manifold regularization program 108A, 108B may optimize a variational autoencoder (VAE) by applying a manifold regularization term to learn a representation of time series data and generate a temporally smooth reconstruction of the time series data for visualization in an embedded space. Specifically, a user using a client computer 102, such as a laptop device, may run the manifold regularization program 108A, 108B, which may interact with a software program 114, such as a deep learning neural network program, to receive input data (such as multivariate time series data) and apply a regularized-VAE to the input data, whereby the regularized-VAE includes the VAE with a manifold regularization term, to generate and display an efficient representation of the multivariate time series input data in an embedded space such that a temporal relationship/connection (i.e., smoothness) between data points in the multivariate time series data is detected and maintained when reconstructing the multivariate time series data in the embedded space.
Referring now to
Referring now to
However, and as previously described, when dealing with multivariate time series data, the VAE model 302 may not take into account the continuity or time order of the inputted multivariate time series data, and therefore may not capture temporal relationships between data points. For example, the input data may include snapshots of the sensor data at two consecutive time points. However, when receiving this sensor data, the VAE 302 may not account for the fact that the two data points are next to each other in the time sequence, and in turn, the reconstruction of the data may be represented and embedded randomly in the embedded space. Conversely, the manifold regularization program 108A, 108B may take into account that a data value at a given time point, by itself, does not carry as much useful information as it does when the data value is analyzed in the context of its neighboring time values. Therefore, the manifold regularization program 108A, 108B may optimize the VAE formula 202 by introducing a manifold regularization term 304 into the VAE 202 model such that there is a temporal smoothness to the displayed reconstructed data in the embedding space, whereby the temporal smoothness represents the temporal relationship/connection between the data points. Specifically, the manifold regularization program 108A, 108B may apply the VAE 302 along with the manifold regularization term 304 (a regularized-VAE) to learn a representation of the input data such that data points from consecutive time sequences are temporally smooth in the reconstruction of the data in an embedded space, whereby the data points are represented in the embedded space in a way that indicates a temporal relationship between them. In turn, the reconstructed data in the embedded space may be represented in a way that is easier to parse, and that facilitates identifying patterns, creating labeled data for supervised modeling approaches, and summarizing and explaining anomalies.
Formulation of the regularized-VAE (i.e. the VAE with the manifold regularization term included) according to an embodiment of the present invention will now be described in greater detail with respect to
x ∈ ℝ^(m×n), and the latent representation, z ∈ ℝ^k (m×n >> k), with joint distribution P(x, z) = P(x|z)P(z)
Given the observed data x, the variational autoencoder approximates the posterior P(z|x) via the encoder model and the likelihood P(x|z) via a generative model (decoder); for the decoder, the latent variable z is drawn from the prior P(z). To learn the parameters of the encoder and decoder models, the KL divergence between the variational posterior Q(z|x) and the true posterior P(z|x) is minimized by maximizing the equivalent Evidence Lower BOund (ELBO).
ELBO(θ,ϕ) = E_{qθ(z|x)}[log pϕ(x|z)] − KL(qθ(z|x) ∥ p(z)) (1)
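For the common choice of a Gaussian variational posterior qθ(z|x) = N(μ, diag(σ²)) and a standard normal prior p(z), the divergence term in equation (1) has a well-known closed form, which may be sketched as follows (the function name below is hypothetical):

```python
import numpy as np

def kl_gaussian_standard_normal(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent dims.
    Closed form: 0.5 * sum(exp(logvar) + mu^2 - 1 - logvar)."""
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)

# When the posterior equals the prior (mu = 0, logvar = 0), the term vanishes
print(kl_gaussian_standard_normal(np.zeros(4), np.zeros(4)))  # 0.0

# A posterior shifted away from the prior incurs a positive penalty
print(kl_gaussian_standard_normal(np.array([0.5, -0.5]), np.zeros(2)))  # 0.25
```

In equation (1), this quantity is the term subtracted from the expected reconstruction log-likelihood, so maximizing the ELBO trades reconstruction quality against how far the posterior drifts from the prior.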
L=D−A (2)
S(θ) = f^T L f (3)
ℒ(θ,ϕ) = λS(θ) − ELBO(θ,ϕ) (5)
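In equations (2) and (3), A may be read as the adjacency matrix of a graph linking temporally adjacent windows, D as its degree matrix, and f as the embedding values over the graph nodes, so that the manifold regularization term S penalizes embeddings of consecutive windows that lie far apart. A non-limiting sketch, in which the chain graph over consecutive windows and the helper names are hypothetical:

```python
import numpy as np

def chain_laplacian(n):
    """Graph Laplacian L = D - A of a path graph whose edges link
    consecutive time windows (node i connected to node i+1)."""
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = 1.0
    A[idx + 1, idx] = 1.0
    D = np.diag(A.sum(axis=1))  # degree matrix
    return D - A

def smoothness(embeddings):
    """S = sum over dimensions of f^T L f, which for a path graph equals the
    sum of squared distances between embeddings of consecutive windows."""
    L = chain_laplacian(embeddings.shape[0])
    return float(np.trace(embeddings.T @ L @ embeddings))

z = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])  # 3 windows, 2-dim latent
print(smoothness(z))  # 2.0: ||z1 - z0||^2 + ||z2 - z1||^2 = 1 + 1
```

Adding λ times this penalty to the negated ELBO, as in equation (5), therefore pulls the embeddings of temporally adjacent windows toward each other in the latent space.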
A detailed description of a network architecture 400 associated with the manifold regularization program 108A, 108B according to one embodiment is depicted in
X ∈ ℝ^(M×N)
The manifold regularization program 108A, 108B may include an encoder network 402 that is composed of 4 strided 2D convolution layers 406 with ReLU activation, followed by a flatten layer 416 and 2 fully connected layers 426. Each convolution layer uses filters of size (1, 3, c), where c is the channel depth of the input feature to the convolution layer. The output of the last dense layer is split into 2 parts: z_μ, the mean of the embedding, and z_log σ, the natural logarithm of the variance. The manifold regularization program 108A, 108B may also include a decoder network 404 composed of a fully connected layer followed by 4 strided de-convolution layers 436 and a 2D convolution layer 446 with filter size (1, 1, c), where c is the channel depth of the input feature to the convolution layer. All layers except the last use ReLU as the activation. Each de-convolution layer uses a filter size of (1, 3, c), where c is the channel depth of the input feature to the layer. The output of the last layer includes a reconstruction of the input data represented by:
X′ ∈ ℝ^(M×N)
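The split of the encoder's final dense output into z_μ and z_log σ, and the subsequent sampling of the latent variable, may be sketched as follows. The sampling step uses the reparameterization trick z = μ + σ·ε, which is standard for VAEs; the helper name and toy sizes below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

def split_and_sample(dense_out):
    """Split the last dense layer's output into the embedding mean z_mu and
    log-variance z_logvar, then draw z = mu + sigma * eps with eps ~ N(0, I)
    (the reparameterization trick, which keeps sampling differentiable)."""
    k = dense_out.shape[-1] // 2
    z_mu, z_logvar = dense_out[:k], dense_out[k:]
    eps = rng.normal(size=k)
    z = z_mu + np.exp(0.5 * z_logvar) * eps
    return z_mu, z_logvar, z

dense_out = np.array([0.1, -0.2, 0.3, 0.0, 0.0, 0.0])  # toy dense output, 2k = 6
z_mu, z_logvar, z = split_and_sample(dense_out)
print(z_mu.shape, z.shape)  # (3,) (3,)
```

The sampled z is what the decoder consumes, while z_μ and z_log σ feed the KL term of the ELBO.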
In turn, the manifold regularization program 108A, 108B may train the regularized-VAE model using overlapping time series windows. As previously described, the manifold regularization program 108A, 108B may use the regularized-VAE to analyze multivariate time series data using overlapping, sliding time windows on sequentially monitored data. An illustration 500 of overlapping, sliding time windows 502 applied to multivariate time series data is depicted in
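Extracting such overlapping, sliding time windows as training samples may be sketched as follows; the series, window length, and stride here are hypothetical:

```python
import numpy as np

def sliding_windows(data, window, stride):
    """Cut a (T, N) multivariate series into overlapping (window, N) blocks,
    advancing `stride` steps at a time, as training samples for the model."""
    T = data.shape[0]
    return np.stack([data[s:s + window]
                     for s in range(0, T - window + 1, stride)])

series = np.arange(20).reshape(10, 2).astype(float)  # 10 time steps, 2 variables
batch = sliding_windows(series, window=4, stride=2)
print(batch.shape)  # (4, 4, 2): 4 overlapping windows of 4 steps each
```

Because the stride is smaller than the window length, consecutive training samples share time steps, which is what gives the manifold regularization term meaningful adjacency between windows.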
In turn, the manifold regularization program 108A, 108B may use the trained regularized-VAE model to summarize and visualize multivariate time series data. Specifically, for example, the manifold regularization program 108A, 108B may use the trained regularized-VAE model and one or more plotting techniques (such as a 3-Dimensional graph, a scatter plot, and/or a heat map) to visualize the summarized multivariate time series data. More specifically, according to one embodiment, the manifold regularization program 108A, 108B may receive a representation of multivariate time series data, which may be represented by the formula:
X ∈ ℝ^(M×N), which the trained regularized-VAE encoder may map to a corresponding embedding, z ∈ ℝ^k.
To further visualize different states, transitions and loops in the data, the manifold regularization program 108A, 108B may also perform principal component analysis (PCA) on the embeddings of the time series data to reduce the dimension of an embedding space.
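The PCA step may be sketched via a singular value decomposition of the centered embeddings; the embedding dimension, sample count, and helper name below are hypothetical:

```python
import numpy as np

def pca_reduce(embeddings, n_components=2):
    """Project k-dimensional window embeddings onto their top principal
    components for plotting (e.g., a 2-D or 3-D scatter of the trajectory)."""
    centered = embeddings - embeddings.mean(axis=0)
    # SVD of the centered data: rows of Vt are the principal directions
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:n_components].T

rng = np.random.default_rng(1)
z = rng.normal(size=(50, 8))  # hypothetical 8-dim embeddings of 50 windows
z2 = pca_reduce(z, n_components=2)
print(z2.shape)  # (50, 2)
```

Plotting the reduced embeddings in time order then reveals the states, transitions, and loops described above as contiguous trajectories rather than scattered points.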
Then, according to one embodiment, and as depicted in
It may be appreciated that
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention. The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
Data processing system 1102, 1104 is representative of any electronic device capable of executing machine-readable program instructions. Data processing system 1102, 1104 may be representative of a smart phone, a computer system, PDA, or other electronic devices. Examples of computing systems, environments, and/or configurations that may be represented by data processing system 1102, 1104 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, network PCs, minicomputer systems, and distributed cloud computing environments that include any of the above systems or devices.
User client computer 102 (
Each set of internal components 1102a, b, also includes a R/W drive or interface 1132 to read from and write to one or more portable computer-readable tangible storage devices 1137 such as a CD-ROM, DVD, memory stick, magnetic tape, magnetic disk, optical disk or semiconductor storage device. A software program, such as a manifold regularization program 108A and 108B (
Each set of internal components 1102a, b also includes network adapters or interfaces 1136 such as TCP/IP adapter cards, wireless Wi-Fi interface cards, or 3G or 4G wireless interface cards or other wired or wireless communication links. The manifold regularization program 108A (
Each of the sets of external components 1104a, b can include a computer display monitor 1121, a keyboard 1131, and a computer mouse 1135. External components 1104a, b can also include touch screens, virtual keyboards, touch pads, pointing devices, and other human interface devices. Each of the sets of internal components 1102a, b also includes device drivers 1140 to interface to computer display monitor 1121, keyboard 1131, and computer mouse 1135. The device drivers 1140, R/W drive or interface 1132, and network adapter or interface 1136 comprise hardware and software (stored in storage device 1130 and/or ROM 1124).
It is understood in advance that although this disclosure includes a detailed description on cloud computing, implementation of the teachings recited herein are not limited to a cloud computing environment. Rather, embodiments of the present invention are capable of being implemented in conjunction with any other type of computing environment now known or later developed.
Cloud computing is a model of service delivery for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g. networks, network bandwidth, servers, processing, memory, storage, applications, virtual machines, and services) that can be rapidly provisioned and released with minimal management effort or interaction with a provider of the service. This cloud model may include at least five characteristics, at least three service models, and at least four deployment models.
Characteristics are as follows:
On-demand self-service: a cloud consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with the service's provider.
Broad network access: capabilities are available over a network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, laptops, and PDAs).
Resource pooling: the provider's computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to demand. There is a sense of location independence in that the consumer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or datacenter).
Rapid elasticity: capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.
Measured service: cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported providing transparency for both the provider and consumer of the utilized service.
Service Models are as follows:
Software as a Service (SaaS): the capability provided to the consumer is to use the provider's applications running on a cloud infrastructure. The applications are accessible from various client devices through a thin client interface such as a web browser (e.g., web-based e-mail). The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.
Platform as a Service (PaaS): the capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure including networks, servers, operating systems, or storage, but has control over the deployed applications and possibly application hosting environment configurations.
Infrastructure as a Service (IaaS): the capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, deployed applications, and possibly limited control of select networking components (e.g., host firewalls).
Deployment Models are as follows:
Private cloud: the cloud infrastructure is operated solely for an organization. It may be managed by the organization or a third party and may exist on-premises or off-premises.
Community cloud: the cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns (e.g., mission, security requirements, policy, and compliance considerations). It may be managed by the organizations or a third party and may exist on-premises or off-premises.
Public cloud: the cloud infrastructure is made available to the general public or a large industry group and is owned by an organization selling cloud services.
Hybrid cloud: the cloud infrastructure is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load-balancing between clouds).
A cloud computing environment is service oriented with a focus on statelessness, low coupling, modularity, and semantic interoperability. At the heart of cloud computing is an infrastructure comprising a network of interconnected nodes.
Referring now to
Referring now to
Hardware and software layer 60 includes hardware and software components. Examples of hardware components include: mainframes 61; RISC (Reduced Instruction Set Computer) architecture based servers 62; servers 63; blade servers 64; storage devices 65; and networks and networking components 66. In some embodiments, software components include network application server software 67 and database software 68.
Virtualization layer 70 provides an abstraction layer from which the following examples of virtual entities may be provided: virtual servers 71; virtual storage 72; virtual networks 73, including virtual private networks; virtual applications and operating systems 74; and virtual clients 75.
In one example, management layer 80 may provide the functions described below. Resource provisioning 81 provides dynamic procurement of computing resources and other resources that are utilized to perform tasks within the cloud computing environment. Metering and Pricing 82 provide cost tracking as resources are utilized within the cloud computing environment, and billing or invoicing for consumption of these resources. In one example, these resources may comprise application software licenses. Security provides identity verification for cloud consumers and tasks, as well as protection for data and other resources. User portal 83 provides access to the cloud computing environment for consumers and system administrators. Service level management 84 provides cloud computing resource allocation and management such that required service levels are met. Service Level Agreement (SLA) planning and fulfillment 85 provide pre-arrangement for, and procurement of, cloud computing resources for which a future requirement is anticipated in accordance with an SLA.
Workloads layer 90 provides examples of functionality for which the cloud computing environment may be utilized. Examples of workloads and functions which may be provided from this layer include: mapping and navigation 91; software development and lifecycle management 92; virtual classroom education delivery 93; data analytics processing 94; transaction processing 95; and manifold regularization 96. A manifold regularization program 108A, 108B (
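The manifold regularization workload corresponds to the regularization term described above: the generative model's training loss is augmented with a penalty that keeps temporally adjacent data points close together in the embedded space. A minimal NumPy sketch of such a loss follows, assuming a mean-squared reconstruction objective and a squared-difference temporal penalty; both choices, and the names `temporal_regularizer` and `lam`, are illustrative rather than the claimed implementation:

```python
import numpy as np

def temporal_regularizer(embeddings):
    # Sum of squared distances between consecutive embedded points.
    # Penalizing this quantity encourages the embedding to preserve the
    # temporal relationship between successive multivariate samples.
    diffs = np.diff(embeddings, axis=0)
    return float(np.sum(diffs ** 2))

def regularized_loss(x, x_hat, z, lam=0.1):
    # Reconstruction error of the generative model on the multivariate
    # time series x, plus the temporal-smoothness penalty on its
    # embedding z, weighted by the regularization strength lam.
    recon = float(np.mean((x - x_hat) ** 2))
    return recon + lam * temporal_regularizer(z)
```

In an optimization loop, minimizing `regularized_loss` trades off reconstruction fidelity against temporal smoothness of the embedding, so that the displayed embedding retains the time ordering of the input data points.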
The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.