The present invention relates generally to the field of computer model weather forecasts, and more particularly to weather forecasting using artificial neural network models.
Forecasting models may be utilized to evaluate historical weather associated with a geographic area in order to generate weather predictions relating to said geographic area. However, these weather predictions are subject to uncertainty arising from atmospheric factors, such as sky conditions, and from temporal factors, such as the difference between the current time period and the time period to which the applicable forecast applies. The aforementioned issues can result in forecasts that are applied to incorrect time windows or that fail to capture non-linear relationships across time windows. Naturally, these issues also inhibit the linking of seemingly disconnected weather anomalies across various locations, relationships referred to as teleconnections.
Accordingly, there is a need for a scalable, automated means of connecting weather anomalies across multiple geographic locations utilizing models that circumvent inefficiencies (e.g., incorrect predictions, misapplication of time windows, improper training, etc.).
Additional aspects and/or advantages will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the invention.
Embodiments of the present invention disclose a method, system, and computer program product for weather prediction. A computer receives a first weather event associated with a first location. The computer further inputs the first weather event into a machine learning model, the machine learning model having been trained via mapping historical weather data into a latent space and via identifying, in the latent space, climate teleconnections amongst historical weather events at various locations. The computer further receives from the machine learning model a weather prediction for a second location, the weather prediction being based on a predicted climate teleconnection between the first location and the second location with respect to the first weather event, wherein the machine learning model maps the first weather event into latent code for the latent space in order to generate the weather prediction for the second location.
In some embodiments, the computing device is configured to train a neural network deep learning model to compute time series models and the one or more time series forecasts. The use of a neural network increases the efficiency of the time series modeling and forecasting.
In some embodiments, the training of the deep learning model is unsupervised. The use of unsupervised training permits a broader recognition of patterns and aids in discovering hidden patterns.
In some embodiments, the system for weather prediction includes an encoder neural network and a decoder neural network configured to encode data into a latent space as data codes and decode the data codes in which the computer is configured to predict weather events pertaining to geographic locations based on teleconnections ascertained utilizing the data codes. The use of neural networks increases the efficiency of operations and facilitates training.
These and other objects, features, and advantages of the present invention will become apparent from the following detailed description of illustrative embodiments thereof, which is to be read in connection with the accompanying drawings. The various features of the drawings are not to scale as the illustrations are for clarity in facilitating one skilled in the art in understanding the invention in conjunction with the detailed description. In the drawings:
The descriptions of the various embodiments of the present invention will be presented for purposes of illustration but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces unless the context clearly dictates otherwise.
It should be understood that the Figures are merely schematic and are not drawn to scale. It should also be understood that the same reference numerals are used throughout the Figures to indicate the same or similar parts.
In the context of the present application, where embodiments of the present invention constitute a method, it should be understood that such a method is a process for execution by a computer, i.e. is a computer-implementable method. The various steps of the method therefore reflect various parts of a computer program, e.g. various parts of one or more algorithms.
Also, in the context of the present application, a system may be a single device or a collection of distributed devices that are adapted to execute one or more embodiments of the methods of the present invention. For instance, a system may be a personal computer (PC), a server or a collection of PCs and/or servers connected via a network such as a local area network, the Internet and so on to cooperatively execute at least one embodiment of the methods of the present invention.
As described herein, the term “latent space” refers to a multi-dimensional space including feature values that are not directly interpreted, but such feature values are used to encode a meaningful internal representation of externally observed events. In addition, the term “a lower-dimensional latent space” refers to a reduction of an original spectral dimension to increase the efficiency of a search. In other words, the latent space is a collection of deep structural patterns that have high explanatory value in portraying the variability of time series over space and time.
As described herein, a “time series” is a sequence of data points taken at successive, equally spaced points in time. “Time series forecasting” relates to the use of artificial intelligence to predict future values based on previously observed values. Time series data has a natural temporal ordering.
As described herein, a “data code” is a product of dimensionality reduction in which information from an initial space is compressed into data points within a latent space in a manner in which the result of the compression is an encodable reference to said information. Because the encoding process encodes inputs as distributions over the latent space, the data codes are samples of the respective distributions.
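By way of illustration only, the following Python sketch shows how an encoder of the kind described above might return data codes as samples of the distributions into which inputs are encoded; the class name, layer sizes, and dimensions are hypothetical and are not drawn from the disclosed embodiments.

```python
import torch
import torch.nn as nn

class WeatherEncoder(nn.Module):
    """Hypothetical encoder: maps a flattened weather-event vector to a
    distribution over the latent space; the returned data code is a sample
    of that distribution."""
    def __init__(self, input_dim: int, latent_dim: int):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, latent_dim)       # mean of the latent distribution
        self.log_var = nn.Linear(128, latent_dim)  # log-variance of the latent distribution

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.backbone(x)
        mu, log_var = self.mu(h), self.log_var(h)
        std = torch.exp(0.5 * log_var)
        # Reparameterized sampling: the data code is a sample of the encoded distribution.
        return mu + std * torch.randn_like(std)

# Example: encode one 256-feature weather event into an 8-dimensional data code.
data_code = WeatherEncoder(256, 8)(torch.randn(1, 256))
```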
The following described exemplary embodiments provide a method, computer system, and computer program product for weather prediction. Modeling and forecasting across massive amounts of time-series data face multiple limitations regarding factors such as computing resources, prediction accuracy, and time application (e.g., time constraints, incorrect time windows, etc.). In particular, efficiently and accurately predicting weather across multiple geographic locations encounters a myriad of issues due to volatile geography-specific issues such as temporal shifts and weather impactors (e.g., atmospheric pressure, sea surface temperatures, etc.), with the aforementioned factors also impacting the amount of time required to process the voluminous amounts of data in real time. For example, the combination of voluminous sources of historical climate data, the integration of abundant spatial-temporal data, and the misapplication of causal analyses may directly impact predictions relating to a specific geographic area, much less teleconnections derived from said predictions. Teleconnections may be established via models; however, the variability of time, weather, and location not only directly impacts the accuracy of these teleconnections, but also imposes limitations on the models from a computing/processing standpoint. As such, the present embodiments have the capacity to perform weather predictions across various geographic locations in a manner that not only increases the accuracy of teleconnections by accounting for temporal shifts, but also reduces the computing processing and power required to do so by utilizing lower-dimensional spaces to map unconventional data codes representative of weather events and by decoding the data codes to facilitate synthesis of weather events.
Referring now to
In some embodiments, first geographic location historical climate data database 145 and second geographic location historical climate data database 155 are configured to include a plurality of time-series data pertaining to the respective geographic locations (or other locations, if applicable, in various embodiments of the invention). The time-series data stored by database 145 and database 155 is derived from sources including but not limited to weather/climate agencies, external databases accessed by one or more crawlers associated with server 120, climate research organizations, satellite-based systems, crowd-sourcing systems, sensor-based systems, or any other applicable weather/climate data source known to those of ordinary skill in the art. For descriptive purposes, server 120 is designed to transmit one or more data feeds sourced from first geographic location historical climate data database 145 and/or second geographic location historical climate data database 155 in order for modeling module 170 to train a neural network deep learning model to compute time series models; however, the time series models may be built based on the time series values with any suitable and known model building method. The time series models may have many forms and represent different stochastic processes. The one or more data feeds may include textual data, image data, climate data (e.g., temperature, wind speed, precipitation, etc.), or any other applicable type of data and/or combination thereof sourced from first geographic location historical climate data database 145 and/or second geographic location historical climate data database 155, in which said data feeds may represent one or more weather events and/or derivatives thereof associated with the respective geographic locations. In some embodiments, the one or more data feeds include input climate data images accounting for spherical images, 2D planar images, or any other applicable data image configured to be processed by a neural network. It should be noted that modeling module 170 functions as an autoencoder configured to utilize data training to regularize the encoding distribution in order to support a latent space designed for efficient data generation. In some embodiments, modeling module 170 inputs data from the data feeds within the models as time-series data accounting for periodic samples of weather events. Server 120 and/or modeling module 170 are configured to detect and extract components and/or impacts within the climate data images including but not limited to human contributions (e.g., greenhouse gases, deforestation, overpopulation, etc.), land surface elements (e.g., hydrology, vegetation, precipitation coverage, etc.), atmospheric impacts (e.g., nimbus-related data, etc.), sea ice elements (e.g., radiation absorption, heat exchange between ocean and atmosphere, etc.), or any other applicable ascertainable climate data image features known to those of ordinary skill in the art. In some embodiments, each data feed is configured to include a distinct type of data; however, the data feeds may be encoded into one or more vectors configured to be representative of weather events. For example, two data feeds derived from first geographic location historical climate data database 145 may have distinct types of data in which one data feed includes textual data and the other data feed includes time series data associated with a weather event at first geographic location 140.
Both forms of data may be aggregated into one or more vectors representative of the weather event at the respective geographic location.
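As a non-limiting illustration of such aggregation, the sketch below concatenates a hypothetical text-feed embedding with summary statistics of a time-series feed into a single event vector; the function name and feature choices are assumptions made solely for illustration.

```python
import numpy as np

def aggregate_feeds(text_embedding: np.ndarray, series: np.ndarray) -> np.ndarray:
    """Concatenate a (hypothetical) text-feed embedding with summary statistics
    of a time-series feed into a single weather-event vector."""
    series_features = np.array([series.mean(), series.std(), series.min(), series.max()])
    return np.concatenate([text_embedding, series_features])

# Example: a 16-dimensional text embedding plus 24 hourly precipitation readings.
event_vector = aggregate_feeds(np.random.rand(16), np.random.rand(24))
```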
Server 120 is configured to support functions such as natural language processing (NLP), image processing, encoding/decoding, noise reducers, or any other applicable functions configured to optimize data for transmission of data feeds from databases 145 and 155 along with applicable third party databases accessible by server 120. In addition, server 120 is configured to generate a centralized platform designed to allow users to access components of environment 100 such as user inputs (e.g., hypothesis, prior knowledge, analytics review, etc.). It should be noted that server 120 may perform aggregation, filtration, optimization, etc. of data derived from databases 145 and 155 and other applicable databases in order to generate the data feeds and store them in database 130 in which the data feeds are transmitted to modeling module 170 for training and processing. Teleconnection module 160 is communicatively coupled to modeling module 170 and is configured to identify causal connections/correlations between source and target datasets; however, one of the purposes of teleconnection module 160 is to ascertain correlated weather events based on data received from historical climate data sources and any other applicable source. For example, teleconnection module 160 may monitor extreme weather events at first geographic location 140 and other applicable geographic locations over periods of time in order to predict a weather event associated with second geographic location 150 based on a plurality of tele-connected extreme weather events. Modeling module 170 is configured to be data agnostic allowing different types of climate data to seamlessly be used within the same framework regardless of specific characteristics such as time, location, etc.
Referring now to
Modeling module 170 performs mapping in the latent space by performing contrastive learning on the climate data. As previously stated, the resulting output of encoder 210 is the plurality of data codes, in which each data code is encoded and is representative of, and/or an identifier of, one or more samples of a weather event. Modeling module 170 is configured to perform the mapping to the latent space in a manner that regularizes the covariance matrix and the mean of the distributions returned by encoder 210 in order to prevent overfitting. Modeling module 170 accomplishes this by enforcing that the returned distributions remain in close proximity to a standard normal distribution.
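A minimal sketch of the regularization described above, assuming the encoder returns a mean and log-variance per input and that the closed-form Kullback-Leibler term of a variational autoencoder is used to keep those distributions near a standard normal; the names and dimensions are illustrative.

```python
import torch

def latent_regularization(mu: torch.Tensor, log_var: torch.Tensor) -> torch.Tensor:
    """Kullback-Leibler divergence between the encoder's diagonal-Gaussian output
    and a standard normal; adding this term to the training loss keeps the encoded
    distributions close to N(0, I), which helps prevent overfitting."""
    return -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp(), dim=1).mean()

# Example: regularize a batch of 32 encoded weather events in an 8-dimensional latent space.
penalty = latent_regularization(torch.randn(32, 8), torch.zeros(32, 8))
```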
Modeling module 170 clusters the vectors of weather events based on similarity/commonality of one or more elements of the weather events, in which the clusters are correlated groups based upon content, context, etc. For example, weather events associated with heavy rain within first geographic location 140 (e.g., precipitation levels above a threshold) may be clustered despite occurring across various periods of time. In some embodiments, the vectors are sequentially ordered based on a corresponding time stamp associated with each vector (e.g., the time data was received by a sensor, the day of the weather event, etc.). Clustering may be performed based on one or more similarity measures including but not limited to Euclidean distance, Manhattan distance, dynamic time warping (DTW) distance, Minkowski distance, cosine distance, correlation coefficients (e.g., Pearson, Spearman), expectation maximization with a Gaussian mixture model, or any other applicable similarity measuring mechanism known to those of ordinary skill in the art. It should be noted that one of the purposes of clustering weather events is to allow teleconnection module 160 to generate predictions of weather events associated with second geographic location 150 based on at least data derived from first geographic location historical climate data database 145. In some embodiments, while modeling module 170 is deploying the deep learning models, server 120 and/or teleconnection module 160 is configured to extract features from first geographic location historical climate data database 145 and/or second geographic location historical climate data database 155, in which the features include but are not limited to functional dependencies, correlations, spatial-temporal data, the type of sensor the applicable sensor data was collected on, climate variables, size of area covered, data resolution, image quality, noise level, land cover usage, or any other ascertainable features known to those of ordinary skill in the art.
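By way of example only, one of the listed similarity measures (Euclidean distance) could be applied with an off-the-shelf clustering routine as sketched below; the number of clusters and data shapes are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical latent vectors for 100 weather events (rows ordered by timestamp).
event_codes = np.random.rand(100, 8)

# Group events whose latent codes are similar; Euclidean distance is used here,
# but DTW, cosine, or correlation-based measures could be substituted.
labels = KMeans(n_clusters=5, n_init=10).fit_predict(event_codes)
```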
Teleconnection module 160 stores the one or more predicted teleconnections in a teleconnection database 240 in which teleconnection database 240 is designed to be crawled via a teleconnections crawler configured to validate a plurality of teleconnection hypotheses. In some embodiments, a plurality of hypotheses may be automated inferences or targets generated by a hypotheses module communicatively coupled to teleconnection module 160, in which the hypotheses module is utilized by modeling module 170 during the training phase. In some embodiments, the hypotheses are derived from an input module 250 designed to ascertain the hypotheses from a plurality of inputs provided by users on a computing device operating the centralized platform, or the teleconnection hypotheses may be based on a combination of data derived from the prior knowledge database and/or input module 250. For example, the hypotheses module may receive from input module 250 user inputs such as but not limited to a geographic region of interest for where the teleconnections will be analyzed, a range of valid temporal shifts, and/or combinations of weather data pertaining to first geographic location 140 and second geographic location 150 accounting for the temporal shifts. It should be noted that the hypotheses module enables modeling module 170 to perform clustering of weather event representations within the latent space into vectors based upon context/content, location, temporal shifts, and/or a combination thereof.
In some embodiments, the clustering of vectors within the latent space results in the characterization of teleconnection database 240, in which the clustered vectors may include similar differences among the latent representations of weather data events integrating the locations and the temporal shifts (i.e., all combinations of time and temporal shift within the geographic location). Knowledge module 230 is communicatively coupled to teleconnection module 160, allowing the teleconnections crawler to validate one or more hypotheses based on at least data derived from the prior knowledge database. Input module 250 is further designed to support interaction and customization of one or more attributes or mechanisms of architecture 200 via the centralized platform. For example, manipulation of data utilized by modeling module 170 may be necessary in order to ascertain the optimal accuracy for curation of teleconnections that provide the best prediction of a weather event associated with second geographic location 150 based on the results of the teleconnection crawl searching teleconnection database 240. Target prediction times specified by users or a hypothesis allow modeling module 170 to operate models that include a designated time period for which a predicted weather event occurs. For example, a cluster may contain similar differences for the latent representation of A_t vs. B_(t+shift), in which t represents the time of the weather event, A represents a first geographic location, B represents a second geographic location, and shift represents the temporal shift, and wherein the computation is performed for each t, shift, and pair A and B within a designated region of interest. In a preferred embodiment, the predicted teleconnection is one of the estimated valid teleconnections for the code pair A and B plus the temporal shift. Server 120 ultimately simulates weather scenarios for the predicted weather at B that would occur at the temporal shift from time t.
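The paired-difference computation described in the example above may be illustrated, purely as a sketch, by pairing the latent code of location A at time t with the code of location B at time t + shift for each candidate shift; the array shapes and shift range below are assumptions.

```python
import numpy as np

def paired_differences(codes_a: np.ndarray, codes_b: np.ndarray, shifts: range) -> dict:
    """For each temporal shift, pair the latent code of location A at time t with
    the code of location B at time t + shift and record their difference; tight
    clusters of similar differences suggest a candidate teleconnection."""
    diffs = {}
    for shift in shifts:
        t_max = len(codes_a) - shift
        diffs[shift] = codes_b[shift:shift + t_max] - codes_a[:t_max]
    return diffs

# Example: one year of daily latent codes for two locations, shifts of 1 to 13 days.
diffs_by_shift = paired_differences(np.random.rand(365, 8), np.random.rand(365, 8), range(1, 14))
```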
Server 120 identifies a plurality of temporal shifts within data derived from one or more of database 130, first geographic location historical climate data database 145, the prior knowledge database, and/or input module 250 in order to normalize time associated with weather events of geographic locations. The temporal shifts account for time windows of weather events in which the ascertainable difference in time and other ascertainable weather-related data between samples of weather events associated with first geographic location 140 are integrated into the processing of modeling module 170 and used to predict a weather event associated with second geographic location 150.
Decoder 220 is designed to decode the plurality of data codes in order for a synthetic data module to generate a plurality of synthetic climate data associated with a predicted weather event pertaining to second geographic location 150. It should be noted that the plurality of synthetic climate data may be a representation of a predicted weather event associated with second geographic location 150 or data utilized by server 120 to generate a predicted weather event based on the optimal model(s) generated by modeling module 170 and teleconnections detected by the teleconnections crawler within teleconnection database 240.
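As a non-limiting illustration, a decoder of the kind described might map a data code back to a gridded synthetic weather field, as sketched below; the latent dimension and grid size are hypothetical.

```python
import torch
import torch.nn as nn

class WeatherDecoder(nn.Module):
    """Hypothetical decoder: maps an 8-dimensional data code back to a
    32 x 32 synthetic weather field for the second geographic location."""
    def __init__(self, latent_dim: int = 8, grid_cells: int = 32 * 32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                 nn.Linear(128, grid_cells))

    def forward(self, code: torch.Tensor) -> torch.Tensor:
        return self.net(code).reshape(-1, 32, 32)

# Example: decode one data code into a synthetic field on a 32 x 32 grid.
synthetic_field = WeatherDecoder()(torch.randn(1, 8))
```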
Referring now to
Modeling module 170 iteratively not only trains on applicable datasets and operates the respective models, but also utilizes the latent space to assist with generation of outputs that represent missing variables through successive iterations. For example, modeling module 170 may output prediction samples of weather events based on one or more latent variables in the latent space, which may be calculated from predicted latent samples. It should be noted that modeling module 170 utilizes encoder 210 not only to encode the weather events for mapping into the latent space, but also for efficiency purposes, in which encoder 210 cleans/de-noises applicable data and determines applicable weights for weather data within the data feeds. Encoder 210 ultimately reduces the amount of data that decoder 220 has to decode.
Modeling module 170 selects the applicable model identified by server 120 based on the relevant weather data and associated location, which in some instances is provided by user 320 via input module 250. In some embodiments, server 120 identifies the applicable model based upon the location of the weather event, content/context similarity, or any other applicable factor. Modeling module 170 may be used to fill gaps of information via predictions based on data derived from previous models. Once the applicable model is selected, the weather event and/or samples thereof are mapped in the latent space by modeling module 170 via encoder 210 encoding the applicable weather data and assigning said data the plurality of data codes. Within the latent space, teleconnection module 160 continuously ascertains teleconnections by monitoring teleconnection patterns within data collected from various geographic locations across various periods of time. For example, changes to the atmosphere and/or ocean may significantly impact the weather of first geographic location 140, or changes to the amount of clouds associated with first geographic location 140 may significantly impact the weather of second geographic location 150. Teleconnection module 160 is configured to account for modifications caused by temporal effects, significant climate changes, etc. Furthermore, the one or more hypotheses are utilized to map two correlated weather events closely within the latent space, while contrastive learning is utilized to internally optimize the agreement in the latent space between the two weather events (e.g., at different times).
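The agreement-optimizing contrastive learning mentioned above could take the form of an InfoNCE-style objective, sketched below under the assumption that corresponding rows of the two inputs are latent codes of correlated weather events; the temperature value and batch size are illustrative.

```python
import torch
import torch.nn.functional as F

def contrastive_agreement_loss(codes_a: torch.Tensor, codes_b: torch.Tensor,
                               temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: row i of codes_a and row i of codes_b are latent codes
    of correlated weather events (e.g., the same event at different times); the
    other rows in the batch act as negatives to be pushed apart."""
    a = F.normalize(codes_a, dim=1)
    b = F.normalize(codes_b, dim=1)
    logits = a @ b.t() / temperature   # pairwise cosine similarities
    targets = torch.arange(len(a))     # matching rows are the positive pairs
    return F.cross_entropy(logits, targets)

# Example: a batch of 32 pairs of correlated weather-event codes.
loss = contrastive_agreement_loss(torch.randn(32, 8), torch.randn(32, 8))
```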
Server 120 stores the plurality of data codes in database 130, in which the data codes are encoded samples of the weather data and the teleconnection crawler utilizes the data codes to query teleconnection database 240. In addition to including teleconnections, teleconnection database 240 may also include correlations with known teleconnection indices. In particular, the plurality of data codes may be references to the weather events in which data derived from the correlations are accounted for in the data codes prior to the data codes being decoded by decoder 220. In some embodiments, nodes of the one or more layers of the applicable neural network operated by modeling module 170 represent vectors of the time series values, in which at least one of the vectors is a temporal vector. Teleconnection database 240 outputs the applicable teleconnection; however, it is the cluster of paired differences within the latent space, assigned respective data codes, that allows the optimal weather event of first geographic location 140 to be ascertained. As teleconnection module 160 retrieves suitable teleconnections based on the combination of data received from input module 250 (e.g., desired geographic location, applicable timeframe, etc.), data derived from the prior knowledge database, database 130, and/or any other applicable data source, modeling module 170 continuously uses inferences to support creation of vectors including paired differences via server 120.
Decoder 220 is configured to decode the plurality of data codes based upon the result of the query of teleconnection database 240. It should be noted that the decoding process ascertains one or more sets of values pertaining to second geographic location 150 integrating the plurality of temporal shifts in which the sets of values are congruent with the weather data ascertained from the plurality of data codes pertaining to first geographic location 140 except for the integration of the plurality of temporal shifts and data specific to second geographic location 150. The results of the decoded plurality of data codes are transmitted to synthetic data module 260 in which synthetic data module 260 is configured to generate a predicted weather event associated with second geographic location 150 based on the aforementioned (e.g. region of interest, temporal shifts, etc.). The predicted weather event is configured to be presented to user 320 via user interfaces on the centralized platform operated by computing device 310.
Referring now to
At step 420 of process 400, server 120 receives the plurality of hypotheses from the hypotheses module. In some embodiments, the hypotheses are derived from input module 250 in which user 320 provides at least a geographic region of interest for where the teleconnections will be analyzed and/or a range of valid temporal shifts relating to a geographic area.
At step 430 of process 400, modeling module 170 selects the applicable trained model relating to the geographic location ascertained from input module 250, in which the selection of the model may be based upon similarity of contextual information associated with the applicable weather events and/or weather data ascertained during step 410. In some embodiments, user 320 selects the applicable trained model from a list of pretrained models configured to be filtered by criteria selected by user 320 (e.g., data, type of weather event, etc.).
At step 440 of process 400, server 120 instructs modeling module 170 to train encoder 210 and decoder 220 in order for server 120 to compute the plurality of data codes, in which the training process will include one or more elements of the geographic location specified based on data derived from input module 250. For example, the training process includes weather data associated with second geographic location 150. In some embodiments, the trained models are stored in database 130 along with contextual information used during the training such as geographic location and time window. In some embodiments, the plurality of data codes are computed based on encoder 210 which has been trained by modeling module 170.
At step 450 of process 400, server 120 instructs encoder 210 to encode the weather data representative of samples of weather events derived from the applicable model selected in step 430 into the latent space. The encoding of the weather data includes encoder 210 assigning the plurality of data codes across the weather data. In a preferred embodiment, each weather event is assigned at least one data code; however, a single data code may be assigned to multiple weather events based upon one or more similarities detected among the weather events.
At step 460 of process 400, server 120 instructs teleconnection module 160 to begin the process of searching for teleconnections that align with at least one of the hypotheses derived from the hypothesis module. As described herein, a hypothesis is a user input representing future weather predictions pertaining to a geographic location. In some embodiments, the teleconnections crawler utilizes the plurality of data codes to query teleconnection database 240 and the teleconnections crawler utilizes a contrastive learning neural network or any other applicable machine learning techniques used to learn the general features of a dataset without labels via teaching the model which data points are similar or different, as part of the teleconnection database 240 crawling process. In some embodiments, teleconnection module 160 instructs the hypotheses module to perform a validation process in order for server 120 to cluster the one or more vectors. The clustering process may be based upon one or more paired differences between sets of weather data in which the paired differences may pertain to distinctions in geographic locations, time window, temporal shifts, or any other ascertainable data and/or metadata derived from database 130, first geographic location historical climate data database 145 and second geographic location historical climate data database 155, the prior knowledge database, and/or any other applicable data source.
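Purely as an illustration of how the teleconnections crawler might use data codes to query teleconnection database 240, the sketch below performs a simple nearest-neighbor search; the database contents, code dimension, and distance measure are assumptions.

```python
import numpy as np

def query_teleconnections(query_code: np.ndarray, stored_codes: np.ndarray,
                          k: int = 5) -> np.ndarray:
    """Return indices of the k stored entries whose latent data codes are nearest
    (Euclidean distance) to the query code, a simplified stand-in for the
    teleconnections crawler's search of teleconnection database 240."""
    distances = np.linalg.norm(stored_codes - query_code, axis=1)
    return np.argsort(distances)[:k]

# Example: query a database of 10,000 entries with one 8-dimensional data code.
matches = query_teleconnections(np.random.rand(8), np.random.rand(10_000, 8))
```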
At step 470 of process 400, teleconnection module 160 selects the applicable teleconnection within teleconnection database 240 in response to the teleconnection crawler detecting the teleconnection that aligns with the applicable data. It should be noted that the detected teleconnection is a representation of a relevant predicted weather event associated with second geographic location 150 based on weather events associated with first geographic location 140. The teleconnection is detected based upon at least one of the clustered pair differences, temporal shifts, weather event context/content, etc. Selected teleconnections may be stored in database 130.
At step 480 of process 400, decoder 220 decodes the applicable data code associated with the teleconnection selected by teleconnection module 160 into realistic data. Realistic data may include but is not limited to synthesized data representing spatial-temporal statistical properties or any other applicable variations of data resembling that within first geographic location historical climate data database 145 and second geographic location historical climate data database 155 (e.g., historical weather data from applicable sources). In some embodiments, during this decoding step random noise samples are decoded to provide predictive distributions which are configured to be integrated into the realistic data. It should be noted that the realistic data is configured to be a representation of and/or utilized to generate a weather event prediction associated with second geographic location 150. The predictions can include any properties of the joint distribution, including the mean or median, variance, different quantiles, etc. Encoder 210 and decoder 220 are designed to be communicatively coupled through modeling module 170 in order to ensure that modeling module 170 is continuously optimizing/reducing the size of the output data resulting from the decoding compared to the size of the input data being encoded via encoder 210. This configuration not only reduces the computing resources necessary for server 120 to sustain processing, but also increases the efficiency of operations because decoder 220 has a smaller amount of data to process.
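One way the decoding of random noise samples into predictive distributions might be sketched, assuming a standard-normal latent prior and a generic decoder callable; the sample count and quantiles shown are illustrative only.

```python
import torch

def predictive_distribution(decoder, latent_dim: int = 8, n_samples: int = 200) -> dict:
    """Decode random latent noise samples into an ensemble of synthetic outputs and
    summarize them as the mean and quantiles of a predictive distribution."""
    with torch.no_grad():
        noise = torch.randn(n_samples, latent_dim)  # random samples of the latent prior
        ensemble = decoder(noise)                   # one synthetic output per sample
    return {"mean": ensemble.mean(dim=0),
            "q10": ensemble.quantile(0.10, dim=0),
            "q90": ensemble.quantile(0.90, dim=0)}

# Example with a stand-in decoder mapping codes to a flattened 32 x 32 field.
stats = predictive_distribution(torch.nn.Linear(8, 32 * 32))
```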
At step 490 of process 400, synthetic data module 260 utilizes the decoded data and generates synthesized realistic weather field data representing a weather event pertaining to second geographic location 150 in accordance with the data derived from input module 250. For example, the synthesized realistic weather field data may be a set of points indexed by two-dimensional coordinates (e.g., latitude and longitude) visualized by images.
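By way of example, a synthesized weather field indexed by latitude and longitude could be assembled as sketched below; the region bounds, grid size, and values are placeholders.

```python
import numpy as np

# Hypothetical 32 x 32 grid of synthesized values, indexed by latitude/longitude
# over a placeholder region of interest.
lats = np.linspace(40.0, 45.0, 32)
lons = np.linspace(-75.0, -70.0, 32)
field = np.random.rand(32, 32)  # stand-in for the decoded weather field values

lon_grid, lat_grid = np.meshgrid(lons, lats)
points = np.column_stack([lat_grid.ravel(), lon_grid.ravel(), field.ravel()])
# Each row is (latitude, longitude, value), ready for map visualization.
```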
At step 495 of process 400, server 120 presents the synthesized realistic weather field data to user 320 via the centralized platform operating on computing device 310. The synthesized realistic weather field data may be presented via one or more graphical representations including but not limited to graphs, charts, weather map visualizations, interactive text data, or any other applicable graphical representation known to those of ordinary skill in the art. The synthesized realistic weather field data may be used by one or more down-stream systems including but not limited to insurance weather-aware risk modeling software, agricultural counter-measuring systems, traffic data platforms, flood management systems, etc.
With the foregoing overview of the example architecture, it may be helpful now to consider a high-level discussion of an example process.
At step 510 of process 500, server 120 receives a first weather event associated with first geographic location 140. However, server 120 may receive weather events and/or an aggregation of similar weather events from a plurality of geographic locations in order to ascertain a predicted weather event for a region of interest specified by input module 250.
At step 520 of process 500, server 120 inputs the first weather event into a machine learning model operated by modeling module 170. In some embodiments, the machine learning model has been trained by mapping historical weather data into the latent space, which allows server 120 to identify climate teleconnections amongst historical weather events at various locations within the latent space.
At step 530 of process 500, server 120 receives a weather prediction for second geographic location 150, the weather prediction being based on a predicted climate teleconnection between the first location and the second location with respect to the first weather event, wherein the teleconnections machine learning model maps the first weather event into latent code for the latent space in order to generate the weather prediction for second geographic location 150.
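A high-level, non-limiting sketch of the flow of process 500 is given below, with stand-in components for the encoder, decoder, and teleconnection lookup; none of the names or shapes are taken from the disclosed embodiments.

```python
import torch

def predict_second_location(event_vector, encoder, decoder, find_teleconnection):
    """High-level flow of process 500 (illustrative): encode the first-location
    event into a latent data code, look up a predicted teleconnection for that
    code, and decode the tele-connected code into a prediction for the second
    location."""
    code_a = encoder(event_vector)               # step 520: map the event into the latent space
    code_b, shift = find_teleconnection(code_a)  # look up a teleconnection and its temporal shift
    return decoder(code_b), shift                # step 530: prediction for the second location

# Example with stand-in components for the encoder, decoder, and lookup.
prediction, shift = predict_second_location(
    torch.randn(1, 256),
    torch.nn.Linear(256, 8),
    torch.nn.Linear(8, 32 * 32),
    lambda code: (code, 3),
)
```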
Data processing system 602, 604 is representative of any electronic device capable of executing machine-readable program instructions. Data processing system 602, 604 may be representative of a smart phone, a computer system, PDA, or other electronic devices. Examples of computing systems, environments, and/or configurations that may be represented by data processing system 602, 604 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, network PCs, minicomputer systems, and distributed cloud computing environments that include any of the above systems or devices.
The one or more servers may include respective sets of components illustrated in
Each set of components 600 also includes a R/W drive or interface 614 to read from and write to one or more portable computer-readable tangible storage devices 608 such as a CD-ROM, DVD, memory stick, magnetic tape, magnetic disk, optical disk or semiconductor storage device. A software program, such as computing event management system 210, can be stored on one or more of the respective portable computer-readable tangible storage devices 608, read via the respective R/W drive or interface 618 and loaded into the respective hard drive.
Each set of components 600 may also include network adapters (or switch port cards) or interfaces 616 such as TCP/IP adapter cards, wireless Wi-Fi interface cards, or 3G or 4G wireless interface cards or other wired or wireless communication links. Applicable software can be downloaded from an external computer (e.g., server) via a network (for example, the Internet, a local area network, or other wide area network) and the respective network adapters or interfaces 616. From the network adapters (or switch port adaptors) or interfaces 616, the centralized platform is loaded into the respective hard drive 608. The network may comprise copper wires, optical fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
Each of components 600 can include a computer display monitor 620, a keyboard 622, and a computer mouse 624. Components 600 can also include touch screens, virtual keyboards, touch pads, pointing devices, and other human interface devices. Each of the sets of components 600 also includes device processors 602 to interface to computer display monitor 620, keyboard 622 and computer mouse 624. The device drivers 612, R/W drive or interface 618 and network adapter or interface 618 comprise hardware and software (stored in storage device 604 and/or ROM 606).
It is understood in advance that although this disclosure includes a detailed description on cloud computing, implementation of the teachings recited herein is not limited to a cloud computing environment. Rather, embodiments of the present invention are capable of being implemented in conjunction with any other type of computing environment now known or later developed.
Cloud computing is a model of service delivery for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, network bandwidth, servers, processing, memory, storage, applications, virtual machines, and services) that can be rapidly provisioned and released with minimal management effort or interaction with a provider of the service. This cloud model may include at least five characteristics, at least three service models, and at least four deployment models.
Characteristics are as follows:
Service Models are as follows:
Deployment Models are as follows:
A cloud computing environment is service oriented with a focus on statelessness, low coupling, modularity, and semantic interoperability. At the heart of cloud computing is an infrastructure comprising a network of interconnected nodes.
Referring now to
Referring now to
Hardware and software layer 60 includes hardware and software components. Examples of hardware components include: mainframes 61; RISC (Reduced Instruction Set Computer) architecture based servers 62; servers 63; blade servers 64; storage devices 65; and networks and networking components 66. In some embodiments, software components include network application server software 67 and database software 68.
Virtualization layer 70 provides an abstraction layer from which the following examples of virtual entities may be provided: virtual servers 71; virtual storage 72; virtual networks 73, including virtual private networks; virtual applications and operating systems 74; and virtual clients 75.
In one example, management layer 80 may provide the functions described below. Resource provisioning 81 provides dynamic procurement of computing resources and other resources that are utilized to perform tasks within the cloud computing environment. Metering and Pricing 82 provide cost tracking as resources are utilized within the cloud computing environment, and billing or invoicing for consumption of these resources. In one example, these resources may include application software licenses. Security provides identity verification for cloud consumers and tasks, as well as protection for data and other resources. User portal 83 provides access to the cloud computing environment for consumers and system administrators. Service level management 84 provides cloud computing resource allocation and management such that required service levels are met. Service Level Agreement (SLA) planning and fulfillment 85 provide pre-arrangement for, and procurement of, cloud computing resources for which a future requirement is anticipated in accordance with an SLA.
Workloads layer 90 provides examples of functionality for which the cloud computing environment may be utilized. Examples of workloads and functions which may be provided from this layer include: mapping and navigation 91; software development and lifecycle management 92; virtual classroom education delivery 93; data analytics processing 94; and transaction processing 95.
Based on the foregoing, a method, system, and computer program product have been disclosed. However, numerous modifications and substitutions can be made without deviating from the scope of the present invention. Therefore, the present invention has been disclosed by way of example and not limitation.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” “including,” “has,” “have,” “having,” “with,” and the like, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The descriptions of the various embodiments of the present invention have been presented for purposes of illustration but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. It will be appreciated that, although specific embodiments have been described herein for purposes of illustration, various modifications may be made without departing from the spirit and scope of the embodiments. In particular, transfer learning operations may be carried out by different computing platforms or across multiple devices. Furthermore, the data storage and/or corpus may be localized, remote, or spread across multiple systems. Accordingly, the scope of protection of the embodiments is limited only by the following claims and their equivalents.