Time series data comprises sequential data points observed at successive time intervals (e.g., hourly, daily, weekly, monthly, annually, etc.). Monthly sales, daily stock prices, and annual profits are examples of time series data. There are numerous types of prediction models. Time series forecasting may employ one or more processes to observe historical values of time series data and predict future values of the time series data. Time series forecasting results may be presented to a user in a visualization of the predicted future values.
In addition to accurately predicting future values, visualizations of a time series prediction determination should clearly convey the factors contributing to the predicted future values so that, for example, the visualizations can be efficiently interpreted and understood by observers thereof.
Features and advantages of the example embodiments, and the manner in which the same are accomplished, will become more readily apparent with reference to the following detailed description taken in conjunction with the accompanying drawings.
Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated or adjusted for clarity, illustration, and convenience.
In the following description, specific details are set forth in order to provide a thorough understanding of the various example embodiments. It should be appreciated that various modifications to the embodiments will be readily apparent to those skilled in the art, and the one or more principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Moreover, in the following description, numerous details are set forth for the purpose of explanation. However, one of ordinary skill in the art should understand that embodiments may be practiced without the use of these specific details. In other instances, well-known structures, methods, procedures, components, and circuits are not shown or described so as not to obscure the description with unnecessary detail. Thus, the present disclosure is not intended to be limited to the embodiments shown but is to be accorded the widest scope consistent with the principles and features disclosed herein.
The present disclosure relates to time series prediction (i.e., forecasting or estimating a future outcome) processes that might be implemented by a variety of devices and systems configured to execute computing applications to effectuate the time series prediction processes disclosed herein. Embodiments of the disclosed time series predictions may be provided in an on-premises environment, a cloud computing environment (e.g., Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS), etc.), and combinations thereof. For context, a prediction module or engine of a computing system might use several modeling techniques to generate a “best” possible prediction for a set of time series data. The system may further compare the predictions generated using each of the different modeling techniques and (automatically) determine the “best” prediction based on one or more quality-of-prediction objectives to generate predicted values for the input set of time series data.
In some aspects, a significant feature of time series predicting is the ease or clarity of explanation associated with a time series prediction process used to predict future values for a set of time series data. For example, in an effort to support user confidence in the predicted future values generated by a particular time series prediction process and system, an entity (e.g., system or service provider personnel, a machine-generated explanation, etc.) might explain or otherwise indicate how the predictive model used to generate the predicted values calculated or determined those values. In some aspects, it is desirable to provide an explanation of the predicted future values generated by a time series prediction process and system that is easily and readily understood (i.e., intuitive). Some time series prediction processes might be easier to explain than others.
\hat{y}_t = b_t + s_t

where \hat{y}_t represents a predicted data point, b_t represents a trend (the trend is quadratic in the current example), and s_t represents a cycle (the cycle has a length of one year in the present example). In the present example, the modeling engine detected two components. The number of contributing components to a predicted value is not necessarily limited to two components herein, unless stated otherwise.
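For illustration only (the modeling engine's actual fitting procedure is not shown here), a minimal sketch of such an additive prediction, assuming a quadratic trend and an illustrative twelve-point yearly cycle with made-up coefficients, might look as follows:

    import numpy as np

    # Minimal sketch of an additive prediction y_hat[t] = b[t] + s[t].
    # The coefficients and cycle values are illustrative assumptions only.
    t = np.arange(36)                      # three years of monthly time steps
    b = 100.0 + 2.0 * t + 0.05 * t**2      # quadratic trend component b_t
    cycle = np.array([0, 3, 5, 8, 6, 4, 2, -1, -3, -6, -5, -2], dtype=float)
    s = cycle[t % 12]                      # yearly cycle component s_t
    y_hat = b + s                          # predicted data points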
Referring to
In some instances, a modeling engine or process might not be able to model a trend that accurately corresponds with the entire time series. This might typically be the case when the time series is disrupted. In some such instances, a modeling engine might use a modeling technique that models the variations in the time series, as opposed to modeling the values of the time series. That is, the modeling engine models the difference between two (2) successive observations, as represented below.
\dot{y}_t = y_t - y_{t-1}   (1)
where \dot{y}_t represents the variation of the time series at time t, y_t represents the data point at time t, and y_{t-1} represents the data point at the previous time (t−1).
Regarding a time series decomposition of the variations of the time series \dot{y}_t, a modeling engine may model the cyclic patterns, \dot{s}_t, detected in the variations of the time series \dot{y}_t.
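As a minimal sketch (the series values are illustrative assumptions), the variations of equation (1) can be computed as first differences:

    import numpy as np

    # Variations (first differences) per equation (1): y_dot[t] = y[t] - y[t-1].
    y = np.array([120.0, 123.0, 128.0, 127.0, 131.0, 136.0])
    y_dot = np.diff(y)   # array([ 3.,  5., -1.,  4.,  5.])
    # A modeling engine would then look for cyclic patterns in y_dot
    # rather than in the original values y.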
To generate a prediction for the differencing technique as illustrated by the time series example in
\hat{y}_t = \hat{y}_{t-1} + \dot{\hat{y}}_t
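A minimal sketch of this recursive accumulation, assuming hypothetical predicted variations and a hypothetical function name, is shown below:

    import numpy as np

    def recursive_forecast(last_actual, predicted_variations):
        """Roll the forecast forward one step at a time:
        y_hat[t] = y_hat[t-1] + predicted variation at t,
        seeded with the last actual observation."""
        forecasts = []
        previous = last_actual
        for variation in predicted_variations:
            previous = previous + variation
            forecasts.append(previous)
        return np.array(forecasts)

    # Example with an assumed last actual value and six predicted variations.
    print(recursive_forecast(136.0, [2.0, 4.0, -1.0, 3.0, 2.0, -2.0]))
    # [138. 142. 141. 144. 146. 144.]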
Referring to
As discussed regarding the example of
However, explaining or describing the predictive time series data for the differencing technique (e.g., the example of
In some aspects, a solution disclosed by the present disclosure includes, in some embodiments, representing a recursive equation for a prediction as an equivalent equation or expression that highlights, for example, a trend and a cyclic pattern that model the original time series instead of component(s) that model a variation of the time series. In some embodiments, the components specified in the equivalent equation or expression (e.g., a trend and a cyclic pattern) are in the same value space as the original time series.
By rewriting the recursive prediction equation, a desired result is to obtain an expression that is additive. In some embodiments, the resulting expression might be similar, in some aspects, to the regular signal decomposition technique (e.g.,
Still referring to
In an initial step (or a series of steps, executions, operations, etc.) the calculation of a predicted value at time t may be specified with a recursive equation, as follows.
\hat{y}_t = \hat{y}_{t-1} + \dot{s}_t   (2)
As discussed above (e.g., discussion of 5), the predicted value at time t is equal to the sum of the previous predicted value and the component \dot{s}_t that models the variations of the time series.
The recursivity of equation (2) can be broken by rewriting an equivalent equation or expression that uses a cumulative sum from the first predicted data point, F. This equivalent expression is as follows.
\hat{y}_t = y_{F-1} + \sum_{i=F}^{t} \dot{s}_i, where t ≥ F   (3)
Equations (2) and (3) are exact equivalents of each other. In equation (3), a sum of the variations from the first predicted data point (F) is used in the calculation of the predicted data point.
As an example, equation (3) can be used to calculate the first predicted data point (at time F), which will be equal to the last actual data point (y_{F-1}) plus the predicted variation of the time series for the first predicted data point (\dot{s}_F). This first data point is shown in visualization 800 of
Equation (3) yields a result equivalent to the recursive equation (2), except that equation (3) is based on a cumulative sum. For example, the second predicted data point will be equal to the last actual data point (y_{F-1}) plus the first and second predicted variations; the third predicted data point will be equal to the last actual data point (y_{F-1}) plus the first, second, and third predicted variations; and so on.
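The equivalence of equations (2) and (3) can be checked numerically with a short sketch (the variations and last actual value below are illustrative assumptions):

    import numpy as np

    # Hypothetical predicted variations over the horizon starting at F,
    # and an assumed last actual data point y_{F-1}.
    s_dot = np.array([2.0, 4.0, -1.0, 3.0, 2.0, -2.0])
    y_last = 136.0

    # Equation (2): recursive accumulation.
    recursive = []
    previous = y_last
    for variation in s_dot:
        previous = previous + variation
        recursive.append(previous)

    # Equation (3): last actual value plus a cumulative sum of the variations.
    cumulative = y_last + np.cumsum(s_dot)

    assert np.allclose(recursive, cumulative)   # the two forms agree exactly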
A next step focuses on the predicted data portion of the data visualization, as detailed in
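Equation (4), which specifies the slope m, is referenced below but not reproduced in the text above. Based on the surrounding description (the slope is, in some aspects, the average of the predicted variations over one occurrence of the cycle), one assumed reconstruction, offered only for illustration, is:

m = \frac{1}{L} \sum_{i=F}^{F+L-1} \dot{s}_i   (assumed reconstruction of equation (4))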
where L is the length of an occurrence of cycle \dot{s}_t. The resulting trend can be constant, increasing, or decreasing, depending on whether the average variation of the time series over an occurrence of cycle \dot{s}_t is null, strictly positive, or strictly negative, respectively.
Equation (4) uses the variations of the time series to calculate the slope (m) of the trend. Stated another way, equation (4) is the formula that corresponds to the slope of the trend and, in some aspects, is an average. That is, if the average of the variations over an occurrence of the cycle is positive, then there is an increasing trend in the prediction; if the average is negative, then there is a decreasing trend; and if the average is null, then the trend is constant.
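Assuming the averaged-variation form of the slope described above, a minimal sketch of the slope computation is:

    import numpy as np

    # Assumed slope computation: average of the predicted variations over one
    # cycle occurrence of length L (L = 4 here, with illustrative values).
    s_dot_cycle = np.array([2.0, 4.0, -1.0, 3.0])   # one occurrence of the cycle
    m = s_dot_cycle.mean()                          # slope of the trend, here 2.0
    # m > 0 -> increasing trend; m < 0 -> decreasing; m == 0 -> constant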
Thus, the trend of the forecast, b_t, can be formalized as follows.

b_t = y_{F-1} + (t - F + 1) \cdot m   (5)
where F is the first predicted data point, L is the length of an occurrence of cycle \dot{s}_t, and m is the slope of the trend (as specified by equation (4)). As seen, the trend of equation (5) starts with a baseline (i.e., the last actual data point, y_{F-1}), and m is multiplied by the position (t − F + 1) within the predicted region 910.
Since the cyclic variation is regular over time (i.e., it is always the same pattern that repeats over time), the resulting trend is necessarily linear.
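A minimal sketch of the trend of equation (5), reusing the assumed slope from the previous example (all values are illustrative), is:

    import numpy as np

    # Trend component per equation (5): b_t = y_{F-1} + (t - F + 1) * m.
    y_last = 136.0                     # last actual data point y_{F-1}
    m = 2.0                            # slope from the averaged variations (eq. (4))
    horizon = 6                        # number of predicted data points
    steps = np.arange(1, horizon + 1)  # (t - F + 1) for t = F, ..., F + horizon - 1
    b = y_last + steps * m             # [138. 140. 142. 144. 146. 148.]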
Based on the foregoing, the prediction equation (e.g., equation (3)) used to calculate a predicted value at time t, which breaks the recursivity of equation (2), can be rewritten to separate the trend component, b_t, from the cyclic component, s_t, as outlined in
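Although the rewritten, separated form of the prediction equation is shown in a figure referenced above, an assumed reconstruction (offered for illustration only) follows directly from equation (3) by adding and subtracting the trend contribution (t − F + 1)·m:

\hat{y}_t = \underbrace{y_{F-1} + (t - F + 1) \cdot m}_{b_t} + \underbrace{\sum_{i=F}^{t} \dot{s}_i - (t - F + 1) \cdot m}_{s_t}

The first bracketed term corresponds to the trend component b_t of equation (5), and the second corresponds to the cyclic component s_t introduced in equation (7) below.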
As a result of the calculations illustrated in
Referring to
Accordingly, we have demonstrated a process to separate the trend component and the cycle component from the predicted data series 1115. The predicted data series obtained in this manner might be exposed to the user via a visualization in terms that a user can readily comprehend.
Regarding the cyclic component s_t, it is based on the cumulative sum of the cycle variations \dot{s}_i, from which the contribution of the trend slope has been removed, and can be expressed as follows.
s_t = \sum_{i=F}^{t} \dot{s}_i - (t - F + 1) \cdot m, where t ≥ F   (7)
It is noted that in some embodiments formula (7) only applies to the predicted horizon starting at the first predicted data point F. However, in some embodiments it might be generalized to apply to the entire data series, as represented below.
s_t = \sum_{i=0}^{p} \dot{s}_{F+i} - (p + 1) \cdot m, where p = (t - F) modulo L   (8)
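The full decomposition can be exercised with a short sketch (all values and the cycle length are illustrative assumptions); note that the trend and cyclic components sum back to the cumulative-sum prediction of equation (3):

    import numpy as np

    # Assumed inputs: last actual point, cycle length L, and predicted
    # variations over two cycle occurrences.
    y_last = 136.0
    L = 4
    s_dot = np.array([2.0, 4.0, -1.0, 3.0, 2.0, 4.0, -1.0, 3.0])
    m = s_dot[:L].mean()                  # slope per the assumed equation (4)

    steps = np.arange(1, len(s_dot) + 1)  # (t - F + 1) for each horizon step
    b = y_last + steps * m                # trend component, equation (5)
    s = np.cumsum(s_dot) - steps * m      # cyclic component, equation (7)

    # The additive decomposition reproduces the cumulative-sum prediction.
    assert np.allclose(b + s, y_last + np.cumsum(s_dot))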
Referring to
Note that in some embodiments the trend 1215 is valid for the predicted data region since there is a baseline for the trend. In some embodiments, the baseline for the predicted data will be from the first predicted data point 1220. Stated another way, the trend is determined for the predicted data.
As demonstrated above, the original component \dot{s}_t that models the variations of the time series is converted into components b_t and s_t that model the values of the time series. The trend component, b_t, applies to the predicted data, and the slope of b_t applies over the entire time series.
At operation 1310, a trend component for the determined equivalent representation of the predicted value is determined. Details related to this operation are disclosed above regarding, for example, equations 4 and 5, as well as
At operation 1315, the determined equivalent representation of the predicted value is decomposed into a trend component and a cyclic component. Detailed aspects of this particular operation are disclosed hereinabove with respect to
Operation 1320 of process 1300 includes, in some aspects, a practical application including the generation of a visualization for a set of time series data expressed by the decomposition determined at operation 1315. The generation of the visualization based on the decomposition and other determinations of process 1300 provides a novel representation of time series data built on a differencing technique, offering insight into the time series data that is, in some aspects, easily understood.
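As a minimal end-to-end sketch of operations 1310 through 1320 (the function name and values are hypothetical, and the actual visualization step is omitted):

    import numpy as np

    def decompose_forecast(y_last, s_dot, cycle_length):
        """Sketch of operations 1310-1315: derive the trend and cyclic
        components of a differencing-based forecast."""
        m = s_dot[:cycle_length].mean()        # slope, per the assumed eq. (4)
        steps = np.arange(1, len(s_dot) + 1)
        trend = y_last + steps * m             # equation (5)
        cycle = np.cumsum(s_dot) - steps * m   # equation (7)
        return trend, cycle

    # Operation 1320 would visualize trend + cycle; here they are only printed.
    trend, cycle = decompose_forecast(136.0, np.array([2.0, 4.0, -1.0, 3.0]), 4)
    print(trend + cycle)   # the predicted data points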
Processor 1605 also communicates with a storage device 1635. Storage device 1635 can be implemented as a single database or the different components of storage device 1635 can be distributed using multiple databases (that is, different deployment data storage options are possible). Storage device 1635 may comprise any appropriate data storage device, including combinations of magnetic storage devices (e.g., a hard disk drive), optical storage devices, mobile telephones, and semiconductor memory devices to support and facilitate a data plane as disclosed herein. Storage device 1635 stores a program 1640 and prediction modeling engine 1645 for controlling the processor 1605. Processor 1605 performs instructions of the programs 1640, 1645, and thereby operates in accordance with any of the embodiments described herein. Storage device 1635 further stores time series data 1650.
Programs 1640, 1645 may be stored in a compressed, uncompiled, encrypted, and/or otherwise configured format. Programs 1640, 1645 may furthermore include other program elements, such as an operating system, a clipboard application, a database management system, and device drivers used by processor 1605 to interface with peripheral devices.
As used herein, data may be “received” by or “transmitted” to, for example: (i) the platform 1600 from another device; or (ii) a software application or module within the platform 1600 from another software application, module, or any other source.
As will be appreciated based on the foregoing specification, the above-described examples of the disclosure may be implemented using computer programming or engineering techniques including computer software, firmware, hardware or any combination or subset thereof. Any such resulting program, having computer-readable code, may be embodied or provided within one or more non-transitory computer-readable media, thereby making a computer program product, i.e., an article of manufacture, according to the discussed examples of the disclosure. For example, the non-transitory computer-readable media may be, but is not limited to, a fixed drive, diskette, optical disk, magnetic tape, flash memory, external drive, semiconductor memory such as read-only memory (ROM), random-access memory (RAM), and any other non-transitory transmitting or receiving medium such as the Internet, cloud storage, the Internet of Things (IoT), or other communication network or link. The article of manufacture containing the computer code may be made and used by executing the code directly from one medium, by copying the code from one medium to another medium, or by transmitting the code over a network.
The computer programs (also referred to as programs, software, software applications, “apps”, or code) may include, for example, machine instructions for a programmable processor, and may be implemented in a high-level procedural, object-oriented programming language, assembly/machine language, etc. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus, cloud storage, Internet of Things, and device (e.g., magnetic discs, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The “machine-readable medium” and “computer-readable medium,” however, do not include transitory signals. The term “machine-readable signal” refers to any signal that may be used to provide machine instructions and any other kind of data to a programmable processor.
The above descriptions and illustrations of processes herein should not be considered to imply a fixed order for performing the process steps. Rather, the process steps may be performed in any order that is practicable, including simultaneous performance of at least some steps. Although the disclosure has been described in connection with specific examples, it should be understood that various changes, substitutions, and alterations apparent to those skilled in the art can be made to the disclosed embodiments without departing from the spirit and scope of the disclosure as set forth in the appended claims.