The technology described herein relates generally to data forecasting and more specifically to the determination of desirable time periods for use in data forecasting.
Historically, time series data has been treated as continuous over time. The underlying assumption is that a smooth, continuous series exists, that this series is being sampled at regular intervals, and that the sampling error is uniform because the sampling intervals are evenly spaced. These assumptions fit well with data such as the U.S. Gross National Product, national unemployment data, national housing starts, and other smooth aggregate series. If such data is aggregated to a quarterly series, then a smooth seasonal and trend model is apparent.
With the advent of modern computing systems, vast quantities of data are collected in real time. This data is often detailed data, such as the sales volume of individual items identified by their unique UPC codes. Much of this retail data is seasonal. Consumers buy sweaters in the fall and shorts in the spring. Even when data is aggregated, it can display a seasonal pattern characterized by long periods of inactivity. When an evenly spaced interval is chosen that shows detail in the active period, large numbers of zeros appear in the inactive period.
Systems and methods are provided for generating a future sales forecast for a future time period in a system where at least one converted time period is of irregular length with respect to other time periods. Past input sales data is received comprising aggregate sales values representing a volume of sales for an input sales period where all input time periods are of uniform length. The input sales data is converted into converted time period data by assigning a sum of sales from a plurality of input time periods into a single converted time period. A predictive data model is generated based on the converted time period data, and a future sales forecast is generated for a future time period based on the predictive data model.
As another example, a computer-implemented method of generating a future sales forecast for a future time period in a seasonal system is provided where the seasonal system has time measured according to a cycle time period, where the cycle time period comprises a plurality of seasonal time periods, where the future sales forecast is generated based on prior sales data from a plurality of past seasonal time periods spanning a plurality of cycle time periods, where at least one of the seasonal time periods is of irregular length with respect to the other seasonal time periods, and where at least one of the seasonal time periods is considered a negligible time period that is assumed to have zero sales during every cycle. The method may include receiving seasonal time period definition data that delineates the cycle time period into the plurality of seasonal time periods, receiving identification of at least one seasonal time period to be identified as a negligible time period that is assumed to have zero sales during every cycle, and receiving input past sales data and storing the input sales data in a computer readable memory, the input sales data comprising a plurality of aggregate sales values, each aggregate sales value representing a volume of sales for an input time period, where all input time periods are of uniform length, and where all input time periods are equal to or shorter than a shortest seasonal time period. The input past sales data may be converted into seasonal time period data, the converting comprising assigning a sum of sales from a plurality of input time periods to a single seasonal time period and assigning zero sales to the negligible time period. A predictive data model may be generated based on the seasonal time period data using a data processor, a future sales forecast may be generated for a future time period based on the predictive data model using the data processor, and the future sales forecast may be stored in the computer readable memory.
The method may further comprise purchasing inventory for a future time period based on the future sales forecast for the future time period. The generated predictive model may be generated using a regression analysis. The seasonal time period definition data may be stored in a conversion data table that stores a plurality of records, each record having a season identifier index and a season begin date, a season begin time, a season end date, or a season end time. The time period definition data may be user supplied. The uniform length may be a length of time selected from the group comprising: second, minute, half-hour, hour, day, week, bi-week, month, quarter, half-year, year, bi-year, decade, quarter-century, half-century, and century. The time period of irregular length may not be a length of time from the group comprising: second, minute, half-hour, hour, day, week, bi-week, month, quarter, half-year, year, bi-year, decade, quarter-century, half-century, and century.
As a further example, a computer-implemented system for generating a future sales forecast for a future time period is provided in a system where at least one converted time period is of irregular length with respect to other time periods. The system may include a data processor and a computer-readable memory encoded with instructions for commanding the data processor to execute steps. The steps may include receiving past input sales data comprising aggregate sales values representing a volume of sales for an input sales period where all input time periods are of uniform length, converting the input sales data into converted time period data by assigning a sum of sales from a plurality of input time periods into a single converted time period, generating a predictive data model based on the converted time period data, and generating a future sales forecast for a future time period based on the predictive data model.
As another example, a computer-implemented method of segmenting time series data stored in data segments containing one or more data records may include determining a combined segment error measure based on a proposed combination of two candidate segments. An error cost to merge the two candidate segments may be determined based on a difference between the combined segment error measure and a segment error measure of one of the segments. The two candidate segments may be combined, generating a combined segment, when the error cost to merge meets a merge threshold.
The following is an example of a forecast using custom intervals. This example provides a forecast of an item sold 3 weeks before Easter, the week of Easter, and one week after Easter. During other weeks of the year, the item is not sold. Using custom intervals, the 5 active weeks can be defined, and one large inactive season can also be defined. The active weeks can then be more accurately forecast, and the forecasts for the inactive weeks can be trivially forecast as zero. This is an example of a highly seasonal item with an inactive season and an active season that moves in time, since Easter does not occur on the same date each year.
Time intervals are used in date and time calculations when interval lengths are required, when increments must be computed, for seasonal computations, and so forth. Irregular time period modeling systems (“systems”) may provide predefined intervals for common date and time units such as month, day, weekday, and quarter. However, users may find a list of predefined intervals insufficient due to calendar or other issues. Problems may arise with holidays, or with repeating schedules like ‘two days on, one day off.’
A system may include computations involving dates and times. Algorithms typically involve calculations such as: what will be the date six weeks from today, or how many weekdays fall between two particular dates?
The list of standard predefined intervals provided by a system may be insufficient for many business problems. A predefined interval is fixed in advance and cannot easily accommodate local customization. For example, if the number of bank business days must be determined between two dates, ‘weekday’ is a reasonable interval to choose. Between Jul. 1, 2008 and Jul. 31, 2008 there are 23 weekdays. In the United States, however, July 4th is a bank holiday, so the number of business days is 22. In Canada, July 4th is not a bank holiday, so the answer there remains 23. Thus, the ‘weekday’ interval may provide a first approximation that requires additional, potentially complex, interval-dependent post-processing.
An irregular time period data modeler system provides a way of specifying a new and completely customized interval. The new interval system may be configured to work in a similar manner as predefined intervals. Based on the prior example, two custom intervals, named WEEKDAYUS and WEEKDAYCAN, could be created by a user. Then, using an INTCK function (described in further detail herein below) to return the number of intervals between dates, the number of U.S. or Canadian bank business days between two dates can be computed directly by referencing the appropriate custom interval.
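As a rough illustration of the distinction the two custom intervals would capture, the following Python sketch counts business days under two holiday calendars; the holiday sets and function names are assumptions made only for this example, not the SAS custom-interval machinery itself.

    from datetime import date, timedelta

    # Hypothetical holiday calendars; only July 4, 2008 differs between the two.
    HOLIDAYS_US = {date(2008, 7, 4)}
    HOLIDAYS_CAN = set()

    def business_days(start, end, holidays):
        """Count days in [start, end] that are weekdays and not holidays."""
        count, day = 0, start
        while day <= end:
            if day.weekday() < 5 and day not in holidays:
                count += 1
            day += timedelta(days=1)
        return count

    # 23 weekdays fall between Jul. 1 and Jul. 31, 2008, but only 22 are
    # U.S. bank business days because July 4th is a U.S. holiday.
    print(business_days(date(2008, 7, 1), date(2008, 7, 31), HOLIDAYS_US))   # 22
    print(business_days(date(2008, 7, 1), date(2008, 7, 31), HOLIDAYS_CAN))  # 23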
An example problem involves a daily time interval where weekends and certain holidays must be omitted. A custom interval is defined by a data set. Consider a table with two variables BEGIN and END. Each observation in the data set represents one interval with the BEGIN variable containing the start of the interval and the END variable containing the end of the interval. The intervals must be listed in ascending order and there must be no overlaps between intervals.
Consider the month of January 2008, for reference. The following example shows how the first half of January, 2008, would be defined to produce an interval that specifies a daily interval with weekend days included in the preceding weekday (i.e., Saturday and Sunday are grouped into a single time period with Friday).
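For illustration, and noting that Jan. 1, 2008 fell on a Tuesday, the BEGIN/END observations for the first half of the month might be represented as follows; this Python sketch stands in for the original data set listing.

    from datetime import date

    # One (BEGIN, END) observation per custom interval; Saturdays and Sundays
    # are folded into the interval that begins on the preceding Friday.
    JAN_2008_INTERVALS = [
        (date(2008, 1, 1),  date(2008, 1, 1)),   # Tue
        (date(2008, 1, 2),  date(2008, 1, 2)),   # Wed
        (date(2008, 1, 3),  date(2008, 1, 3)),   # Thu
        (date(2008, 1, 4),  date(2008, 1, 6)),   # Fri-Sun
        (date(2008, 1, 7),  date(2008, 1, 7)),   # Mon
        (date(2008, 1, 8),  date(2008, 1, 8)),   # Tue
        (date(2008, 1, 9),  date(2008, 1, 9)),   # Wed
        (date(2008, 1, 10), date(2008, 1, 10)),  # Thu
        (date(2008, 1, 11), date(2008, 1, 13)),  # Fri-Sun
        (date(2008, 1, 14), date(2008, 1, 14)),  # Mon
        (date(2008, 1, 15), date(2008, 1, 15)),  # Tue
    ]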
Now suppose Jan. 10, 2008, should be treated as a holiday so that incrementing one interval from Jan. 9, 2008, should return Jan. 11, 2008. This is accomplished by replacing the two observations for Jan. 9 and Jan. 10 (BEGIN=09JAN2008, END=09JAN2008 and BEGIN=10JAN2008, END=10JAN2008) with the single observation BEGIN=09JAN2008, END=10JAN2008.
In some implementations, the END variable in a data set may be omitted. In such a case, END defaults to one day prior to BEGIN in the following observation. The following table is roughly equivalent to the table above, where Jan. 10, 2008, is not a holiday. The END date for each observation is calculated as one less than the BEGIN date for the following observation.
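A minimal Python sketch of this defaulting rule, assuming a simple sorted list of BEGIN dates, might look like the following.

    from datetime import date, timedelta

    def with_default_ends(begins):
        """Derive END for each observation as one day before the following
        observation's BEGIN (the final observation is left open-ended here)."""
        obs = []
        for i, b in enumerate(begins):
            end = begins[i + 1] - timedelta(days=1) if i + 1 < len(begins) else None
            obs.append((b, end))
        return obs

    # Using only BEGIN dates reproduces the weekend-grouping table above:
    begins = [date(2008, 1, d) for d in (1, 2, 3, 4, 7, 8, 9, 10, 11, 14, 15)]
    print(with_default_ends(begins)[3])  # (date(2008, 1, 4), date(2008, 1, 6))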
Optionally, a system may include an extrapolating interval and alignment. This function may be used to extend the custom interval in a uniform fashion. For instance EXTRAP=(DAY,BEGIN) would calculate intervals before and after the custom interval definition as daily.
With reference back to
The following code can be used to define a data set that describes these intervals for the years 2005 to 2009.
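A sketch of such a definition, written here in Python rather than the original SAS listing, might build the (BEGIN, END, SEASON) rows as follows. The week boundaries (Monday through the Sunday of Easter) and the placement of the inactive season are assumptions chosen to match the seasonal indices described below; the Easter dates themselves are as listed in the code.

    from datetime import date, timedelta

    # Easter Sundays for 2005-2009.
    EASTER = [date(2005, 3, 27), date(2006, 4, 16), date(2007, 4, 8),
              date(2008, 3, 23), date(2009, 4, 12)]

    def easter_cycle_intervals(easters):
        """Build (BEGIN, END, SEASON) rows: seasons 1-3 are the three weeks
        before the week of Easter, season 4 the week ending on Easter Sunday,
        season 5 the week after Easter, and season 0 the inactive remainder
        of the year (forecast as zero sales)."""
        rows = []
        for j, e in enumerate(easters):
            week_start = e - timedelta(days=27)          # Monday, four weeks back
            for season in (1, 2, 3, 4):
                rows.append((week_start, week_start + timedelta(days=6), season))
                week_start += timedelta(days=7)
            rows.append((e + timedelta(days=1), e + timedelta(days=7), 5))
            if j + 1 < len(easters):                     # inactive gap to next cycle
                next_start = easters[j + 1] - timedelta(days=27)
                rows.append((e + timedelta(days=8), next_start - timedelta(days=1), 0))
        return rows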
The resulting table describing the custom interval definition is shown below. The inactive season is given the seasonal index “0.” The forecast for the inactive season will be 0. The “season” value can be used to evaluate seasonality for the original time series. The season identified as 4 is the week containing Easter.
When forecasting using a standard weekly interval, as opposed to the intervals shown above, the large number of zero periods in the inactive season makes correct identification of the proper time series model difficult.
An irregular time period data modeler may be used in a variety of contexts. For example:
As noted above, and illustrated in the contrast between the graphs of
While custom time intervals may be useful in dealing with known periods of lesser activity or total inactivity, custom time intervals may also provide significant forecasting benefits in systems where the custom time intervals are not known prior to some data analysis. Thus, a system may include some mechanisms for defining the custom intervals in addition to capabilities for providing forecasts based on data adapted to those custom intervals.
For example, custom intervals may be utilized to improve stationarity of a received time series data set. Stationarity may be a necessary condition in order to produce a reliable statistical forecast. Oftentimes, discrete time series do not possess this stationarity characteristic. Statistical forecasting is performed by taking a sample at discrete time intervals and treating the sampled values as a discrete time series. The goal of forecasting is often to generate a model with errors that are white noise. Utilizing equally spaced intervals assumes that if the original continuous process has errors that are white noise, then the sample of such process, which is an integral of the continuous process over intervals of equal length, will also have errors that are white noise. Discrete time series are classified as stationary or non-stationary based on the properties of the errors.
Large amounts of data are collected and stored in databases as time series data. Much of this data is unsuitable for econometric modeling and forecasting in its raw format (e.g., because of a lack of stationarity in the time series data). Existing algorithms for improving characteristics of time series data tend to be top-down. Top-down algorithms try to minimize the number of points required to represent the series while also minimizing the error between the original series and the segmented series. Another approach, which is described herein, is bottom-up. This bottom-up algorithm preserves data points unless they can be represented reasonably well using a single segment. The focus of the bottom-up algorithm is to identify segments where the variance of the errors is uniform. The focus is not on the overall error, but on the nature of the errors within each segment.
A proposed segment combination 704 may be based on two or more candidate segments 712. Each of the candidate segments 712 contains one or more time series data values. The candidate segments may contain a single time series data value, or the candidate segments 712 may themselves be combined segments, containing multiple time series data values based on one or more previous combinations.
A proposed combination 904 contains a merge of the data records contained in the candidate segments 902 from which the proposed combination 904 is derived. The segmentation engine 906 analyzes the multiple points from the candidate segments 902 to determine whether the candidate segments 902 in the proposed combination 904 should be combined to form a combined segment 908. For example, the segmentation engine 906 may calculate a combined segment function and combined segment error measure 910 for the proposed combination 904. The combined segment function is a statistical fit of the values of the multiple data records within the proposed combination 904. The combined segment error measure identifies a quality of fit of the combined segment function to the values of the multiple data records within the proposed combination 904. Error measures may take a variety of forms. For example, an error measure may be a statistic of fit measuring a quality of fit of a segment function to the values of the multiple data records within a segment or proposed combination.
The segmentation engine 906 may use the combined segment function and the combined segment error measure 910 to calculate a cost to merge 912 for the proposed combination 904. The cost to merge may be calculated in many different ways. For example, the cost to merge may be calculated as a percentage increase in error (cost) between a candidate segment 902 and the proposed combination 904. If σi represents a candidate segment error measure and σi′ represents a combined segment error measure, then di=σi′−σi represents an absolute difference between the combined segment error measure and the segment error measure. The percentage increase in cost may then be calculated as pi=di/σi′.
In some implementations, pi is considered the cost to merge 912 and is compared to the merge threshold 914. If pi is less than the merge threshold 914, then the proposed combination 904 is considered a good candidate for merging, and the proposed combination 904 is combined to form the combined segment 908. In some implementations, either or both of di and pi are considered the cost to merge 912 and are compared to appropriate merge thresholds 914. If either di or pi is less than or equal to its merge threshold 914 (also referred to as a critical value), then the candidate segments 902 in the proposed combination 904 are merged.
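A minimal Python sketch of this bottom-up merge test follows. The choice of a constant (mean) segment fit and a standard-deviation error measure, and the threshold value, are assumptions made for illustration; any segment function and statistic of fit could be substituted.

    import statistics

    def error_measure(values):
        """Std. deviation of residuals around a constant (mean) segment fit;
        a stand-in for the segment error measure described above."""
        return statistics.pstdev(values)

    def cost_to_merge(seg_a, seg_b):
        """Percentage increase in error, p = (sigma' - sigma) / sigma', where
        sigma' is the combined error and sigma is a candidate segment's error."""
        combined = error_measure(seg_a + seg_b)
        d = combined - error_measure(seg_a)
        return d / combined if combined > 0 else 0.0

    def bottom_up_segment(series, merge_threshold=0.25):
        """Start with one data point per segment and repeatedly merge the
        adjacent pair whose cost to merge is lowest, while it stays within
        the threshold."""
        segments = [[v] for v in series]
        merged = True
        while merged and len(segments) > 1:
            merged = False
            costs = [cost_to_merge(segments[i], segments[i + 1])
                     for i in range(len(segments) - 1)]
            i = min(range(len(costs)), key=costs.__getitem__)
            if costs[i] <= merge_threshold:
                segments[i:i + 2] = [segments[i] + segments[i + 1]]
                merged = True
        return segments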
In some implementations, the calculation of pi may be more complex to account for special cases (e.g., where certain values are zero). For example, pi may be calculated as:
where mi is the constant term of the segment function for a candidate segment. A table of cases for pi, tabulated in terms of σ and σ′, follows from the above definition; the cases are discussed below.
It is noted that cases 5 and 6 cannot occur because a method that does not fit exactly n points cannot fit exactly m>n points. Cases 1 and 2 are no-cost merges: they have a cost of zero and usually occur early in a segmentation process. Cases 3 and 4 are low-cost merges. For case 3, merging usually occurs when e′≦CVd (the critical value for d). For case 4, merging usually occurs when e′/b≦CVp (the critical value for p). Cases 7 and 8 are higher-cost merges; for these cases, merging usually occurs when p≦CVp.
The process is repeated in
The process is repeated three more times in
The clustering process described herein may be utilized in a variety of contexts. For example, clustering can be used in decomposing time series data. Given a single time series (the original series), the series can be decomposed by determining the major (long-term) features and removing them from the original series to obtain the minor (short-term) features. For example, this analysis is useful for regular/promotional price analysis, where the original series is the sale price of a product, the major features are the regular prices (long-term), and the minor features are the promotional prices (short-term).
Clustering can also be used in customizing discrete time intervals. Given a single time series, it may be useful to differentiate time periods with high and low activity in order to customize a time interval that makes the series more readily explainable and/or easier to model for subsequent forecasting. For example, given a time series of seasonal products where there are both in-season (high activity) and off-season (low activity) periods, it may be beneficial to map the off-season period into a single time period.
Clustering may also be used in a turning point analysis. Given a single time series, major change points can be extracted for the time series. These change points may be useful for determining turning points in the series for subsequent turning point analysis. Turning point analysis is useful for forecasting the life-cycle of products in the marketplace.
Clustering may further be used in comparing time series. Given several time series of varying length, the major features can be extracted using fixed-length segmentation analysis. Comparing fixed-length data vectors is far easier than comparing variable-length data vectors. For example, this analysis is useful for time series search and retrieval, ranking, clustering, and other data mining analyses related to time series.
Clustering may also be used in visualizing time series data. Given a long time series, it may be difficult to perceive major features associated with the time series when viewing a typical (detailed) time series plot. By extracting the important features, the time series may be easier to understand when viewed with less detail.
Several algorithms may be useful for a system to perform operations using custom intervals. The following describes details of certain algorithms:
Increment k Intervals from a Starting Point t1, Return t2 Aligned to Either Beginning or End of Target Interval
(SAS function INTNX): Given a custom interval data table, first locate the observation in the interval data set whose data in the BEGIN and END variables bracket t1. The data table pointer is then moved k observations. A value of k>0 moves the data table pointer k observations forward, and k<0 moves the pointer backward. The interval at the resulting location in the data set is the desired interval after incrementing (decrementing).
Given a custom interval data table with MAXOBS observations, let Bi be the value of BEGIN at observation i, and Ei be the value of END at observation i. Then, pseudocode is:
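A sketch of the described lookup in Python, standing in for the pseudocode and assuming the table is a sorted list of (BEGIN, END) pairs:

    def intnx_custom(table, t1, k, align="BEGIN"):
        """table: list of (BEGIN, END) pairs sorted ascending. Locate the
        observation whose [BEGIN, END] brackets t1, move k observations, and
        return the target interval aligned to its beginning or end."""
        for i, (b, e) in enumerate(table):
            if b <= t1 <= e:
                j = i + k                      # k > 0 forward, k < 0 backward
                if not 0 <= j < len(table):
                    raise IndexError("increment moves outside the interval table")
                return table[j][0] if align == "BEGIN" else table[j][1]
        raise ValueError("t1 is not covered by the custom interval definition")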
Determine the Number of Intervals, k, Between Two Dates, t1 and t2:
(SAS function INTCK): If t2>t1, k>=0; if t2<t1, k<=0. Let the row number of the interval data table in which the BEGIN and END variables bracket t1 be ‘t1obs’. Likewise, let the observation number in which the same variables bracket the t2 date be ‘t2obs’. The number of intervals between t1 and t2 is then calculated as (t2obs−t1obs). Consistent with the increment/decrement problem, a negative value may be calculated.
The pseudocode is:
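A Python sketch of the described calculation, under the same assumed (BEGIN, END) table layout:

    def intck_custom(table, t1, t2):
        """table: list of (BEGIN, END) pairs sorted ascending. Return the number
        of custom intervals between t1 and t2 (negative when t2 precedes t1)."""
        def bracket(t):
            for i, (b, e) in enumerate(table):
                if b <= t <= e:
                    return i
            raise ValueError("date not covered by the custom interval definition")
        return bracket(t2) - bracket(t1)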
Determine Seasonal Period
(SAS function INTSEAS): Given a custom interval data table, read each observation and determine the maximum value of the SEASON variable, SeasPer, and return SeasPer. This value is the Seasonal Period of the custom interval. Given a custom interval data table with MAXOBS observations, let Si be the value of SEASON at observation i. Then, pseudocode is:
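A Python sketch, assuming rows of the form (BEGIN, END, SEASON):

    def intseas_custom(table):
        """table: list of (BEGIN, END, SEASON) rows. The seasonal period is the
        maximum SEASON value over all observations."""
        return max(season for _begin, _end, season in table)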
For instance, in example 2, if the name of the custom interval is EasterCycle, then the Seasonal Period of EasterCycle is 5.
Determine Seasonal Cycle
(SAS function INTCYCLE): First determine the Seasonal Period. The Seasonal Cycle is then <name of custom interval><seasonal period>. For instance, in example 2, if the name of the custom interval is EasterCycle, then the Seasonal Cycle of EasterCycle is “EasterCycle 5”.
Determine Seasonal Index
(SAS function INTINDEX): Determine the seasonal index, Seasi, given a date, t1. Given a custom interval data table, first locate the observation in the interval data set whose data in the BEGIN and END variables bracket t1. The value of SEASON at that observation is the seasonal index, Seasi. Given a custom interval data table with MAXOBS observations, let Si be the value of SEASON at observation i, let Bi be the value of BEGIN at observation i, and Ei be the value of END at observation i. Then, pseudocode is:
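A Python sketch under the same assumed (BEGIN, END, SEASON) row layout:

    def intindex_custom(table, t1):
        """table: list of (BEGIN, END, SEASON) rows sorted ascending. Return the
        SEASON value of the observation whose BEGIN/END bracket t1."""
        for begin, end, season in table:
            if begin <= t1 <= end:
                return season
        raise ValueError("date not covered by the custom interval definition")

Applied to the EasterCycle rows sketched earlier, Mar. 21, 2005 falls in an observation with SEASON=4, consistent with the example below.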
For instance, in example 2, if name of the custom interval is EasterCycle, then the Seasonal Index of Mar. 21, 2005 for EasterCycle is 4.
Determine Seasonal Cycle Index
(SAS function INTCINDEX): Determine the cycle index, Ci, given a date, t1. Given a custom interval data table, locate the observation in the interval data set whose data in the BEGIN and END variables bracket t1. While locating the observation, compute the cycle as follows: each time the SEASON variable is less than the previous SEASON variable, increment the Cycle Index. Given a custom interval data table with MAXOBS observations, let Si be the value of SEASON at observation i, let Bi be the value of BEGIN at observation i, and Ei be the value of END at observation i. Then, pseudocode is:
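A Python sketch of this scan; starting the cycle index at 1 is an assumption made here for illustration:

    def intcindex_custom(table, t1):
        """table: list of (BEGIN, END, SEASON) rows sorted ascending. Scan from
        the top, incrementing the cycle index whenever SEASON drops below the
        previous SEASON, until the observation bracketing t1 is reached."""
        cycle, prev_season = 1, None
        for begin, end, season in table:
            if prev_season is not None and season < prev_season:
                cycle += 1
            if begin <= t1 <= end:
                return cycle
            prev_season = season
        raise ValueError("date not covered by the custom interval definition")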
For instance, in the example of
Test for Valid Interval
(SAS function INTTEST): Open the file specified by the custom interval and determine if the file contains a BEGIN variable. If this is successful, return 1 for a valid interval. If not, return 0 for an invalid interval.
Determine Sub-Alignment Interval
(SAS function INTSHIFT): The Sub-Alignment interval for a custom interval of the form <custom interval>, <custom interval><m>, or <custom interval><m>.<s> is <custom interval>, if <custom interval> is a valid interval. For instance, in example 2, the Sub-Alignment Interval for both “EasterCycle” and “EasterCycle5” would be “EasterCycle”.
Determine Display Format
(SAS function INTFMT): Open the file containing the custom interval definition and determine the format of the BEGIN variable. This format is returned as fmt. In Example 2, if the name of the custom interval is EasterCycle, then the format of EasterCycle is “DATE.” as specified by the user in the statement: format begin end DATE.
This also indicates whether the custom interval is of type DATE, DATETIME, or OBSERVATION, which allows varying formats to be matched. For example, a timestamp of Jan. 1, 2008 2:00 PM could be matched to the first observation in Example 1, allowing matching of dates and times.
Alternate algorithms may be utilized when multiple and shifted custom intervals are used, as follows:
Increment k Intervals from a Starting Point t1, Return t2 Aligned to Either Beginning or End of Target Interval:
(SAS function INTNX): Here the interval is of the format <custom interval><m>.<s>. Given a custom interval data table, first locate the observation in the interval data set whose data in the BEGIN and END variables bracket t1. Adjust to the beginning observation of the form n*m+s, where n*m+s<=i<(n+1)*m+s. The data table pointer is then moved k*m observations. A value of k>0 moves the data table pointer k*m observations forward, and k<0 moves the pointer backward. The interval at the resulting location in the data set is the desired interval after incrementing (decrementing).
Given a custom interval data table with MAXOBS observations, let Bi be the value of BEGIN at observation i, and Ei be the value of END at observation i. Then, pseudocode is:
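A Python sketch of the shifted increment, treating the table as 1-based observations as in the description above; the handling of dates falling before the first shifted block is left out of this sketch:

    def intnx_shifted(table, t1, k, m, s, align="BEGIN"):
        """table: list of (BEGIN, END) pairs sorted ascending, treated as
        1-based observations. For an interval of the form <custom><m>.<s>,
        snap to the block-beginning observation n*m+s (n*m+s <= i < (n+1)*m+s),
        then move k*m observations."""
        i = next(idx + 1 for idx, (b, e) in enumerate(table) if b <= t1 <= e)
        n = (i - s) // m                     # block index containing observation i
        j = n * m + s + k * m                # beginning observation of target block
        b, e = table[j - 1]                  # back to 0-based list indexing
        return b if align == "BEGIN" else e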
In the example of
Determine the Number of Intervals, k, Between Two Dates, t1 and t2:
(SAS function INTCK): If t2>t1, k>=0; if t2<t1, k<=0. Here the interval is of the format <custom interval><m>.<s>. Let the row number of the interval data table in which the BEGIN and END variables bracket t1 be i and calculate ‘t1obs’=n such that (n−1)*m+s<=i<n*m+s. Likewise, let the observation number in which the same variables bracket the t2 date be i and calculate ‘t2obs’=n such that (n−1)*m+s<=i<n*m+s. The number of intervals between t1 and t2 is then calculated as (t2obs−t1obs). Consistent with the increment/decrement problem, a negative value may be calculated. Then, pseudocode is:
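A Python sketch of the shifted interval count, using the block-number formula stated above:

    def intck_shifted(table, t1, t2, m, s):
        """table: list of (BEGIN, END) pairs sorted ascending, treated as
        1-based observations. For <custom><m>.<s>, map each date's observation
        number i to a block number n with (n-1)*m+s <= i < n*m+s, and return
        the difference of block numbers (negative when t2 precedes t1)."""
        def block(t):
            i = next(idx + 1 for idx, (b, e) in enumerate(table) if b <= t <= e)
            return (i - s) // m + 1
        return block(t2) - block(t1)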
For INTTEST, INTFMT, and INTSHIFT, <custom interval><m>.<s> inherits properties from <custom interval>. For INTCYCLE, <custom interval><m>.<s> is its own cycle (trivial). For INTSEAS, INTINDEX, and INTCINDEX with intervals of the format <custom interval><m>.<s>, the result is trivially 1.
A disk controller 1860 interfaces one or more optional disk drives to the system bus 1852. These disk drives may be external or internal floppy disk drives such as 1862, external or internal CD-ROM, CD-R, CD-RW or DVD drives such as 1864, or external or internal hard drives 1866. As indicated previously, these various disk drives and disk controllers are optional devices.
Each of the element managers, real-time data buffer, conveyors, file input processor, database index shared access memory loader, reference data buffer and data managers may include a software application stored in one or more of the disk drives connected to the disk controller 1860, the ROM 1856 and/or the RAM 1858. Preferably, the processor 1854 may access each component as required.
A display interface 1868 may permit information from the bus 1852 to be displayed on a display 1870 in audio, graphic, or alphanumeric format. Communication with external devices may optionally occur using various communication ports 1872.
In addition to the standard computer-type components, the hardware may also include data input devices, such as a keyboard 1872, or other input device 1874, such as a microphone, remote control, pointer, mouse and/or joystick.
This written description uses examples to disclose the invention, including the best mode, and also to enable a person skilled in the art to make and use the invention. The patentable scope of the invention may include other examples. For example, the systems and methods may include data signals conveyed via networks (e.g., local area network, wide area network, internet, combinations thereof, etc.), fiber optic medium, carrier waves, wireless networks, etc. for communication with one or more data processing devices. The data signals can carry any or all of the data disclosed herein that is provided to or from a device.
Additionally, the methods and systems described herein may be implemented on many different types of processing devices by program code comprising program instructions that are executable by the device processing subsystem. The software program instructions may include source code, object code, machine code, or any other stored data that is operable to cause a processing system to perform the methods and operations described herein. Other implementations may also be used, however, such as firmware or even appropriately designed hardware configured to carry out the methods and systems described herein.
The systems' and methods' data (e.g., associations, mappings, data input, data output, intermediate data results, final data results, etc.) may be stored and implemented in one or more different types of computer-implemented data stores, such as different types of storage devices and programming constructs (e.g., RAM, ROM, Flash memory, flat files, databases, programming data structures, programming variables, IF-THEN (or similar type) statement constructs, etc.). It is noted that data structures describe formats for use in organizing and storing data in databases, programs, memory, or other computer-readable media for use by a computer program.
The computer components, software modules, functions, data stores and data structures described herein may be connected directly or indirectly to each other in order to allow the flow of data needed for their operations. It is also noted that a module or processor includes but is not limited to a unit of code that performs a software operation, and can be implemented for example as a subroutine unit of code, or as a software function unit of code, or as an object (as in an object-oriented paradigm), or as an applet, or in a computer script language, or as another type of computer code. The software components and/or functionality may be located on a single computer or distributed across multiple computers depending upon the situation at hand.
It should be understood that as used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise. Finally, as used in the description herein and throughout the claims that follow, the meanings of “and” and “or” include both the conjunctive and disjunctive and may be used interchangeably unless the context expressly dictates otherwise; the phrase “exclusive or” may be used to indicate situations where only the disjunctive meaning may apply.
This application claims priority to U.S. Provisional Patent Application No. 61/307,104, filed Feb. 23, 2010, entitled “Computer-Implemented Systems and Methods for Flexible Definition of Time Intervals,” the entirety of which is herein incorporated by reference.