This disclosure relates generally to demand prediction. More particularly, the invention relates to a method and a system for predicting demand for a supply chain using Machine Learning (ML).
In a supply chain, demand forecasting plays an important role in decision making. Demand forecasting is a process for estimating the quantity of products or services required to meet customer demand. However, in the supply chain, disruptive events often threaten accurate demand forecasting and the Supply Chain Management (SCM) that relies on it. Currently, many demand sensing models exist that provide a near-future demand forecast to help organizations make short-term decisions. However, conventional demand sensing models are unable to capture the impact of disruptions such as economic downturns, pandemics, and technological innovations. In addition, these conventional demand sensing models are not able to provide productive SCM guidance during disruptions. Moreover, these conventional demand sensing models rely heavily on historical demand time-series data to forecast future demand.
Because these conventional demand sensing models do not consider external variables, for example macroeconomic indicators, while performing demand forecasting, they are unable to capture the impact of disruption events and are unable to provide effective SCM guidance. Moreover, the conventional demand sensing models are unable to capture the impact of disruption events because such events manifest as sparse time-series, making them difficult to model. In addition, the conventional demand sensing models require frequent manual intervention to constantly adjust the models for different situations. Since none of the conventional demand sensing models is capable of accurately capturing disruption dynamics, they result in misguided SCM for organizations. Consequences of misguided SCM include high inventory costs, frequent stock-outs, and poor pricing and product strategy.
Therefore, there is a need for a robust and efficient method and system for predicting demand for the supply chain.
In an embodiment, a method for predicting demand for a supply chain is disclosed. In one embodiment, the method may include feeding input vectors to a trained Machine Learning (ML) model for a future time-period. It should be noted that the input vectors comprise at least one of: an intensity vector corresponding to an intensity of a possible disruption-event at each point of time within the future time-period, a duration vector corresponding to the duration of the possible disruption-event, and one or more extrinsic data vectors corresponding to one or more possible extrinsic data parameters associated with each point of time within the future time-period. The method may further include obtaining a demand for a target product in the future time-period from the trained ML model based on the input vectors.
In another embodiment, a system for predicting demand for a supply chain is disclosed. The system includes a processor and a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which, on execution, cause the processor to feed input vectors to a trained Machine Learning (ML) model for a future time-period. It should be noted that the input vectors comprise at least one of: an intensity vector corresponding to an intensity of a possible disruption-event at each point of time within the future time-period, a duration vector corresponding to the duration of the possible disruption-event, and one or more extrinsic data vectors corresponding to one or more possible extrinsic data parameters associated with each point of time within the future time-period. The processor-executable instructions further cause the processor to obtain a demand for a target product in the future time-period from the trained ML model based on the input vectors.
In yet another embodiment, a non-transitory computer-readable medium storing computer-executable instructions for predicting demand for a supply chain is disclosed. The stored instructions, when executed by a processor, may cause the processor to perform operations including feeding input vectors to a trained Machine Learning (ML) model for a future time-period. It should be noted that the input vectors comprise at least one of: an intensity vector corresponding to an intensity of a possible disruption-event at each point of time within the future time-period, a duration vector corresponding to the duration of the possible disruption-event, and one or more extrinsic data vectors corresponding to one or more possible extrinsic data parameters associated with each point of time within the future time-period. The operations may further include obtaining a demand for a target product in the future time-period from the trained ML model based on the input vectors.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles.
The following description is presented to enable a person of ordinary skill in the art to make and use the invention and is provided in the context of particular applications and their requirements. Various modifications to the embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Moreover, in the following description, numerous details are set forth for the purpose of explanation. However, one of ordinary skill in the art will realize that the invention might be practiced without the use of these specific details. In other instances, well-known structures and devices are shown in block diagram form in order not to obscure the description of the invention with unnecessary detail. Thus, the invention is not intended to be limited to the embodiments shown but is to be accorded the widest scope consistent with the principles and features disclosed herein.
While the invention is described in terms of particular examples and illustrative figures, those of ordinary skill in the art will recognize that the invention is not limited to the examples or figures described. Those skilled in the art will recognize that the operations of the various embodiments may be implemented using hardware, software, firmware, or combinations thereof, as appropriate. For example, some processes can be carried out using processors or other digital circuitry under the control of software, firmware, or hard-wired logic. (The term “logic” herein refers to fixed hardware, programmable logic and/or an appropriate combination thereof, as would be recognized by one skilled in the art to carry out the recited functions.) Software and firmware can be stored on computer-readable storage media. Some other processes can be implemented using analog circuitry, as is well known to one of ordinary skill in the art. Additionally, memory or other storage, as well as communication components, may be employed in embodiments of the invention.
A system 100 configured for predicting demand for a supply chain is illustrated in
In addition, the one or more extrinsic data vectors may be fed corresponding to one or more possible extrinsic data parameters associated with each point of time within the future time-period. The extrinsic data parameters may include, but are not limited to, competitors and market data parameters, macroeconomic data parameters, socio-economic data parameters, and consumer-specific data parameters associated with a target industry. Upon feeding the input vectors, the predicting device 102 may obtain a demand for a target product in the future time-period from the trained ML model based on the input vectors. It should be noted that the target product may be associated with the target industry. Once the demand is obtained from the ML model 104, the predicting device 102 may compare the predicted demand with an actual demand via the ML model 104. The comparison may be done to determine a magnitude of error of prediction. Upon determining the magnitude of error of prediction, the predicting device 102 may retrain the ML model 104 based on the magnitude of error of prediction.
Examples of the predicting device 102 may include, but are not limited to, a server, a desktop, a laptop, a notebook, a tablet, a smartphone, a mobile phone, an application server, or the like. The predicting device 102 may further include a memory 106, a processor 108, and the display 110. The display 110 may further include a user interface 112. As described above, the user may interact with the predicting device 102 and vice versa through the display 110.
By way of an example, the display 110 may be used to display intermediate results (i.e., historical demand data, disruption data, one or more extrinsic data parameters, sparse multivariate time series, training data vectors, loss function, etc.) based on actions performed by the predicting device 102, to a user. Moreover, the display 110 may be used to display the final result, i.e., the demand obtained for the target product and the magnitude of error of prediction.
By way of another example, the user interface 112 may be used by the user to provide inputs to the predicting device 102. Thus, for example, in some embodiments, the user may provide an input via the predicting device 102 that may include the input vectors. In another embodiment, the user may provide an input via the predicting device 102 that may include training data for training the ML model 104. Further, for example, in some embodiments, the predicting device 102 may render intermediate results (e.g., historical demand data, disruption data, one or more extrinsic data parameters, sparse multivariate time series, training data vectors, loss function, etc.) or final results (e.g., the demand obtained for the target product and the magnitude of error of prediction) to the user via the user interface 112.
The memory 106 may store instructions that, when executed by the processor 108, may cause the processor 108 to obtain the demand for the target product. As will be described in greater detail in conjunction with
The memory 106 may also store various data (e.g., the historical demand data, the disruption data, the one or more extrinsic parameters, the intensity vector, the duration vector, the one or more extrinsic data vectors, the demand obtained for the target product, etc.) that may be captured, processed, and/or required by the predicting device 102. The memory 106, in some embodiments, may also include the trained ML model 104. The memory 106 may be a non-volatile memory (e.g., flash memory, Read Only Memory (ROM), Programmable ROM (PROM), Erasable PROM (EPROM), Electrically EPROM (EEPROM) memory, etc.) or a volatile memory (e.g., Dynamic Random-Access Memory (DRAM), Static Random-Access memory (SRAM), etc.).
Further, the predicting device 102 may interact with a server 114 or user devices 120 over a network 118 for sending and receiving various data. The user devices 120 may be used by a plurality of users to provide their inputs, such as the input vectors, to the predicting device 102. Examples of the user devices 120 may include, but are not limited to, a laptop, a desktop, a smartphone, and a tablet. The network 118, for example, may be any wired or wireless communication network and the examples may include, but are not limited to, the Internet, Wireless Local Area Network (WLAN), Wi-Fi, Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WiMAX), and General Packet Radio Service (GPRS).
In some embodiments, the predicting device 102 may fetch the historical demand data, the disruption data, and the one or more extrinsic data parameters from the server 114. In addition, the server 114 may provide access to information (i.e., the input vectors) to the user. The server 114 may further include a database 116. The database 116 may store the historical demand data, the disruption data, and the one or more extrinsic data parameters for a reference time-period. By way of an example, the database 116 may store information associated with a disruption event, such as economic downturns in the market. The database 116 may be periodically updated with new information available for the disruption events.
Referring now to
Initially, the input 202 may be received by the data receiving module 204 via an interface ‘I1’. The input 202 may include intrinsic data and extrinsic data, such as historical sales data (H), disruptive events data (DS), and industry specific data (ID). In an embodiment, the historical sales data may also be referred to as the historical demand data. Further, the disruptive events data may also be referred to as the disruption data. In addition, the industry specific data may also be referred to as the one or more extrinsic data parameters. The received input 202 may have to be collated into a sparse multivariate time series by the sparse multivariate data collation module 212. The collated data may serve as a feed to a neural network model at any point of time. In reference to
Historical sales data, H: [H]_(a×1)    (1)
In the equation (1), ‘[H]’ may represent the historical sales data as a time series and ‘a’ may represent each point in time within the reference time-period of the historical sales data. Further, the disruptive event data, i.e., the disruption data collected for the reference time-period may be represented as depicted via the equation (2) below:
Disruptive events data, DS: [I D]_(a×2)    (2)
In equation (2), ‘I’ may represent an intensity of a disruption-event at each point of time within the reference time-period. In addition, ‘D’ may represent a duration of the disruption—event. Further, ‘a’ may represent each point in time within the reference time-period of the disruptive event data. Further, the industry specific data, i.e., the extrinsic data parameters may be represented as depicted via equation (3) below:
Industry specific data, ID: [CM, M, SE, C]_(a×b)    (3)
In equation (3), ‘CM’ may correspond to competitors and market data parameters as a time series. ‘M’ may be macroeconomic data parameters as a time series. ‘SE’ may be socio-economic data parameters as a time series. And ‘C’ may be consumer specific data parameters as a time series. Further, ‘a’ may represent each point in time with the reference time-period of industry specific data and ‘b’ may be number of features derived from ‘CM’, ‘M’, ‘SE’, and ‘C’. Further, a final input, i.e., a training data vectors, that may be fed to the neural network model for training of the neural network model may be represented as depicted via equation (4) below:
Final input, IN: [H I D ID]_(a×(3+b))    (4)
In equation (4), the final input ‘IN’ may include a historical data vector, an intensity vector, a duration vector, and one or more extrinsic data vectors corresponding to one or more extrinsic data parameters at each point of time within the reference time-period. Further, (3+b) may be the number of input nodes of the final input derived from the historical sales data (H), the disruptive events data (DS), and the industry specific data (ID).
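As a purely illustrative sketch, and not the claimed implementation, the collation described by equations (1) through (4) can be pictured in a few lines of Python. The array names, shapes, and random placeholder values below are assumptions introduced only for illustration:

    import numpy as np

    a, b = 104, 6                  # assumed: 104 points in the reference time-period, b = 6 extrinsic features

    H  = np.random.rand(a, 1)      # historical sales data [H], equation (1)
    I  = np.zeros((a, 1))          # intensity column of the disruptive events data DS, equation (2)
    D  = np.zeros((a, 1))          # duration column of the disruptive events data DS, equation (2)
    ID = np.random.rand(a, b)      # industry specific data [CM, M, SE, C], equation (3)

    # Final input IN of equation (4): one row per point of time, with 3 + b columns
    # (historical demand, intensity, duration, and the b extrinsic features).
    IN = np.hstack([H, I, D, ID])
    assert IN.shape == (a, 3 + b)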
The data receiving module 204 may be configured to receive the input 202. In other words, the data receiving module 204 may receive the intrinsic data, i.e., data internal to an organization (i.e., the target industry), via the interface ‘I1’. In addition, the data receiving module 204 may receive the extrinsic data, i.e., data external to the organization that can affect future demand. In an embodiment, the intrinsic data may include the historical demand data of the organization. The historical demand data may include store-level data, product-level data, and stock keeping unit (SKU) level data of various channels and regions of the organization. The received intrinsic data may be used to extract normal trends and patterns related to historical sales of the organization. Further, the extrinsic data may include the disruption data and the one or more extrinsic data parameters. The disruption data may include the intensity and the duration of the disruption-event. In addition, the one or more extrinsic data parameters may include competitors and market data parameters, macroeconomic data parameters, socio-economic data parameters, and consumer-specific data parameters. The received extrinsic data may be used to sense deviations from normal trends and adjust demand forecasts accordingly. Upon receiving the input 202, the data receiving module 204 may be configured to provide the received input 202 to the other corresponding modules of the system 200 for further processing.
The data receiving module 204 may provide the intrinsic data to the intrinsic data processing module 206 via a connection ‘C1’. Upon receiving the intrinsic data, the intrinsic data processing module 206 may be configured to process the intrinsic data in order to obtain the historical demand data as a time series. In an embodiment, the historical demand data may include the store-level data, the product-level data, and the SKU-level data of various channels and regions of the organization. Upon obtaining the historical demand data as the time series, the intrinsic data processing module 206 may be configured to provide the obtained time series of the historical demand data to the sparse multivariate data collation module 212 via a connection ‘C4’.
Further, the data receiving module 204 may be configured to provide the one or more extrinsic data parameters to the industry specific data processing module 208 via a connection ‘C2’. Upon receiving the one or more extrinsic data parameters, the industry specific data processing module 208 may be configured to process the one or more extrinsic data parameters in order to identify variations in demand specific to the industry in question, i.e., the target industry. In an embodiment, the one or more extrinsic data parameters may be considered particular to the target industry and may be used as pointers to explain variation in demand for the target industry. The one or more extrinsic data parameters may need to be contextualized specific to the target industry. The one or more extrinsic data parameters may include competitors and market data parameters, macroeconomic data parameters, socio-economic data parameters, consumer-specific data parameters, etc., as time series. In reference to above explanation in the present
Further, the data receiving module 204 may be configured to provide the disruption data to the disruption data processing module 210. Upon receiving the disruption data, the disruption data processing module 210 may process the disruption data with disruption factors in order to identify a disruption-event within the reference time-period. Upon identifying the disruption-event, the disruption data processing module 210 may be configured to provide information related to the identified disruption-event to the disruption specific module 214 via a connection ‘C6’.
Upon receiving the information related to the disruption-event, the disruption specific module 214 may be configured to build a disruption specific model. The disruption specific model may help to capture variation in the demand caused by the disruption-event, based on the information about the disruption-event. As will be appreciated, the disruption specific model may be required when the identified disruption-event is not impulsive and may last for a certain period of time, for instance, COVID-19, weather specific disruptions, etc. In addition, the disruption specific model may also be used for modeling based on the impulsiveness of the disruption-event. The built disruption specific model may be able to provide information related to the intensity of the disruption-event at any point of time. Moreover, the disruption specific model may provide information related to the duration of the disruption-event, i.e., the time duration for which the disruption-event lasts with the current trend, in order to identify the magnitude of impact of the disruption-event.
In an embodiment, the intensity and the duration information of the disruption event may help in performing what-if-analysis based on uncertainty of the disruption-event. Further, future projections may help in building the neural network model for demand projection. In reference to above explanation in the present
I: [0, 0, 0, …, I_t, I_(t+1), I_(t+2), …, I_(t+n), 0]    (5)
In equation (5), ‘t’ may represent each point of time of the disruption-event, where ‘t’ depicts start time of the disruption-event. In addition, ‘t+n’ may represent end time of the disruption-event. Further, the duration of disruption, i.e., ‘D’ may be represented as depicted via equation (6) below:
D: [0, 0, 0, …, D_t, D_(t+1), D_(t+2), …, D_(t+n), 0]    (6)
In equation (6), ‘t’ may represent each point of time of the disruption-event, where ‘t’ depicts start time at which the disruption-event. In addition, ‘t+n’ may represent end time of the disruption-event. Upon identifying the intensity and the duration of the disruption-event, the disruption specific module 214 may be configured to provide this information to the sparse multivariate data collation module 212, via a connection ‘C7’.
The sparse multivariate data collation module 212 may be configured to receive the historical demand data, the one or more extrinsic data parameters, and the disruption data from the intrinsic data processing module 206, the industry specific data processing module 208, and the disruption specific module 214 over the connections ‘C4’, ‘C5’, and ‘C7’, respectively. Upon collating the historical demand data, the one or more extrinsic data parameters, and the disruption data, the sparse multivariate data collation module 212 may be configured to generate the sparse multivariate time series. Once the sparse multivariate time series is generated, the sparse multivariate data collation module 212 may be configured to provide the sparse multivariate time series to the demand prediction module 216 via a connection ‘C8’. The sparse multivariate time series may be provided to the demand prediction module 216 for training the neural network model for performing future demand prediction. In reference to
The demand prediction module 216 may be configured to train and build the neural network model so that it can provide demand prediction with maximum accuracy. In order to train the neural network model, the sparse multivariate time series may be used to generate the training data vectors. The generated training data vectors may be used as an input for the neural network model and may correspond to the final input as depicted via equation (4). The neural network model may include a plurality of hidden layers for capturing localized and relevant sequences of the sparse multivariate time series in order to capture unexpected variations in the demand. In an embodiment, the neural network model may correspond to a forward-looking model that may adapt to a new environment. In an embodiment, when the path of the disruption-event starts recovering, the neural network model starts transitioning from the newly adapted path to the actual path by adjusting the weight between the historical path and the new path formed based on the disruption data and the one or more extrinsic data parameters. Based on the processing of the final input by the demand prediction module 216 via the neural network model, the output 220, i.e., the predicted demand, may be generated and rendered at each point in time via an interface ‘I2’. Based on the generated output 220, a loss function may need to be specified for each point in time within the reference time-period. Once the loss function is specified, the demand prediction module 216 may be configured to train the neural network model on the provided inputs (i.e., the final inputs) until the specified loss function is minimized. Further, the demand prediction module 216 may be configured to share the predicted demand with the evaluation module 218 via a bidirectional connection ‘C9’.
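One possible way to realize such a model, offered only as a minimal sketch under assumed layer sizes and not as the specific architecture of the demand prediction module 216, is a small feed-forward network that maps a window of the (3+b)-wide sparse multivariate series to a multi-step demand forecast:

    import torch
    import torch.nn as nn

    class DemandNet(nn.Module):
        """Minimal sketch: maps a window of the (3 + b)-wide multivariate series
        to a multi-step demand forecast. Layer sizes are illustrative assumptions."""
        def __init__(self, n_features: int, window: int, steps: int):
            super().__init__()
            self.net = nn.Sequential(
                nn.Flatten(),                        # (batch, window, n_features) -> (batch, window * n_features)
                nn.Linear(window * n_features, 64),  # hidden layers capture localized, relevant sequences
                nn.ReLU(),
                nn.Linear(64, 32),
                nn.ReLU(),
                nn.Linear(32, steps),                # multi-step output, one value per forecast step
            )

        def forward(self, x):
            return self.net(x)

    model = DemandNet(n_features=3 + 6, window=12, steps=4)   # assumed b = 6, 12-period window, 4-step forecast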
The evaluation module 218 may be configured to receive the predicted demand from the demand prediction module 216 in order to evaluate the predicted demand against the actual demand. In order to evaluate the predicted demand for each store, product, or SKU, the evaluation module 218 may use a measurement unit, such as absolute percentage error. Based on the measurement unit, the evaluation module 218 may perform an evaluation of the predicted demand to determine whether the magnitude of error of prediction is in line with business requirements. In other words, the recovery path of the predicted demand may be evaluated against the historic learning path in order to provide explainability for the weights associated with the disruption-event, which eventually fade off as the values of the final input change back to zero.
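For instance, the absolute percentage error mentioned above may be computed per store, product, or SKU roughly as follows; this is a generic formula offered for clarity, with the epsilon guard being an added assumption to avoid division by zero when actual demand is zero:

    import numpy as np

    def mean_absolute_percentage_error(actual, predicted, eps=1e-8):
        """Mean absolute percentage error across a forecast horizon."""
        actual = np.asarray(actual, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        return np.mean(np.abs(actual - predicted) / np.maximum(np.abs(actual), eps)) * 100.0

The resulting percentage may then be judged against the business tolerance defined for that store, product, or SKU.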
Further, the output 220 generated based on the processing done by the demand prediction module 216 may represent a prediction function with its shape equivalent to the multi-step prediction parameter. The prediction function may be represented as [P]_(step×1), where ‘P’ is a function of [‘H’, ‘I’, ‘D’, ‘ID’].
An advantage of the proposed mechanism over conventional mechanisms is that it leverages Artificial Intelligence (AI) to accurately forecast demand for various sales channels at any given point of time. This may be done by capturing the impact of the disruption-event on changing consumer behavior and purchase patterns. By effectively capturing the demand, the proposed mechanism may provide AI-driven outcomes related to SCM. The AI-driven outcomes may include, but are not limited to, vendor and inventory management, pricing insights, and product prioritization strategies. As the AI-driven outcomes exist along upstream and downstream components of the supply chain, the proposed mechanism may provide organizations with supply chain readiness and diversity against various types of disruption.
Additionally, the proposed mechanism may provide accurate demand prediction for different sales channels such as stores, products, SKUs, retail, and e-commerce, by incorporating factors specific to the disruption-event and the change in behavior of consumers subjected to changing dynamics. Further, the historical demand data may be used in tandem with different types of industry and market data. Moreover, in order to understand the functioning of the target industry, and the economic and geographic specific dynamics during the disruption-event, the extrinsic data parameters may be used. The extrinsic data parameters may include, but are not limited to, competitors and market data parameters, macroeconomic data parameters, socio-economic data parameters, and consumer-specific data parameters associated with the target industry. As will be appreciated, the proposed mechanism may utilize all of the collected information (i.e., the input 202) using a novel methodology to capture relevant signals in the sparse multivariate time series. Moreover, the architecture of the neural network model used for the sparse multivariate time series may provide accuracy for irregular sparse multivariate time series or when historical information is not available for some features.
Referring now to
Once the ML model is trained, at step 304, input vectors may be fed to the trained ML model for a future time-period. The input vectors may include at least one of an intensity vector, a duration vector, and one or more extrinsic data vectors. In an embodiment, the intensity vector may correspond to an intensity of a possible disruption-event at each point of time within the future time-period. Further, the duration vector may correspond to the duration of the possible disruption-event. In addition, the one or more extrinsic data vectors may correspond to one or more possible extrinsic data parameters associated with each point of time within the future time-period. Upon feeding the input vectors to the trained ML model, at step 306, a demand for a target product in the future time-period may be obtained from the trained ML model based on the input vectors. In an embodiment, the target product may be associated with the target industry.
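A hedged sketch of step 304 and step 306, reusing the hypothetical DemandNet model and input layout from the earlier sketches (the placeholder values and the use of zeros where future history is unknown are assumptions, not part of the claimed method):

    import numpy as np
    import torch

    window, b = 12, 6
    future_I  = np.zeros((window, 1))          # intensity vector for the future time-period
    future_D  = np.zeros((window, 1))          # duration vector for the future time-period
    future_ID = np.random.rand(window, b)      # one or more extrinsic data vectors
    future_H  = np.zeros((window, 1))          # placeholder column where future history is unknown

    future_IN = torch.tensor(np.hstack([future_H, future_I, future_D, future_ID]),
                             dtype=torch.float32).unsqueeze(0)   # shape (1, window, 3 + b)

    model.eval()
    with torch.no_grad():
        predicted_demand = model(future_IN)    # demand for the target product, one value per forecast step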
Referring now to
Upon generating the sparse multivariate time series, at step 410, training data vectors may be generated based on the sparse multivariate time series. The generated training data vectors may include a historical data vector, an intensity vector, a duration vector, and one or more extrinsic data vectors. In an embodiment, the historical data vector may be generated corresponding to historical demand data at each point of time within the reference time-period. Further, the intensity vector may be generated corresponding to the intensity of the disruption-event at each point of time within the reference time-period. In addition, the duration vector may be generated corresponding to the duration of the disruption-event. Moreover, the one or more extrinsic data vectors may be generated corresponding to one or more extrinsic data parameters at each point of time within the reference time-period.
Referring now to
{(IN_1, L_1(D_actual, D_predicted)), …, (IN_i, L_i(D_actual, D_predicted))}    (7)
Once the loss function is specified, at step 504, the specified loss function may be used to train the ML model until the loss function for each point of time in the reference time-period is minimized. In other words, the specified loss function may be backpropagated through the ML model in order to adjust the weight matrix ‘θ’. By way of an example, for each input matrix and the associated loss function, i.e., each (‘IN_i’, ‘L_i’) pair, the ML model may train itself until the loss function minimization is achieved. The minimized loss function may be represented as min L_i(D_actual, D_predicted). Once all (‘IN_i’, ‘L_i’) pairs are backpropagated and the loss function minimization is achieved, the ML model may be ready for predicting the demand (also referred to as demand forecasting). In reference to
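A minimal training-loop sketch of this step, assuming the hypothetical DemandNet model above, a mean squared error standing in for the loss L_i(D_actual, D_predicted), and an assumed train_loader that yields (‘IN_i’, ‘D_actual’) pairs built from the training data vectors:

    import torch
    import torch.nn as nn

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # illustrative optimizer and learning rate
    loss_fn = nn.MSELoss()                                      # stands in for L_i(D_actual, D_predicted)

    for epoch in range(200):                                    # iterate until the loss stops improving
        for IN_i, D_actual in train_loader:                     # assumed loader of training pairs
            optimizer.zero_grad()
            D_predicted = model(IN_i)
            loss = loss_fn(D_predicted, D_actual)
            loss.backward()                                     # backpropagate to adjust the weight matrix θ
            optimizer.step()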
Referring now to
Once the ML model is trained, the trained ML model may be used to predict future demand as per the provided input vectors. The input vectors may include the intensity vector, the duration vector, and the one or more extrinsic data vectors. The trained ML model may be used to estimate both short-term as well as long-term demand based on business needs of the target industry. Further, depending upon the uncertainty of the demand prediction for the future time-period, what-if-analysis may be simulated using the trained ML model. The what-if-analysis may be simulated based on factors considered for building the ML model, and accordingly business outcomes (i.e., the demand prediction) may be obtained for supporting business decisions. Since the ML model is a forward-looking model, at each stage, once the actual demand is available, it may be incorporated in the input vectors as a feedback channel to update the ML model with the latest behavior change and improve the performance of the ML model. Further, the demand predictions available for various stores, products, and SKUs from various channels and regions of the target industry may be used to derive actionable insights in order to help in better decision making with respect to inventory planning, pricing of products, and sales strategy. In reference to
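What-if-analysis of this kind could, for example, amount to re-running the trained model on the same future window with alternative intensity vectors; the sketch below reuses the hypothetical names from the earlier sketches, and the mild and severe scenario values are invented for illustration only:

    import numpy as np
    import torch

    def forecast(model, H, I, D, ID):
        """Assemble the input vectors as in equation (4) and return the model's forecast."""
        IN = torch.tensor(np.hstack([H, I, D, ID]), dtype=torch.float32).unsqueeze(0)
        model.eval()
        with torch.no_grad():
            return model(IN)

    mild_I = future_I.copy()
    mild_I[:4] = 0.3                         # short, low-intensity disruption scenario
    severe_I = np.full_like(future_I, 1.0)   # sustained, high-intensity disruption scenario

    demand_mild   = forecast(model, future_H, mild_I,   future_D, future_ID)
    demand_severe = forecast(model, future_H, severe_I, future_D, future_ID)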
Referring now to
Upon identifying and collecting the intrinsic data and the extrinsic data, at step 704, the disruption specific model may be built based on the disruption data. In order to build the disruption specific model, the disruption data may be processed to identify the disruption-event that may be used by the disruption specific model. As explained in the
Once the disruption specific model is built, at step 706, the one or more extrinsic data parameters (i.e., the industry specific data) may be processed to identify variations in the demand specific to the target industry. The one or more extrinsic data parameters may be considered particular to the target industry and are used as pointers to explain the variation in the demand for the target industry. The one or more extrinsic data parameters may need to be contextualized specific to the target industry. The extrinsic data parameters may include, but are not limited to, competitors and market data parameters, macroeconomic data parameters, socio-economic data parameters, and consumer-specific data parameters associated with the target industry. In reference to
Upon processing the one or more extrinsic data parameters, at step 708, the neural network model may be trained in order to obtain the demand prediction with maximum accuracy. In reference to
In order to train the neural network model, the generated final input may be used as an input for the neural network model. The final input may be provided as an input matrix ‘IN’ to the neural network model. Further, upon receiving the final input, the neural network model may identify weights to express an optimal relationship between each variable and the demand. These weights may be represented via a weight matrix ‘θ’ as depicted via equation (8):
Upon receiving the input matrix ‘IN’ and the weight matrix ‘θ’, the neural network model may define a mapping between the input matrix ‘IN’ and the weight matrix ‘θ’ as represented via equation (9) below:
[P]_(step×1) = f(IN; θ)    (9)
In the above equation (9), ‘f’ may depict the function for mapping the input matrix ‘IN’ with the weight matrix ‘θ’. Once the mapping is generated, the neural network model may predict the demand (D) via ‘f(IN; θ)’ for each time-period ‘t’. The predicted demand (D) may be represented as depicted via equation (10) below:
{D_predicted(t_1), D_predicted(t_2), …, D_predicted(t_n)}    (10)
Once the neural network model is trained, at step 710, the trained neural network model may be used to predict the demand for the future time-period. In order to predict the demand, the input vectors may be fed as an input to the trained neural network model. In an embodiment, the input vectors may include the intensity vector, the duration vector, and the one or more extrinsic data vectors. Upon receiving the input vectors, the trained neural network model may obtain the demand for the target product in the future time-period. In other words, based on the received input vectors, the neural network model may be used to obtain the demand for both the short term and the long term based on the business requirements of the target industry.
Since the neural network model is a forward-looking model, once the demand is obtained, the neural network model may be retrained based on the input vectors and the obtained demand in order to update the neural network model for a future time-period. The obtained demand for the stores, the products, or the SKUs from various channels may be used to derive actionable insights in order to make better decisions with respect to inventory planning, pricing of products, and sales strategy.
Further, at step 712, the predicted demand (i.e., the obtained demand) may be compared against the actual demand by utilizing various measurements, e.g., absolute percentage error. The comparison is done to determine whether the magnitude of error of prediction is in line with the business requirements of the target industry. Based on the determination of the magnitude of error of prediction, the neural network model may be retrained. This has already been explained above in reference to the
Referring now to
Processor 804 may be disposed in communication with one or more input/output (I/O) devices via an I/O interface 806. The I/O interface 806 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.11a/b/g/n/x, Bluetooth, cellular (for example, code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
Using I/O interface 806, computer system 802 may communicate with one or more I/O devices. For example, an input device 808 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, sensor (for example, accelerometer, light sensor, GPS, gyroscope, proximity sensor, or the like), stylus, scanner, storage device, transceiver, video device/source, visors, etc. An output device 810 may be a printer, fax machine, video display (for example, cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED), plasma, or the like), audio speaker, etc. In some embodiments, a transceiver 812 may be disposed in connection with processor 804. Transceiver 812 may facilitate various types of wireless transmission or reception. For example, transceiver 812 may include an antenna operatively connected to a transceiver chip (for example, TEXAS® INSTRUMENTS WILINK WL1286© transceiver, BROADCOM® BCM45501UB8® transceiver, INFINEON TECHNOLOGIES® X-GOLD 618-PMB9800® transceiver, or the like), providing IEEE 802.11a/b/g/n, Bluetooth, FM, global positioning system (GPS), 2G/3G HSDPA/HSUPA communications, etc.
In some embodiments, processor 804 may be disposed in communication with a communication network 814 via a network interface 816. Network interface 816 may communicate with communication network 814. Network interface 816 may employ connection protocols including, without limitation, direct connect, Ethernet (for example, twisted pair 10/100/1000 Base T), transmission control protocol/internet protocol (TCP/IP), token ring, IEEE 802.11 a/b/g/n/x, etc. Communication network 814 may include, without limitation, a direct interconnection, local area network (LAN), wide area network (WAN), wireless network (for example, using Wireless Application Protocol), the Internet, etc. Using network interface 816 and communication network 814, computer system 802 may communicate with devices 818, 820, and 822. These devices may include, without limitation, personal computer(s), server(s), fax machines, printers, scanners, various mobile devices such as cellular telephones, smartphones (for example, APPLE© IPHONE® smartphone, BLACKBERRY® smartphone, ANDROID® based phones, etc.), tablet computers, eBook readers (AMAZON® KINDLE® reader, NOOK® tablet computer, etc.), laptop computers, notebooks, gaming consoles (MICROSOFT® XBOX® gaming console, NINTENDO© DS© gaming console, SONY® PLAYSTATION® gaming console, etc.), or the like. In some embodiments, computer system 802 may itself embody one or more of these devices.
In some embodiments, processor 804 may be disposed in communication with one or more memory devices (for example, RAM 826, ROM 828, etc.) via a storage interface 824. Storage interface 824 may connect to memory 830 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as serial advanced technology attachment (SATA), integrated drive electronics (IDE), IEEE-1394, universal serial bus (USB), fiber channel, small computer systems interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, redundant array of independent discs (RAID), solid-state memory devices, solid-state drives, etc.
Memory 830 may store a collection of program or database components, including, without limitation, an operating system 832, user interface 834, web browser 836, mail server 838, mail client 840, user/application data 842 (for example, any data variables or data records discussed in this disclosure), etc. Operating system 832 may facilitate resource management and operation of computer system 802. Examples of operating systems 832 include, without limitation, APPLE® MACINTOSH® OS X platform, UNIX platform, Unix-like system distributions (for example, Berkeley Software Distribution (BSD), FreeBSD, NetBSD, OpenBSD, etc.), LINUX distributions (for example, RED HAT©, UBUNTU®, KUBUNTU®, etc.), IBM© OS/2 platform, MICROSOFT® WINDOWS® platform (XP, Vista/7/8, etc.), APPLE© IOS® platform, GOOGLE® ANDROID® platform, BLACKBERRY® OS platform, or the like. User interface 834 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, user interfaces may provide computer interaction interface elements on a display system operatively connected to computer system 802, such as cursors, icons, check boxes, menus, scrollers, windows, widgets, etc. Graphical user interfaces (GUIs) may be employed, including, without limitation, APPLE® Macintosh® operating systems' AQUA® platform, IBM® OS/2® platform, MICROSOFT® WINDOWS® platform (for example, AERO® platform, METRO® platform, etc.), UNIX X-WINDOWS, web interface libraries (for example, ACTIVEX® platform, JAVA® programming language, JAVASCRIPT© programming language, AJAX® programming language, HTML, ADOBE® FLASH® platform, etc.), or the like.
In some embodiments, computer system 802 may implement a web browser 836 stored program component. Web browser 836 may be a hypertext viewing application, such as MICROSOFT® INTERNET EXPLORER® web browser, GOOGLE® CHROME® web browser, MOZILLA® FIREFOX® web browser, APPLE® SAFARI® web browser, etc. Secure web browsing may be provided using HTTPS (secure hypertext transport protocol), secure sockets layer (SSL), Transport Layer Security (TLS), etc. Web browsers may utilize facilities such as AJAX, DHTML, ADOBE® FLASH® platform, JAVASCRIPT® programming language, JAVA® programming language, application programming interfaces (APIs), etc. In some embodiments, computer system 802 may implement a mail server 838 stored program component. Mail server 838 may be an Internet mail server such as MICROSOFT© EXCHANGE© mail server, or the like. Mail server 838 may utilize facilities such as ASP, ActiveX, ANSI C++/C#, MICROSOFT .NET® programming language, CGI scripts, JAVA© programming language, JAVASCRIPT® programming language, PERL® programming language, PHP® programming language, PYTHON© programming language, WebObjects, etc. Mail server 838 may utilize communication protocols such as internet message access protocol (IMAP), messaging application programming interface (MAPI), Microsoft Exchange, post office protocol (POP), simple mail transfer protocol (SMTP), or the like. In some embodiments, computer system 802 may implement a mail client 840 stored program component. Mail client 840 may be a mail viewing application, such as APPLE MAIL® mail-client, MICROSOFT ENTOURAGE® mail client, MICROSOFT OUTLOOK® mail client, MOZILLA THUNDERBIRD® mail client, etc.
In some embodiments, computer system 802 may store user/application data 842, such as the data, variables, records, etc. as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as ORACLE® database or SYBASE® database. Alternatively, such databases may be implemented using standardized data structures, such as an array, hash, linked list, struct, structured text file (for example, XML), table, or as object-oriented databases (for example, using OBJECTSTORE® object database, POET® object database, ZOPE® object database, etc.). Such databases may be consolidated or distributed, sometimes among the various computer systems discussed above in this disclosure. It is to be understood that the structure and operation of any computer or database component may be combined, consolidated, or distributed in any working combination.
Various embodiments provide a method and system for predicting demand for a supply chain. The disclosed method and system may feed input vectors to a trained Machine Learning (ML) model for a future time-period. The input vectors may include at least one of: an intensity vector corresponding to an intensity of a possible disruption-event at each point of time within the future time-period, a duration vector corresponding to the duration of the possible disruption-event, and one or more extrinsic data vectors corresponding to one or more possible extrinsic data parameters associated with each point of time within the future time-period. Further, the disclosed method and system may obtain a demand for a target product in the future time-period from the trained ML model based on the input vectors.
The method and system provide several advantages. For example, the method and system may capture the impact of various disruption events on future demand by providing forecasts that support a detailed level of analysis related to stores, products, or SKUs. Further, the method and system may enable a user to easily capture the impact of a disruption event due to the use of Artificial Intelligence in time-modelling. In addition, the method and system may provide higher accuracy in determining the impact of a disruption event both during the disruption event and after its completion.
It will be appreciated that, for clarity purposes, the above description has described embodiments of the invention with reference to different functional units and processors. However, it will be apparent that any suitable distribution of functionality between different functional units, processors or domains may be used without detracting from the invention. For example, functionality illustrated to be performed by separate processors or controllers may be performed by the same processor or controller. Hence, references to specific functional units are only to be seen as references to suitable means for providing the described functionality, rather than indicative of a strict logical or physical structure or organization.
Although the present invention has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the present invention is limited only by the claims. Additionally, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in accordance with the invention.
Furthermore, although individually listed, a plurality of means, elements or process steps may be implemented by, for example, a single unit or processor. Additionally, although individual features may be included in different claims, these may possibly be advantageously combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. Also, the inclusion of a feature in one category of claims does not imply a limitation to this category, but rather the feature may be equally applicable to other claim categories, as appropriate.