The subject matter described herein relates, in general, to predicting a state-of-health (SOH) of a battery at a future time and, more particularly, to predicting a SOH of a battery using an ensemble of machine learning models.
As the popularity of electric vehicles (EVs) rises, the burden of replacing faulty batteries will rapidly increase. Moreover, current regulations require original equipment manufacturers (OEMs) to provide accurate information to their customers about the state of their batteries, including information about the state-of-health (SOH) of the batteries. Accordingly, systems and methods to evaluate the state of batteries are widely used, and the accuracy of such systems and methods is very important. Traditional systems and methods used to evaluate the state of batteries often rely on electrochemical lab measurements, which are not only expensive and time-consuming to collect but can also be performed only in qualified labs. In recent years, digital modeling and simulation of battery states have improved; such methods either simplify the battery into an equivalent circuit or consider the more complex reactions and mass transport within the battery. Such methods, however, still rely on knowledge of the physical condition of each battery, for example, its resistance and impedance, which are costly to measure in a lab.
In one embodiment, example systems and methods relate to predicting a future state-of-health (SOH) of a battery. In one approach, a prediction module employs a multi-part process for predicting a future SOH of a battery.
In a first part, in one approach, a prediction module trains a machine learning-based estimation model to predict current SOH values of batteries using lab-measured SOH values. The estimation model permits generating historical SOH data for the batteries so that time-consuming, expensive lab measurements of the SOH values do not have to be collected. To train the estimation model, in one example, the prediction module uses battery data of the batteries at the same time point as the lab measurements as the training data for the estimation model. Additionally, in one approach, the prediction module uses the lab-measured SOH values as a supervising signal for the estimation model to generate a loss value according to corresponding predictions and update the estimation model.
In a second part, in one approach, the prediction module applies the estimation model to the battery data, which includes, in one example, data from the production date of the battery until the current moment, in order to acquire the historical SOH data. The prediction module later uses the historical SOH data as training data to train an ensemble of machine-learning models used to predict a future SOH of a new battery. In one approach, the prediction module then constructs a SOH history curve using the historical SOH data and acquires a mileage curve, which is a curve of the mileage of the respective vehicles associated with the batteries as a function of time. In one approach, the prediction module filters the historical SOH data to ensure the historical SOH data is suitable for use as the training data for the ensemble of models. In one example, filtering the historical SOH data includes identifying outlying data points on the SOH history curve, as well as cross-checking the historical SOH data with the mileage curve to identify significant changes in use of the battery that might affect the smoothness of the historical SOH data.
In one approach, as mentioned above, the prediction module then uses the filtered historical SOH data as training data for the ensemble of models, which are then used to predict future SOH values according to battery data. In one example, the prediction module characterizes the historical SOH data as either early time sequency data or as late time sequency data. In one approach, the battery data corresponding to the early time sequency data is used as training data for each model in the ensemble of models, and the late time sequency data is used as the supervising signal for each model in the ensemble of models to generate a loss value for each model and update each model. For at least some of the models, in one approach, the prediction module characterizes the historical SOH data differently with respect to the SOH history curve in order to obtain an ensemble of differently-trained models, which may be advantageous to account for variations in the data of the new battery for which the prediction module will predict a future SOH. In other words, in one example, the early time sequency data and the late time sequency data for each model are located at different time points of the SOH history curve.
In a third part, in one approach, the prediction module generates a prediction error for each model in the ensemble. In one example, the prediction error is a measurement of the accuracy of each model based on various prediction criteria, including the time length of the battery data and the time to the future SOH prediction. Based on the prediction error, in one approach, the prediction module creates an assignment of each model to the prediction criteria in order to determine the most accurate model for the prediction criteria of a new battery. In one approach, the prediction module then identifies a selected model for the new battery based on the assignment and uses new battery data as the input to the selected model to predict a future SOH for the new battery.
According to the systems and methods described herein, the ability to accurately predict the SOH of batteries at future times will provide additional avenues to create more valuable products such as better warranty and insurance policies, enable secondary usage of batteries, and help prevent life-threatening accidents due to battery failure. Accordingly, the systems and methods described herein provide cost-friendly and accurate predictions of the future SOH of a battery without electrochemical lab testing of the battery itself. The systems and methods described herein have the benefit of evaluating the battery individually and independently of other batteries to avoid errors caused by averaging the behavior of a group of batteries, as well as using data collected during driving of the vehicle containing the battery in order to consider the environmental effects on the battery SOH to provide a more accurate SOH prediction.
In one embodiment, a system includes a memory communicably coupled to a processor and storing instructions that, when executed by the processor, cause the processor to acquire battery data of a battery for which a future state-of-health (SOH) is to be predicted. The instructions also cause the processor to identify a selected model from an ensemble of models according to at least a time to the future SOH and a time length of the battery data. The instructions further cause the processor to predict the future SOH using the battery data as an input to the selected model.
In another embodiment, a non-transitory computer-readable medium includes instructions that, when executed by one or more processors, cause the one or more processors to acquire battery data of a battery for which a future state-of-health (SOH) is to be predicted. The instructions also cause the one or more processors to identify a selected model from an ensemble of models according to at least a time to the future SOH and a time length of the battery data. The instructions further cause the one or more processors to predict the future SOH using the battery data as an input to the selected model.
In yet another embodiment, a method includes acquiring battery data of a battery for which a future state-of-health (SOH) is to be predicted. The method also includes identifying a selected model from an ensemble of models according to at least a time to the future SOH and a time length of the battery data. The method further includes predicting the future SOH using the battery data as an input to the selected model.
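The following Python sketch illustrates the claimed sequence of operations in a minimal, non-limiting way: acquire the battery data, identify a selected model from the ensemble according to the time to the future SOH and the time length of the battery data, and predict the future SOH with the selected model. The helper structures (the ensemble dictionary, the assignment lookup, and the criteria encoding) are assumptions used only for illustration.

```python
# Minimal sketch of the claimed flow. `ensemble` and `assignment` are
# hypothetical stand-ins: `ensemble` maps model names to trained predictors,
# and `assignment` maps prediction criteria to the best model name.
from typing import Callable, Dict, Tuple

import numpy as np

def predict_future_soh(
    battery_data: np.ndarray,           # shape: (num_time_points, num_features)
    months_of_history: int,             # time length of the battery data
    months_to_prediction: int,          # time to the future SOH
    ensemble: Dict[str, Callable[[np.ndarray], float]],
    assignment: Dict[Tuple[int, int], str],
) -> float:
    # Identify the selected model for these prediction criteria.
    model_name = assignment[(months_of_history, months_to_prediction)]
    selected_model = ensemble[model_name]
    # Predict the future SOH using the battery data as the input.
    return selected_model(battery_data)
```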
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate various systems, methods, and other embodiments of the disclosure. It will be appreciated that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one embodiment of the boundaries. In some embodiments, one element may be designed as multiple elements, or multiple elements may be designed as one element. In some embodiments, an element shown as an internal component of another element may be implemented as an external component and vice versa. Furthermore, elements may not be drawn to scale.
Systems, methods, and other embodiments associated with predicting a future state-of-health (SOH) of a battery are disclosed. In a first part, in one approach, a prediction module of a prediction system for predicting a future SOH of a battery trains an estimation model to predict current SOH values of batteries using lab-measured SOH values of the batteries. Using the estimation model, the prediction module generates historical SOH data for the batteries. The prediction module, in one example, trains the estimation model using battery data as the training data for the estimation model. The prediction module, in one approach, also uses the lab-measured data as a supervising signal for the estimation model.
In one approach of a second part, the prediction module applies the estimation model to the battery data to generate historical SOH data for later use as training data for an ensemble of machine-learning models used to predict a future SOH of a new battery. In one approach, using the historical SOH data, the prediction module constructs a SOH history curve and acquires a mileage curve of the mileage of the batteries as a function of time. In one approach, the prediction module filters the historical SOH data before use as the training data for the ensemble of models. The prediction module, in one approach, then uses the filtered historical SOH data as training data for each model in the ensemble. Before use as the training data, in one example, the prediction module characterizes the historical SOH data as either early time sequency data or as late time sequency data. In one approach, the battery data corresponding to the early time sequency data is used as training data for each model in the ensemble of models, and the late time sequency data is used as the supervising signal for each model in the ensemble of models to generate a loss value for each model and update each model. In one approach, the prediction module characterizes the historical SOH data differently for each model in the ensemble. In other words, in one example, the early time sequency data and the late time sequency data for each model are located at different time points of the SOH history curve.
In one approach of a third part, for each model, the prediction module generates a prediction error, which in one example is a measurement of the accuracy of each model based on various prediction criteria. In one approach, the prediction criteria include the time length of the battery data and the time to the future SOH prediction. Based on the prediction error, in one approach, the prediction module creates an assignment of each model to the prediction criteria in order to determine the most accurate model for the prediction criteria of a new battery. In one approach, the prediction module then identifies a selected model for the new battery based on the assignment and uses new battery data as the input to the selected model to predict a future SOH for the new battery.
The systems and methods described herein provide the ability to accurately predict the SOH of a battery at a future time. Accurate future SOH predictions of batteries may lead to the creation of better warranty and insurance policies, enable secondary usage of batteries, and help prevent accidents due to battery failure. The systems and methods described herein also provide cost-friendly predictions of the future SOH of a battery without expensive lab testing of the battery itself. The systems and methods described herein further have the benefit of evaluating the battery individually and independently of other batteries to avoid errors caused by averaging the behavior of a group of batteries, as well as using data collected during driving of the vehicle containing the battery in order to consider the environmental effects on the battery SOH to provide a more accurate SOH prediction.
Referring now to
With further reference to
Furthermore, the prediction system 100 includes a data store 140. The data store 140 is, in one arrangement, an electronic data structure stored in the memory 120 or another electronic medium, and that is configured with routines that can be executed by the processor 110 for analyzing stored data, providing stored data, organizing stored data, and so on. Thus, in one embodiment, the data store 140 stores data used by the prediction module 130 in executing various functions. For example, as depicted in
Continuing with the elements shown in the data store 140, the battery data 150 includes, for example, data regarding the batteries of EVs during the driving and resting of the EVs. As described in further detail below, in one approach, the processor 110 uses the battery data 150 as the training data for an estimation model, and the prediction module 130 trains the estimation model to estimate a SOH for each battery using the battery data 150. The battery data 150 includes data of a number of batteries of EVs. For example, the battery data 150 includes data of around 10,000 batteries. The battery data 150 includes data collected over a data collection time period. For example, the battery data 150 includes data collected over a time period of about 6 months, about 1 year, about 10 years, or another length of time. The battery data 150 includes data collected during the data collection time period at a frequency sufficient to characterize the behavior of each battery. For example, the battery data 150 includes data collected during the data collection time period at a frequency of about once a month. The batteries are of a type used in EVs, for example, electric passenger vehicles and SUVs. In one approach, the processor 110 receives the battery data 150 from the battery management systems (BMSs) of the EVs corresponding to the batteries. One example of the battery data 150 will be described in further detail below in connection with
In one approach, after receiving the battery data 150, the processor 110 preprocesses the battery data 150 in a preprocessing phase before use as the training data for the estimation model. In some instances, the preprocessing phase helps ensure greater accuracy of the estimation model. The preprocessing phase includes, in some instances, filtering the battery data 150 to obtain the most relevant features of the battery data 150 for use as the training data for the estimation model. The most relevant features include, for example, relevant battery features and relevant vehicle features at each time point. The relevant battery features include, for example, the battery temperature, battery charge and discharge information, the battery current, the battery charging capacity, the battery resistance, etc. The relevant vehicle features include, for example, the vehicle mileage, the vehicle running distance, the vehicle ignition status (vehicle ignition on or vehicle ignition off), etc. In some instances, the preprocessing phase also includes applying empirical equations 160 to the battery data 150 to account for the physics and/or physical characteristics of the battery. As mentioned above, the data store 140 includes empirical equations 160 relating to the physics of the battery. In one approach, the processor 110 applies the empirical equations 160 to the battery data 150 before using the battery data 150 as the training data for the estimation model.
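As one hypothetical illustration of the preprocessing phase, the sketch below keeps the relevant battery and vehicle features and applies a simple, physics-motivated empirical correction. The column names, the reference temperature, and the linear temperature coefficient are assumptions for illustration, not the empirical equations 160 themselves.

```python
import pandas as pd

# Assumed column names; the real BMS schema may differ.
RELEVANT_FEATURES = [
    "battery_temperature", "charge_discharge_flag", "battery_current",
    "charging_capacity", "battery_resistance",
    "vehicle_mileage", "running_distance", "ignition_status",
]

def preprocess_battery_data(raw: pd.DataFrame) -> pd.DataFrame:
    """Keep the most relevant battery/vehicle features and apply an
    illustrative empirical correction for battery physics."""
    data = raw[["battery_id", "time_point"] + RELEVANT_FEATURES].copy()
    # Example empirical equation: normalize charging capacity to a reference
    # temperature of 25 degC with an assumed linear coefficient.
    temp_coeff = 0.005  # assumed fractional change per degC
    data["capacity_25c"] = data["charging_capacity"] * (
        1.0 + temp_coeff * (25.0 - data["battery_temperature"])
    )
    return data
```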
It should be understood that the depiction of each time point in
In one approach, in conjunction with the collection of the battery data 150, the batteries are also electrochemically tested in a lab to determine the true SOH of each battery. In some instances, the true SOH is the SOH at the time point when the lab testing occurs. Accordingly, as mentioned above, the data store 140 also includes lab-measured data 170 regarding the batteries. The lab-measured data 170 may be obtained by a method involving the electrochemical testing of the individual cells in each battery. The SOH of each battery is thus an aggregation of the SOH measurements of the individual cells.
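For illustration, a pack-level SOH could be computed from the cell-level lab measurements with a simple aggregation. The disclosure does not specify the aggregation rule, so the plain average below is an assumption; a capacity-weighted or minimum-cell rule could be substituted.

```python
from statistics import mean

def pack_soh_from_cells(cell_soh_values: list[float]) -> float:
    # Aggregate electrochemically measured cell SOH values into one pack SOH.
    # A simple mean is used here only as an illustrative assumption.
    return mean(cell_soh_values)
```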
It should be understood that the depiction of the lab-measured data 170 time point may represent multiple data points. For example, the lab-measured data 170 time point represents each of the lab-measured, true SOH values of each battery that is electrochemically tested. For example, if there are 10,000 EVs, the lab-measured data 170 time point represents 10,000 true SOH values. More specifically, in this example, t6 represents 10,000 true SOH values.
As mentioned above, the data store 140 includes new battery data 180, which includes battery data of a new battery for which the prediction module 130 will predict a future SOH. In one approach, the processor 110 receives the new battery data 180 from the BMS of the vehicle associated with the new battery; the new battery data 180 includes data regarding the new battery during driving and resting of the vehicle. In some instances, the new battery data 180 includes relevant features similar to those described above in connection with the battery data 150, and, in some instances, the new battery data 180 is preprocessed in a similar manner to the battery data 150. Accordingly, in some arrangements, the new battery data 180 includes the most relevant features, for example, relevant new battery features and relevant new vehicle features. The relevant new battery features include, for example, the battery temperature, battery charge and discharge information, the battery current, the battery charging capacity, the battery resistance, etc. The relevant new vehicle features include, for example, the vehicle mileage, the vehicle running distance, the vehicle ignition status (vehicle ignition on or vehicle ignition off), etc. As described in further detail below with reference to
As mentioned above, the data store 140 also includes historical SOH data 190. The historical SOH data 190 includes historical SOH estimates for each time period of the battery data 150. In one example, and as described in further detail below, the prediction module 130 generates the historical SOH data 190 in a second part of a method for predicting a future SOH of a vehicle battery.
Referring now to
At 210, the prediction module 130 builds the estimation model using the battery data 150 as training data and the lab-measured data 170 as the supervising signal. At 220, the prediction module 130 applies the estimation model to the battery data 150 to obtain the historical SOH data 190, which the prediction module 130 uses to train an ensemble of models. At 230, the prediction module 130 selects the best model from the ensemble for the prediction of a future SOH of the new battery, and the prediction module 130 uses the selected model to predict the future SOH of the new battery. Each of the parts of the method will be described in further detail below with additional reference to
Referring now to the first part 210 of the method 200,
At 310, the SOH prediction module 130 acquires the empirical equations 160. In one approach, the prediction module 130 receives the empirical equations 160 through active acquisition (i.e., controlling a device to acquire the empirical equations 160). For example, the prediction module 130 controls the processor 110 to acquire the empirical equations 160 from the data store 140. In another approach, the prediction module 130 acquires the empirical equations 160 through communication of the empirical equations 160 via a data communication link. For example, the processor 110 initiates communication of the empirical equations 160 from the data store 140 to the prediction module 130.
At 320, the prediction module 130 preprocesses the battery data 150. As described above, the prediction module 130 preprocesses the battery data 150 by filtering the battery data 150 to obtain the relevant battery features and/or relevant vehicle features at each time point and/or applying the empirical equations 160 to the battery data 150. The most relevant features include, for example, the battery temperature, battery charge and discharge information, the battery current, the battery charging capacity, the battery resistance, the vehicle mileage, the vehicle running distance, the vehicle ignition status (vehicle ignition on or vehicle ignition off), etc. In one approach, the prediction module 130 applies the empirical equations 160 to the battery data 150 before using the battery data 150 as the training data for the estimation model.
At 330, the prediction module 130 acquires the lab-measured data 170. In one approach, the prediction module 130 receives the lab-measured data 170 through active acquisition (i.e., controlling a device to acquire the lab-measured data 170). For example, the prediction module 130 controls the processor 110 to acquire the lab-measured data 170 from the data store 140. In another approach, the prediction module 130 acquires the lab-measured data 170 through communication of the lab-measured data 170 via a data communication link. For example, the processor 110 initiates communication of the lab-measured data 170 from the data store 140 to the prediction module 130.
At 340, as described in further detail below, the prediction module 130 trains the estimation model to estimate a SOH for each battery based on the battery data 150. In one approach, training the estimation model includes providing input data, which includes, for example, the preprocessed battery data 150, and executing the estimation model to train the estimation model to estimate a SOH for each battery. Training the estimation model also includes, in one approach, receiving output data, which includes, for example, estimated SOH values for each battery, and comparing the output data to ground truth data to generate a loss value. In one approach, the ground truth data is the lab-measured data 170, and the prediction module uses the loss value to backpropagate the loss through the estimation model to update each iteration of the estimation model.
As mentioned above, in one approach, the prediction module 130 uses the preprocessed battery data 150 as the input data to the estimation model.
To train the estimation model, in one approach, the prediction module 130 uses, as the training data, the battery data 150 corresponding to the time point at which the lab-measured data 170 was collected. For example, as illustrated in
In the arrangements described herein, the estimation model is a Least Absolute Shrinkage and Selection Operator (LASSO) regression. However, it should be understood that the estimation model can be another type of model, for example, an Artificial Neural Network (ANN), a reinforcement learning model, a deep learning model, a support vector machine, a logistic regression, a linear regression, a random forest, etc. Moreover, while the estimation model is described herein as a supervised machine-learning model, it should be understood that in one or more other approaches, the estimation model can be an unsupervised machine-learning model or a reinforced machine-learning model.
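The following is a minimal sketch, not the disclosed implementation, of how such a LASSO-based estimation model could be trained with scikit-learn: the battery features taken at the lab-measurement time point serve as the training data, the lab-measured SOH values serve as the supervising signal, and the fitted model is then applied to the full battery history. The feature layout and the regularization strength are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

def train_estimation_model(
    features_at_lab_time: np.ndarray,   # shape: (num_batteries, num_features)
    lab_measured_soh: np.ndarray,       # shape: (num_batteries,)
) -> Lasso:
    # The lab-measured SOH acts as the supervising signal: the regression is
    # fit so that its loss against these ground-truth values is minimized.
    model = Lasso(alpha=0.01)  # alpha is an assumed regularization strength
    model.fit(features_at_lab_time, lab_measured_soh)
    return model

def estimate_historical_soh(model: Lasso, features_all_times: np.ndarray) -> np.ndarray:
    # Applying the trained estimation model to the full battery history yields
    # the historical SOH data used later to train the ensemble of models.
    return model.predict(features_all_times)
```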
Referring now to the second part 220 of the method 200,
At 410, the SOH prediction module 130 uses the historical SOH data 190 to construct a SOH history curve for the batteries. The SOH history curve is a curve of the SOH of each battery as a function of time (for example, the number of days since the production of the EV associated with the battery). At 420, the SOH prediction module 130 acquires a mileage curve. In one embodiment, the processor 110 constructs a mileage curve using the battery data 150. In one example, the mileage curve is a curve of the mileage of each battery as a function of time (for example, the number of days since the production of the EV associated with the battery).
At 430, the SOH prediction module 130 filters the historical SOH data 190 using the SOH history curve and/or the mileage curve to identify batteries having a smooth SOH history curve and/or a smooth mileage curve, that is, consistently behaved EVs. As used herein, a "smooth" curve means that there are sufficient recorded data points such that the curve looks smooth and continuous and that there are no significantly outlying data points. The data filtering phase of 430 is important because, in some arrangements, it helps improve the accuracy of the ensemble of models when trained using the historical SOH data 190, as will be described in further detail below. In one approach, filtering the historical SOH data 190 includes removing significantly outlying data points on the SOH history curve. Additionally, in one approach, filtering the historical SOH data 190 includes filtering out data from batteries having a mileage curve that is not sufficiently smooth. For example, filtering the historical SOH data 190 may involve filtering out data from batteries for which the mileage curve indicates a significant change in driving habits or driving location of the vehicle associated with the battery, such as an EV that was driven around 20 miles a day for a few years and then about 50 miles a day for the following years. As mentioned above, the data filtering phase ensures the quality of the training data used to train the ensemble of models.
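A sketch of one way the filtering at 430 could be implemented is shown below: significantly outlying SOH points are dropped, and batteries whose mileage curve shows a large shift in average daily use are excluded entirely. The thresholds and the half-versus-half usage comparison are assumptions, not values from the disclosure.

```python
from typing import Optional

import numpy as np

def filter_battery_history(
    soh_curve: np.ndarray,       # SOH vs. days since production, shape (T,)
    mileage_curve: np.ndarray,   # cumulative mileage vs. days, shape (T,)
    soh_z_threshold: float = 3.0,     # assumed outlier cutoff
    usage_shift_ratio: float = 1.5,   # assumed "significant change" cutoff
) -> Optional[np.ndarray]:
    """Return a cleaned SOH curve, or None if the battery should be excluded."""
    # Remove significantly outlying SOH points via a simple z-score rule.
    z = np.abs(soh_curve - soh_curve.mean()) / (soh_curve.std() + 1e-9)
    cleaned = np.where(z < soh_z_threshold, soh_curve, np.nan)

    # Cross-check with the mileage curve: compare average daily mileage in the
    # first and second halves of the record; a large jump suggests a change in
    # driving habits, and the battery is filtered out.
    half = len(mileage_curve) // 2
    early_rate = np.diff(mileage_curve[:half]).mean()
    late_rate = np.diff(mileage_curve[half:]).mean()
    if max(early_rate, late_rate) / (min(early_rate, late_rate) + 1e-9) > usage_shift_ratio:
        return None
    return cleaned
```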
At 440, the prediction module 130 characterizes the filtered historical SOH data 190 as early time sequency data or late time sequency data. In one approach, as described in further detail below, the prediction module 130 uses the battery data 150 corresponding to the early time sequency data as training data for each model in the ensemble of models and the late time sequency data as supervising signals to train each model in the ensemble of models. In one approach, early time sequency data includes data that lies earlier on the SOH history curve or that is at least earlier than the late time sequency data. Similarly, in one approach, late time sequency data includes data that lies later on the SOH history curve or that is at least later than the early time sequency data. For example, referring now to
At 450, as mentioned above, the prediction module 130 uses the battery data 150 corresponding to the early time sequency data as training data for each model in the ensemble and the late time sequency SOH data as supervising signals to train each model in the ensemble. In one approach, training each model in the ensemble includes providing input data, which includes, for example, the battery data 150 corresponding to the early time sequency data, and executing each model to train each model to estimate a SOH for each battery. Training each model also includes, in one approach, receiving output data, which includes, for example, estimated SOH values for each battery, and comparing the output data to ground truth data to generate a loss value. In one approach, the ground truth data is the output 350 of the estimation model, and the prediction module 130 uses the loss value to backpropagate the loss through each model to update each iteration of each model.
In one approach, the prediction module 130 trains each model in the ensemble differently using different characterizations of the time sequency data. For example, if the SOH at t6 is to be predicted, various combinations of the battery data 150 of t1, t2, t3, t4, and t5 and the historical SOH data 190 can be used to train each model. For example, the prediction module 130 can train a first model using battery data 150 from t1 and t2 as training data and historical SOH data 190 from t6 as the supervising signal. In another example, the prediction module 130 can train a second model using battery data 150 from t2 and t3 as training data and historical SOH data 190 from t6 as the supervising signal. In yet another example, the prediction module 130 can train a third model using battery data 150 from t1 and t4 as training data and historical SOH data 190 from t6 as the supervising signal. It should be understood that the aforementioned characterizations of early time sequency data and late time sequency data are merely a few examples and that other characterizations are possible as well. In some instances, training each model in the ensemble of models differently is beneficial because the best model for a future SOH prediction of a new battery may depend on various factors, including the time length of the new battery data 180 and the time to the future SOH prediction. For example, as described in further detail below, use of the same model to predict a future SOH for a battery having a two-year driving history and a battery having a four-year driving history may result in different future SOH predictions having different prediction accuracies.
In one embodiment, each model in the ensemble of models is the same type of model. For example, in the embodiments described herein, each model is a Long Short-Term Memory (LSTM) algorithm. However, it should be understood that each model can be another type of model, for example, a LASSO regression, an Artificial Neural Network (ANN), a reinforcement learning model, a deep learning model, a support vector machine, a logistic regression, a linear regression, a random forest, etc. While each model is described herein as being the same type of model, it should be understood that in other instances, two or more of the models may be different types of models. Moreover, while each model in the ensemble of models is described herein as a supervised machine-learning model, it should be understood that in one or more other approaches, one or more of the models can be an unsupervised machine-learning model or a reinforced machine-learning model.
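As a compact illustration of the ensemble training described above, the PyTorch sketch below trains one LSTM per early-time window, with the historical SOH value at a later time point (for example, t6) as the supervising signal. The architecture, window choices, and hyperparameters are assumptions for illustration, not values from the disclosure.

```python
import torch
import torch.nn as nn

class SohLstm(nn.Module):
    """A small LSTM regressor mapping an early-time window of battery data to a SOH value."""
    def __init__(self, num_features: int, hidden_size: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(num_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features) -> predicted SOH at the late time point.
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1]).squeeze(-1)

def train_ensemble(battery_data: torch.Tensor, historical_soh: torch.Tensor,
                   windows, target_t: int, epochs: int = 50):
    """battery_data: (batteries, time_points, features); historical_soh: (batteries, time_points).
    Each (start, end) window plays the role of the early time sequency data, and the SOH
    at index target_t (e.g., 5 for t6) is the supervising signal (late time sequency data)."""
    ensemble = {}
    for start, end in windows:  # e.g., [(0, 2), (1, 3), (0, 4)] for windows within t1..t5
        model = SohLstm(battery_data.shape[-1])
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            optimizer.zero_grad()
            pred = model(battery_data[:, start:end, :])
            loss = loss_fn(pred, historical_soh[:, target_t])
            loss.backward()  # backpropagate the loss to update this model
            optimizer.step()
        ensemble[(start, end)] = model
    return ensemble
```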
Turning now to
In some instances, the prediction accuracy for each model will vary based on the time to the SOH prediction and the time length of the battery data (prediction criteria). Accordingly, it is advantageous to determine which model in the ensemble will provide the most accurate SOH prediction for a new battery. Thus, at 510, using the prediction accuracy for each model, the SOH prediction module 130 creates an assignment of each model to the prediction criteria.
In the example shown in
It should be understood that the actions taken at 500 and 510, generating the prediction error and creating an assignment of models, are, in one approach, prior functions to the actions taken at 520, 530, 540, and other subsequent steps of the method 230. More specifically, in one approach, the actions taken at 500 and 510 pre-configure the prediction module 130 so that the prediction module 130 can iteratively repeat the actions taken at 520, 530, 540, and other subsequent steps of the method 230.
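One hypothetical way to pre-compute the assignment at 500 and 510 is sketched below: each model in the ensemble is evaluated on held-out battery histories for every combination of prediction criteria (time length of the battery data and time to the future SOH), and the most accurate model is recorded for each combination. The mean-absolute-error metric, the dictionary layout, and the treatment of each model as a generic callable are assumptions.

```python
import numpy as np

def build_assignment(ensemble, eval_sets):
    """ensemble: maps a model name to a callable returning a scalar SOH prediction.
    eval_sets: maps (history_length, horizon) -> (inputs, true_future_soh), where
    inputs is a list of per-battery histories and true_future_soh is a numpy array.
    Returns a lookup from prediction criteria to the name of the most accurate model."""
    assignment = {}
    for criteria, (inputs, true_soh) in eval_sets.items():
        errors = {}
        for name, model in ensemble.items():
            predictions = np.array([model(x) for x in inputs])
            # Prediction error: mean absolute error against the held-out SOH values.
            errors[name] = float(np.mean(np.abs(predictions - true_soh)))
        assignment[criteria] = min(errors, key=errors.get)
    return assignment
```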
At 520, the prediction module 130 acquires the new battery data 180, as described above. In one approach, the prediction module 130 receives the new battery data 180 through active acquisition (i.e., controlling a device to acquire the new battery data 180). For example, the prediction module 130 controls the processor 110 to acquire the new battery data 180 from the data store 140. In another approach, the prediction module 130 acquires the new battery data 180 through communication of the new battery data 180 via a data communication link. For example, the processor 110 initiates communication of the new battery data 180 from the data store 140 to the prediction module 130.
At 530, based on the above-described assignment, the prediction module 130 identifies a selected model for a future SOH prediction of the new battery. For example, if the new battery data 180 has a time length of 3 years and the processor 110 has instructed the prediction module 130 to predict the future SOH of the new battery in 2 years, the prediction module 130 may identify model A as the selected model. Thus, the prediction module 130 considers attributes of the battery data when selecting a particular model to perform the prediction.
At 540, the prediction module 130 uses the new battery data 180 as the input to the selected model to generate a future SOH prediction for the new battery. For example, if the selected model is model A, the prediction module 130 inputs the new battery data 180 into model A to generate the future SOH prediction. Thus, the prediction module 130 is able to apply a model that provides the most accurate result, thereby improving predictions of SOH. In some arrangements, the new battery data 180 includes data for the time points mentioned above, for example, data at time points t1, t2, t3, t4, t5, and t6. In some instances, the new battery data 180 that is input to the model to generate a future SOH prediction corresponds to data at two or more of the time points. For example, the prediction module 130 can use the new battery data 180 at t1 and t3 to predict the future SOH. In another example, the prediction module 130 can use the new battery data 180 at t2 and t4 to predict the future SOH. Other combinations of the new battery data 180 are also possible to predict the future SOH.
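Continuing the hypothetical sketches above, the lookup and prediction at 530 and 540 could then reduce to a few lines; the variable names reuse the assumed structures from the earlier sketches and are not part of the disclosure.

```python
# Hypothetical usage, continuing the earlier sketches (`ensemble`, `assignment`,
# and `new_battery_data` are assumed to have been built as shown above).
criteria = (3, 2)                      # 3 years of history, prediction 2 years out
selected_name = assignment[criteria]   # e.g., the name of "model A" per the assignment
selected_model = ensemble[selected_name]
future_soh = selected_model(new_battery_data)   # new battery data as the model input
print(f"Predicted SOH in {criteria[1]} years: {future_soh:.3f}")
```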
The arrangements described herein have the benefit of providing systems and methods for accurately predicting a future SOH of a vehicle battery with a limited need for expensive, time-consuming lab measurements. The arrangements described herein also have the benefit of treating the vehicle battery individually and independently of other vehicle batteries to avoid error caused by averaging the behavior of a group of batteries for the purpose of a future SOH prediction. Finally, the arrangements described herein have the benefit of considering the environmental effects on the battery in the future SOH prediction.
Detailed embodiments are disclosed herein. However, it is to be understood that the disclosed embodiments are intended only as examples. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the aspects herein in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description of possible implementations. Various embodiments are shown in
The systems, components, and/or processes described above can be realized in hardware or a combination of hardware and software and can be realized in a centralized fashion in one processing system or in a distributed fashion where different elements are spread across several interconnected processing systems. Any kind of processing system or another apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software can be a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein. The systems, components, and/or processes also can be embedded in a computer-readable storage, such as a computer program product or other data programs storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform methods and processes described herein. The aforementioned elements can also be embedded in an application product that comprises all the features enabling the implementation of the methods described herein and which, when loaded in a processing system, is able to carry out the methods.
Furthermore, arrangements described herein may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied, e.g., stored, thereon. Any combination of one or more computer-readable media may be utilized. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. The phrase “computer-readable storage medium” means a non-transitory storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or another combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: a portable computer diskette, a hard disk drive (HDD), a solid-state drive (SSD), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), an optical storage device, a magnetic storage device, or another combination of the foregoing.
In the context of this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. Generally, "module," as used herein, includes routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular data types. In further aspects, a memory generally stores the noted modules. The memory associated with a module may be a buffer or cache embedded within a processor, a RAM, a ROM, a flash memory, or another electronic storage medium. In still further aspects, a module as envisioned by the present disclosure is implemented as an application-specific integrated circuit (ASIC), a hardware component of a system on a chip (SoC), as a programmable logic array (PLA), or as another hardware component that is embedded with a defined configuration set (e.g., instructions) for performing the disclosed functions. The terms "operatively connected" and "communicatively coupled," as used throughout this description, can include direct or indirect connections, including connections without direct physical contact.
Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or another combination of the foregoing. Computer program code for carrying out operations for aspects of the present arrangements may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java™, Smalltalk, C++, or the like and conventional programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a standalone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The terms "a" and "an," as used herein, are defined as one or more than one. The term "plurality," as used herein, is defined as two or more than two. The term "another," as used herein, is defined as at least a second or more. The terms "including" and/or "having," as used herein, are defined as comprising (i.e., open language). The phrase "at least one of . . . and . . . ," as used herein, refers to and encompasses any and all possible combinations of one or more of the associated listed items. As an example, the phrase "at least one of A, B, and C" includes A only, B only, C only, or any combination thereof (e.g., AB, AC, BC, or ABC).
Aspects herein can be embodied in other forms without departing from the spirit or essential attributes thereof. Accordingly, reference should be made to the following claims, rather than to the foregoing specification, as indicating the scope thereof.