This application claims the benefit under 35 USC § 119(a) of Korean Patent Application No. 10-2022-0114863, filed on Sep. 13, 2022, and Korean Patent Application No. 10-2022-0180837, filed on Dec. 21, 2022, in the Korean Intellectual Property Office, the entire disclosures of which are incorporated herein by reference for all purposes.
The following disclosure relates to a method and device with battery mode optimization.
Typical battery state estimation may be performed based on an equivalent circuit model (ECM). An ECM is an empirical model that simulates an electrical behavior of a battery using passive elements (e.g., a resistor, an inductor, a capacitor, etc.), but the use of ECMs is limited. To complement the ECM, an electrochemical model is designed to describe the electrochemical behavior of the battery based on the basic principles (e.g., the law of conservation of mass, the law of conservation of charge, etc.).
An electrochemical model may attempt to simulate the behavior of the battery and thus be useful for diagnosing and predicting the state of the battery in combination with a thermal model and a degradation model. However, since the electrochemical model includes many parameters and different combinations thereof depending on the type of battery, use of the electrochemical model is often problematic due to the prerequisite that each of the parameters be accurately identified.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In one general aspect, a processor-implemented method may include performing first parameter optimization of a battery model through a first predetermined optimization technique; switching, based on a count accumulated while performing the first parameter optimization indicating that a switching criterion has been met, from the first optimization technique to a second predetermined optimization technique; performing second parameter optimization of the battery model through the second predetermined optimization technique; and determining a final parameter combination as an optimized parameter of the battery model, in response to an occurrence of an optimization end event during the performance of the second parameter optimization.
The performing of the first parameter optimization may include generating, in a current iteration of the first parameter optimization, objective function values for parameter combinations to which the first optimization technique is applied; selecting one of the generated objective function values; comparing the selected one objective function value with a previously determined best objective function value, determined in a previous iteration of the first parameter optimization; accumulating the count when the selected one objective function value is determined to be greater than or equal to the previously determined best objective function value; and determining the selected one objective function value as a new best objective function value when the selected one objective function value is determined to be less than the previously determined best objective function value.
The generating of the objective function values for the parameter combinations may include calculating voltages using a simulator configured to simulate the battery model, the parameter combinations, and reference current data; and calculating the objective function values for the parameter combinations using the calculated voltages and reference voltage data.
The switching criterion may be satisfied when the accumulated count reaches a threshold value.
The switching may include determining one of a plurality of baseline functions using parameter combinations determined when the switching criterion is satisfied, objective function values for the determined parameter combinations, and a neural network; and selecting the second optimization technique based on the determined baseline function.
The determining of one of the plurality of baseline functions may include determining the baseline function in consideration of a distribution of the determined parameter combinations and the objective function values through the neural network.
The selecting of the second optimization technique may include determining whether the determined baseline function corresponds to a baseline function for evaluating a performance of the second optimization technique among a plurality of optimization techniques; and selecting the second optimization technique when the determined baseline function corresponds to the baseline function.
The method may further include initializing the accumulated count.
The performing of the second parameter optimization may include updating parameter combinations determined when the switching criterion is satisfied through the second optimization technique.
The performing of the second parameter optimization may include, in response to a predetermined condition being satisfied, extracting parameter combinations from an area other than a distribution area of parameter combinations determined when the switching criterion is satisfied.
In another general aspect, an electronic device may include one or more processors configured to execute instructions; and a memory configured to store the instructions, wherein the execution of the instructions by the one or more processors configures the one or more processors to perform first parameter optimization of a battery model through a first predetermined optimization technique; switch, based on a count accumulated while performing the first parameter optimization indicating that a switching criterion has been met, from the first optimization technique to a second predetermined optimization technique; perform second parameter optimization of the battery model through the second optimization technique; and determine a final parameter combination as an optimized parameter of the battery model, in response to an occurrence of an optimization end event during the performance of the second parameter optimization.
The one or more processors may be further configured to generate, in a current iteration of the first parameter optimization, objective function values for parameter combinations to which the first optimization technique is applied; select one of the generated objective function values; compare the selected one objective function value with a previously determined best objective function value, determined in a previous iteration; accumulate the count when the selected one objective function value is determined to be greater than or equal to the previously determined best objective function value; and determine the selected one objective function value as a new best objective function value when the selected one objective function value is determined to be less than the previously determined best objective function value.
The one or more processors may be further configured to calculate voltages using a simulator configured to simulate the battery model, the parameter combinations, and reference current data; and calculate the objective function values for the parameter combinations using the calculated voltages and reference voltage data.
The one or more processors may be further configured to determine that the switching criterion is satisfied, when the accumulated count reaches a threshold value.
The one or more processors may be further configured to determine one of a plurality of baseline functions using parameter combinations determined when the switching criterion is satisfied, objective function values for the parameter combinations, and a neural network; and select the second optimization technique based on the determined baseline function.
The one or more processors may be further configured to determine the baseline function in consideration of a distribution of the parameter combinations and the objective function values through the neural network.
The one or more processors may be further configured to determine whether the determined baseline function corresponds to a baseline function for evaluating a performance of the second optimization technique among a plurality of optimization techniques; and select the second optimization technique when the determined baseline function corresponds to the baseline function.
The one or more processors may be further configured to initialize the accumulated count.
The one or more processors may be further configured to update parameter combinations determined when the switching criterion is satisfied through the second optimization technique.
The one or more processors may be further configured to, in response to a predetermined condition being satisfied, extract parameter combinations from an area other than a distribution area of parameter combinations determined when the switching criterion is satisfied.
Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
Throughout the drawings and the detailed description, unless otherwise described or provided, the same drawing reference numerals may be understood to refer to the same or like elements, features, and structures. The drawings may not be to scale, and the relative size, proportions, and depiction of elements in the drawings may be exaggerated for clarity, illustration, and convenience.
The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. However, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will be apparent after an understanding of the disclosure of this application. For example, the sequences of operations described herein are merely examples, and are not limited to those set forth herein, but may be changed as will be apparent after an understanding of the disclosure of this application, with the exception of operations necessarily occurring in a certain order. Also, descriptions of features that are known after an understanding of the disclosure of this application may be omitted for increased clarity and conciseness.
The features described herein may be embodied in different forms and are not to be construed as being limited to the examples described herein. Rather, the examples described herein have been provided merely to illustrate some of the many possible ways of implementing the methods, apparatuses, and/or systems described herein that will be apparent after an understanding of the disclosure of this application. The use of the term “may” herein with respect to an example or embodiment, e.g., as to what an example or embodiment may include or implement, means that at least one example or embodiment exists where such a feature is included or implemented, while all examples are not limited thereto.
The terminology used herein is for describing various examples only and is not to be used to limit the disclosure. The articles “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. As non-limiting examples, terms “comprise” or “comprises,” “include” or “includes,” and “have” or “has” specify the presence of stated features, numbers, operations, members, elements, and/or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, operations, members, elements, and/or combinations thereof, or the alternate presence of alternative stated features, numbers, operations, members, elements, and/or combinations thereof. Additionally, while one embodiment may set forth such terms as “comprise” or “comprises,” “include” or “includes,” and “have” or “has” to specify the presence of stated features, numbers, operations, members, elements, and/or combinations thereof, other embodiments may exist where one or more of the stated features, numbers, operations, members, elements, and/or combinations thereof are not present.
As used herein, the term “and/or” includes any one and any combination of any two or more of the associated listed items. The phrases “at least one of A, B, and C”, “at least one of A, B, or C”, and the like are intended to have disjunctive meanings, and these phrases “at least one of A, B, and C”, “at least one of A, B, or C”, and the like also include examples where there may be one or more of each of A, B, and/or C (e.g., any combination of one or more of each of A, B, and C), unless the corresponding description and embodiment necessitates such listings (e.g., “at least one of A, B, and C”) to be interpreted to have a conjunctive meaning.
Throughout the specification, when a component or element is described as being “connected to,” “coupled to,” or “joined to” another component or element, it may be directly “connected to,” “coupled to,” or “joined to” the other component or element, or there may reasonably be one or more other components or elements intervening therebetween. When a component or element is described as being “directly connected to,” “directly coupled to,” or “directly joined to” another component or element, there can be no other elements intervening therebetween. Likewise, expressions, for example, “between” and “immediately between” and “adjacent to” and “immediately adjacent to” may also be construed as described in the foregoing. It is to be understood that if a component (e.g., a first component) is referred to, with or without the term “operatively” or “communicatively,” as “coupled with,” “coupled to,” “connected with,” or “connected to” another component (e.g., a second component), it means that the component may be coupled with the other component directly (e.g., by wire), wirelessly, or via a third component.
Although terms such as “first,” “second,” and “third”, or A, B, (a), (b), and the like may be used herein to describe various members, components, regions, layers, or sections, these members, components, regions, layers, or sections are not to be limited by these terms. Each of these terminologies is not used to define an essence, order, or sequence of corresponding members, components, regions, layers, or sections, for example, but used merely to distinguish the corresponding members, components, regions, layers, or sections from other members, components, regions, layers, or sections. Thus, a first member, component, region, layer, or section referred to in the examples described herein may also be referred to as a second member, component, region, layer, or section without departing from the teachings of the examples.
Unless otherwise defined, all terms, including technical and scientific terms, used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains and based on an understanding of the disclosure of the present application. Terms, such as those defined in commonly used dictionaries, are to be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the disclosure of the present application and are not to be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Typical parameter identification requires measuring parameters through operations such as electrochemical impedance spectroscopy (EIS), X-ray diffraction (XRD), scanning electron microscopy (SEM), and the like. As these typical operations not only have certain limitations in measuring parameters, but are also time-consuming and costly, it is found herein to be beneficial to use at least an alternate method and device to identify parameters using various optimization techniques.
Referring to
The battery model 110 may include an electrochemical model. As non-limiting examples, the electrochemical model may simulate an internal state of a battery using, for example, one or more parameters and one or more mathematical equations (e.g., governing equations). The electrochemical model may include, for example, a pseudo-2-dimensional (P2D) model, a reduced order model (ROM), a single particle model (SPM), and the like, but is not limited thereto. Hereinafter, a battery simulated by the battery model 110 will be referred to as a “target battery”.
The electronic device 100 may optimize the battery model 110 by optimizing one or more parameters of the battery model 110. Table 1 below shows an example of one or more parameters of the battery model 110.
In Table 1 above, p and n denote a cathode and an anode of the target battery, respectively, s and e denote a separator and an electrolyte of the target battery, respectively, init denotes an initial value, SEI denotes a solid electrolyte interface, and ISC denotes an internal short circuit.
The one or more parameters of the battery model 110 are not limited to those shown in Table 1 above.
The electronic device 100 may be configured to perform parameter optimization using one of a plurality of optimization techniques (or metaheuristic techniques) (e.g., swarm intelligence-based algorithm (SIA) techniques).
As non-limiting examples, the optimization techniques may include particle swarm optimization (PSO) algorithm, bald eagle search (BES) algorithm, gray wolf optimization (GWO) algorithm, honey badger algorithm (HBA), salp swarm algorithm (SSA), etc. However, the optimization techniques are not limited to the algorithms mentioned above. Each algorithm may determine an optimal solution to a given problem in a distinct and independent manner.
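As a non-limiting illustration of how one such swarm intelligence-based technique searches a parameter space, the following is a minimal PSO sketch in Python. The inertia and acceleration coefficients (w, c1, c2), the particle count, and the test objective are illustrative assumptions and are not values set forth in this disclosure.

```python
import random

def pso(objective, bounds, n_particles=20, n_iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO sketch: each particle tracks its personal best position,
    and the swarm tracks a global best; smaller objective values are better."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity update: inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clamp to the search bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

For example, minimizing a two-dimensional sphere function with `pso(lambda p: sum(x * x for x in p), [(-5.0, 5.0), (-5.0, 5.0)])` drives the returned best value toward zero. The other listed algorithms (BES, GWO, HBA, SSA) follow the same evaluate-and-update loop but with distinct, independent update rules.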
The electronic device 100 may be configured with baseline functions for evaluating the performances of the optimization techniques. Tables 2 and 3 below show examples of baseline functions of the electronic device 100. In Fi (where i=1 to 23) in Tables 2 and 3 below, i denotes an index of a baseline function.
As will be described later, the electronic device 100 may be configured to determine or identify, using a machine learning model (e.g., a neural network), as a non-limiting example, a baseline function having a distribution similar to the distribution of given parameter combinations and the objective function value of each parameter combination.
The electronic device 100 may be configured to classify the plurality of optimization techniques and identify/determine/select an optimization technique having relatively better optimization performance (compared to the other optimization techniques) for a corresponding baseline function Fi. Table 4 below shows an example of classification results. In SIAj in Table 4 below (where j=1 to 5), j denotes an index of an optimization technique.
The cost functions in Table 4 above represent the baseline functions described above.
The BES algorithm may have relatively better optimization performance in baseline functions F1, F2, F3, F4, F9, F10, and F11 among all the baseline functions F1-F23. In other words, the electronic device 100 may evaluate the performance of the BES algorithm through each of the baseline functions F1 through F23, and the performance of the BES algorithm may be relatively highly evaluated in the baseline functions F1, F2, F3, F4, F9, F10, and F11 compared to the other baseline functions F5-F8 and F12-F23. In this example, the electronic device 100 may map the baseline functions F1, F2, F3, F4, F9, F10, and F11 to the BES algorithm as shown in Table 4 above.
The GWO algorithm may have relatively better optimization performance in baseline functions F5, F13, F15, and F20 compared to the other baseline functions F1-F4, F6-F12, F14, F16-F19 and F21-F23, and thus, the electronic device 100 may map the baseline functions F5, F13, F15, and F20 to the GWO algorithm, as shown in Table 4 above.
The HBA algorithm may have relatively better optimization performance in baseline functions F7 and F12 compared to the other baseline functions F1-F6, F8-F11 and F13-F23, and thus, the electronic device 100 may map the baseline functions F7 and F12 to the HBA algorithm. The PSO algorithm may have relatively better optimization performance in baseline functions F8, F12, and F19 compared to the other baseline functions F1-F7, F9-F11, F13-F18 and F20-F23, and thus, the electronic device 100 may map the baseline functions F8, F12, and F19 to the PSO algorithm. The SSA algorithm may have relatively better optimization performance in baseline functions F6, F16, F17, F18, F21, F22, and F23 compared to the other baseline functions F1-F5, F7-F15, F19 and F20, and thus, the electronic device 100 may map the baseline functions F6, F16, F17, F18, F21, F22, and F23 to the SSA algorithm.
In an embodiment, the electronic device 100 may be configured to evaluate parameter combinations through an objective function. In other words, the electronic device 100 may be configured to calculate objective function values for the parameter combinations. In one example, each parameter combination may include one or more parameters (e.g., the parameters in Table 1 above). Equation 1 below shows an example objective function.
In Equation 1 above, θ denotes a parameter combination, and ƒ(θ) may be referred to as an objective function value of the parameter combination. θ may include one or more of the parameters in Table 1 above. Vsim denotes a simulated voltage. The electronic device 100 may be configured to store a simulator corresponding to the battery model 110. The simulator may calculate and/or output Vsim by simulating the operation of the battery model 110 based on θ and Iref. In Equation 1 above, Vref and Iref denote reference voltage data and reference current data, respectively.
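As a non-limiting sketch of such an objective function, the RMSE between a simulated voltage trace and reference voltage data may be computed as follows; the `simulate` callable standing in for the stored battery-model simulator is an assumption for illustration only.

```python
import math

def objective(theta, simulate, v_ref, i_ref):
    """f(theta): RMSE between the simulator's voltage trace for (theta, i_ref)
    and the reference voltage data v_ref; a smaller value is a better fit."""
    v_sim = simulate(theta, i_ref)  # hypothetical simulator of the battery model
    assert len(v_sim) == len(v_ref)
    return math.sqrt(sum((s - r) ** 2 for s, r in zip(v_sim, v_ref)) / len(v_ref))
```

As a toy usage example, with a purely Ohmic stand-in simulator `sim = lambda theta, i_ref: [theta[0] * i for i in i_ref]`, a parameter combination that reproduces the reference data yields an objective value of zero, and worse-fitting combinations yield larger values.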
The electronic device 100 may be configured to generate the distribution of the parameter combinations and the objective function value of each parameter combination using a neural network, which will be described later, and thus identify the most effective optimization technique for the generated distribution and switch from the current optimization technique to the identified optimization technique. The electronic device 100 may be configured to strategically perform the switching of the optimization technique through quantitative analysis of a current situation (or current state) (e.g., the distribution of the parameter combinations and the objective function value of each parameter combination).
Hereinafter, a battery model optimization method will be described with reference to
Referring to
P may denote a set of parameter combinations (or a matrix including parameter combinations). Pinit denotes initial parameter combinations, and may be parameter combinations initially randomly extracted by the electronic device 100 in a multidimensional space. ƒbest may be the best objective function value obtained when a battery model optimization method (or parameter optimization) is performed up to a given iteration. In operation 210, the electronic device 100 may set ƒbest to ∞. θbest may be the best parameter combination determined when a battery model optimization method (or parameter optimization) is performed to a given iteration, and may be a parameter combination having ƒbest. i may denote an index of a baseline function, and j may denote an index of an optimization technique.
In operation 213, the electronic device 100 may perform an evaluation. In one example, the electronic device 100 may be configured to calculate objective function values for the parameter combinations in P.
P may be in the form of a matrix P 300, as shown in
The electronic device 100 may be configured to calculate voltages (or simulated voltages) using a simulator for simulating the operation of the battery model 110, the parameter combinations (e.g., θ1, θ2, . . . , θN), and the reference current data Iref.
In the example shown in
The electronic device 100 may be configured to calculate the objective function values of the parameter combinations θ1, θ2, . . . , θN using the calculated voltages Vsim,1, . . . , Vsim,N, the reference voltage data Vref, and an objective function (e.g., the objective function of Equation 1 above). The objective function may be a function that calculates a root-mean-square error (RMSE) between the voltage calculated by the simulator 410 and the reference voltage data. The electronic device 100 may calculate the objective function values of the parameter combinations (e.g., θ1, θ2, . . . , θN), by applying the calculated voltages Vsim,1, . . . Vsim,N and the reference voltage data Vref to the objective function. For example, the electronic device 100 may calculate
as the objective function value ƒ(θ1) of the parameter combination θ1, and calculate
as the objective function value ƒ(θN) of the parameter combination θN.
Thus, as described above, a parameter combination may be evaluated as better when its calculated objective function value is smaller.
Returning to
In operation 217, the electronic device 100 may be configured to determine if ƒ(θbest,p)<ƒbest. If itr=1, ƒbest is ∞ as described in operation 210, and thus, the electronic device 100 may determine ƒ(θbest,p) is less than ƒbest. If ƒ(θbest,p)=ƒ(θN) at itr=4, the electronic device 100 may determine whether ƒ(θN) is less than ƒbest. In one example, ƒbest may correspond to the best objective function value when the method is performed from itr=1 through itr=3.
If ƒ(θbest,p) is greater than or equal to ƒbest, the electronic device 100 may be configured to update (or accumulate) the count c (e.g., c=c+1) in operation 219. When the evaluation of θbest,p is not better than the evaluation of θbest, the electronic device 100 may update (or accumulate) the count c so as to accumulate a penalty for the optimization technique that is currently performed.
If ƒ(θbest,p) is less than ƒbest, the electronic device 100 may be configured to determine θbest,p as θbest and ƒ(θbest,p) as ƒbest in operation 221.
In operation 223, the electronic device 100 may be configured to determine whether the count c (or the accumulated count c) corresponds to a threshold value (e.g., α). The electronic device 100 may determine whether a switching criterion is satisfied.
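The bookkeeping of operations 217 through 223 described above can be sketched as follows; the threshold `alpha` and the class structure are illustrative assumptions, not a definitive implementation of the disclosed device.

```python
import math

class SwitchCounter:
    """Tracks the best objective value seen so far, accumulates a penalty
    count whenever an iteration fails to improve on it (operations 217-221),
    and reports whether the switching criterion is met (operation 223)."""
    def __init__(self, alpha):
        self.alpha = alpha       # threshold for the switching criterion
        self.count = alpha       # set to the threshold at itr=1, per the initial values
        self.f_best = math.inf   # best objective value so far (initially infinity)
        self.theta_best = None   # best parameter combination so far

    def observe(self, theta_best_p, f_theta_best_p):
        """Compare the current iteration's best candidate with the stored best;
        return True when the accumulated count reaches the threshold."""
        if f_theta_best_p < self.f_best:   # operation 221: new best found
            self.f_best = f_theta_best_p
            self.theta_best = theta_best_p
        else:                              # operation 219: accumulate the penalty
            self.count += 1
        return self.count >= self.alpha    # operation 223: switching criterion

    def reset(self):                       # operation 229: initialize the count
        self.count = 0
```

In this sketch, initializing `count` to `alpha` reproduces the behavior described below for itr=1, where the count equals the threshold from the outset and a baseline function is determined immediately.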
When the count c (or the accumulated count c) does not correspond to a threshold value (e.g., when the count c is less than the threshold value (e.g., α)), the electronic device 100 may be configured to perform sample update or resampling in operation 231. In one example, the sample update may be updating P, and the resampling may be extracting (or sampling) again new parameter combinations by the electronic device 100 from an area other than an area in which the parameter combinations in P are distributed.
When the count c (or the accumulated count c) corresponds to the threshold value, the electronic device 100 may be configured to determine a baseline function and an optimization technique in operation 225. In an example given iteration, the electronic device 100 may determine one of the baseline functions in consideration of the distribution of the parameter combinations and the objective function values in the given iteration through a neural network (e.g., a neural network 510 to be described later with reference to
The electronic device 100 may be configured to select one from a plurality of optimization techniques based on the determined baseline function. For example, when the electronic device 100 identifies/determines the baseline function F5, the electronic device 100 may select the GWO algorithm mapped with the baseline function F5. The selected GWO algorithm may be the same as or different from the optimization technique in the previous iteration. If the baseline function at itr=3 is the baseline function F2, the optimization algorithm at itr=3 may be the BES algorithm. In one example, the GWO algorithm selected at itr=4 is different from the optimization algorithm at itr=3. At itr=4, the optimization technique may be switched. However, if the baseline function at itr=3 is the baseline function F13, the optimization algorithm at itr=3 is the GWO algorithm. In one example, the GWO algorithm selected at itr=4 is the same as the optimization algorithm at itr=3, and thus, the optimization technique may not be switched at itr=4.
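The baseline-function-to-technique mapping described above for Table 4 can be sketched as a simple lookup; the classifier that produces the baseline index is assumed to exist elsewhere, and since the text lists F12 under both the HBA and PSO algorithms, keeping the first entry is purely an illustrative tie-break (the text also leaves F14 unassigned).

```python
# Baseline-function index -> optimization technique, following the mapping
# described in the text for Table 4. setdefault keeps the first technique
# listed for an index (illustrative tie-break for F12).
TECHNIQUE_FOR_BASELINE = {}
for technique, indices in [
    ("BES", [1, 2, 3, 4, 9, 10, 11]),
    ("GWO", [5, 13, 15, 20]),
    ("HBA", [7, 12]),
    ("PSO", [8, 12, 19]),
    ("SSA", [6, 16, 17, 18, 21, 22, 23]),
]:
    for i in indices:
        TECHNIQUE_FOR_BASELINE.setdefault(i, technique)

def select_technique(baseline_index):
    """Operation 225: select the optimization technique mapped to the
    baseline function identified by the (assumed) neural-network classifier."""
    return TECHNIQUE_FOR_BASELINE[baseline_index]
```

For example, `select_technique(5)` returns the GWO algorithm, matching the example in the text where identifying the baseline function F5 leads to selecting GWO.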
An example neural network, which may be configured to identify/determine the baseline function, and identify/determine the corresponding optimization technique, will be described later.
In operation 227, the electronic device 100 may be configured to determine whether the identified baseline function is the same as the previous baseline function. In one example, the previous baseline function may be the baseline function in the previous iteration, and thus the electronic device 100 may determine whether the baseline function is changed. For example, at itr=4, the electronic device 100 may determine the baseline function F5. The electronic device 100 may determine whether the determined baseline function F5 is the same as the baseline function at itr=3.
The electronic device 100 may be configured to perform operation 231 when the determined baseline function is the same as the previous baseline function.
When the determined baseline function is not the same as the previous baseline function, the electronic device 100 may be configured to initialize the count c (e.g., c=0) in operation 229. When the count c is initialized, the electronic device 100 may perform operation 231.
At itr=1, the count c may be set to a threshold value (e.g., α) according to the initial values set in operation 210. In one example, the electronic device 100 may determine that the count c is equal to the threshold value in operation 223 and perform operation 225 and operation 227. Since itr=1, a previous baseline function may be absent. Thus, the electronic device 100 may initialize the count c in operation 229.
As non-limiting examples, the order of operations 225, 227, and 229 may be changed. For example, at itr=4, when the count c corresponds to the threshold value, the electronic device 100 may be configured to determine one of the baseline functions in consideration of the distribution of the parameter combinations (e.g., θ1, θ2, . . . , θN) and the objective functions (ƒ(θ1), . . . , ƒ(θN)) at itr=4 through the neural network. When the baseline function is determined, the electronic device 100 may determine whether the determined baseline function is the same as the previous baseline function. When the determined baseline function is not the same as the previous baseline function, the electronic device 100 may determine an optimization technique based on the determined baseline function and initialize the count c. When the count c is initialized, the electronic device 100 may perform operation 231. The electronic device 100 may perform operation 231 when the determined baseline function is the same as the previous baseline function.
In operation 231, the electronic device 100 may be configured to update P or perform resampling. In one embodiment, during a portion of itrs previous to the current itr, ƒbest may not change. In response to detecting that ƒbest has not changed during the portion of itrs previous to the current itr (e.g., c≥3α), the electronic device 100 may perform resampling at the current itr, and otherwise (e.g., c<3α), may update P.
For example, the current itr may be “100”. During a portion (e.g., itr=80 through itr=99) of itrs (itr=1 through itr=99) previous to the current itr=100, ƒbest may not change. Since the parameter combinations of P are clustered around local optima, ƒbest may not change during the portion (e.g., itr=80 through itr=99) of previous itrs. This may cause a situation where c≥3α at the current itr=100. In this case, it may be difficult for the electronic device 100 to obtain better results even if parameter optimization continues. As non-limiting examples, the electronic device 100 may extract or sample again parameter combinations by searching again the area (or space) other than the area where the parameter combinations of P are distributed. The electronic device 100 may extract or sample new parameter combinations, thereby improving the possibility of obtaining better solutions.
The electronic device 100 may be configured to update P if resampling is not to be performed (e.g., if c<3α). According to an embodiment, when the optimization technique is switched, the electronic device 100 may update P in consideration of the switched optimization technique.
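The update-or-resample decision of operation 231 can be sketched as follows. This is a minimal illustration only: `update_population` and `resample_outside` are hypothetical stand-ins for the technique-specific update and the out-of-area sampling described above, not the disclosed implementation.

```python
def update_or_resample(P, c, alpha, update_population, resample_outside):
    """Decide between updating P and resampling, as in operation 231.

    If f_best has stagnated long enough that the accumulated count c
    satisfies c >= 3 * alpha, resample parameter combinations outside
    the region currently covered by P; otherwise update P with the
    current optimization technique.
    """
    if c >= 3 * alpha:
        return resample_outside(P)  # escape clustering around local optima
    return update_population(P)     # normal update step
```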
When P is updated or resampling is performed, the electronic device 100 may determine whether itr is greater than or equal to itrmax in operation 233. Here, itrmax may be the maximum number of iterations.
If itr is less than itrmax, the electronic device 100 may be configured to update itr (e.g., itr=itr+1) in operation 235.
If itr is greater than or equal to itrmax, the electronic device 100 may be configured to return θbest in operation 237.
In one example, when θbest at a predetermined iteration corresponds to a desired parameter combination, the electronic device 100 may be configured to stop iteration and return θbest at the predetermined iteration.
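The termination logic of operations 233 through 237, including the early stop described above, can be sketched as a single loop. The `step` callable is a hypothetical stand-in for one iteration of the optimization, not the disclosed implementation.

```python
def iterate_until_done(step, itr_max, desired=None):
    """Loop structure of operations 233-237: repeat the optimization
    step until itr reaches itr_max, or stop early when theta_best
    matches a desired parameter combination."""
    theta_best, f_best = None, float("inf")
    for itr in range(1, itr_max + 1):
        theta, f = step(itr)
        if f < f_best:               # keep the best combination so far
            theta_best, f_best = theta, f
        if desired is not None and theta_best == desired:
            break                    # early stop at the desired combination
    return theta_best                # operation 237: return theta_best
```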
θbest may be the optimal parameter of the battery model 110, and the electronic device 100 may be configured to apply θbest to the battery model 110. The battery model 110 to which θbest is applied may be provided in various devices (e.g., an electric vehicle, a smartphone, etc.), and may estimate state information (e.g., voltages, states of charge (SOC), states of health (SOH), degrees of deterioration, etc.) of the devices, as non-limiting examples.
Referring to
In a given iteration itr, when the count c reaches the threshold value, the example electronic device (e.g., the electronic device 100 in
The electronic device 100 may determine a baseline function in consideration of a distribution of P (e.g., P 300 in
The neural network 510 may generate and output the index i of the baseline function as shown in
In a non-limiting example, the neural network 510 may be a neural network that is trained through a virtual data distribution generated from baseline functions. Supervised learning may be iteratively performed in a manner in which parameter combinations and objective function values corresponding to the respective parameter combinations are input to an untrained or in-training neural network and an index of a baseline function is output from the untrained or in-training neural network. As a non-limiting example, the supervised learning may include performing back-propagation learning, such as gradient descent backpropagation learning, based on calculated losses of the output of the untrained or in-training neural network. The neural network 510 may be generated through such supervised learning, and the electronic device 100 may use the neural network 510 to determine (i.e., infer) a baseline function.
An example neural network 600 is shown in
As shown in
The example neural network 600 may include a softmax layer that may output an index i of a baseline function having a pattern that matches the input (e.g., P and/or f(P)) of the neural network 600.
For example, as shown in
The electronic device 100 may be configured to select an optimization technique using the output (e.g., the index of the baseline function) of the neural network 600. For example, when the neural network 600 outputs the index “16”, the electronic device 100 may select the SSA algorithm as an optimization technique mapped with the baseline function F16 using Table 4 above. When the neural network 600 outputs the index “1”, the electronic device 100 may select the BES algorithm as an optimization technique mapped with the baseline function F1 using Table 4 above.
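The selection step above can be sketched as a simple lookup. Only the two pairings stated above (F16 → SSA, F1 → BES) are reproduced; the remaining entries of Table 4 are not, so the mapping below is intentionally partial.

```python
# Partial baseline-function-to-technique mapping; only the two pairings
# stated in the text are included, the rest of Table 4 is not reproduced.
TECHNIQUE_BY_BASELINE_INDEX = {16: "SSA", 1: "BES"}

def select_technique(index, mapping=TECHNIQUE_BY_BASELINE_INDEX):
    """Return the optimization technique mapped to the baseline-function
    index output by the neural network."""
    return mapping[index]
```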
Referring to
The description of the operation of the simulator 410 of
The description of evaluating parameter combinations by the electronic device 100 (e.g., the description of calculating objective function values of the parameter combinations) may be applicable to evaluation 820.
The description of operation 231 of
The description of operation 225 provided above may apply to baseline function identification 840 and optimization technique determination 850. In an example, FuncNet in baseline function identification 840 may denote the neural network 510 (e.g., the neural network 600) described above. Matching in optimization technique determination 850 may denote an operation of finding an optimization technique that matches (or is mapped to) a baseline function F.
According to optimization technique determination 850, the electronic device 100 may return an index j of the optimization technique. The electronic device 100 may set or initialize the count c to “0” when the baseline function Fi is changed.
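The switch-and-reset behavior described above can be sketched as follows; the function name and return convention are hypothetical illustrations, not the disclosed implementation.

```python
def determine_switch(new_index, prev_index, c, mapping):
    """If the newly determined baseline function differs from the
    previous one, select the mapped optimization technique and reset
    the count c to 0; otherwise keep the count and report no switch."""
    if new_index != prev_index:
        return mapping[new_index], 0   # switch technique, initialize c
    return None, c                     # same baseline function: no switch
```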
Table 5 below shows example pseudo code of the battery model optimization method, and Table 6 below shows example pseudo code of PUR, according to one or more embodiments.
In Table 5 above, P2D may denote an electrochemical model, N may denote the number of rows (or the number of parameter combinations) of P (e.g., P 300 of
In Table 5 above, β is a constant between “0” and “1”.
While the battery model optimization method is performed, a voltage error (e.g., voltage RMSE) is shown in
As shown in
Until itr=x1, an electronic device (e.g., the electronic device 100 of
The electronic device 100 may be configured to switch or change the optimization technique from the PSO algorithm to the GWO algorithm as the switching criterion is satisfied at itr=x2.
The electronic device 100 may be configured to switch or change the optimization technique from the GWO algorithm to the PSO algorithm as the switching criterion is satisfied at itr=x3 and perform resampling.
The electronic device 100 may be configured to switch or change the optimization technique from the PSO algorithm to the BES algorithm as the switching criterion is satisfied at itr=x4.
The electronic device 100 may be configured to switch or change the optimization technique from the BES algorithm to the GWO algorithm as the switching criterion is satisfied at itr=x5.
The electronic device 100 may be configured to perform parameter optimization through the GWO algorithm, and terminate parameter optimization when an optimization end event occurs (e.g., when the maximum number of iterations is reached or when a desired parameter combination is derived).
Referring to
The voltage error of SSM at the same itr may be lower than the voltage errors of the typical optimization techniques, and the voltage error of SSM may decrease more quickly than the voltage errors of the typical optimization techniques throughout all itrs. This may indicate that the objective function value of SSM converges quickly and accurately compared to the objective function values of the typical optimization techniques. SSM may perform optimization faster than typical optimization techniques.
Referring to
Referring to
The memory 1220 may store one or more instructions (e.g., instructions related to a battery model optimization method) to be executed by the processor 1210.
The memory 1220 may store an operation result of the processor 1210.
The processor 1210 may be configured to perform any one or any combination of operations implemented in the battery model optimization method by executing the one or more instructions.
The processor 1210 may perform first parameter optimization of the battery model 110 through a first optimization technique. In an example, the processor 1210 may calculate objective function values of parameter combinations to which the first optimization technique is applied, in a given iteration. For example, the processor 1210 may calculate voltages (e.g., simulated voltages) using the simulator 410 for simulating the battery model 110, the parameter combinations, and reference current data. The processor 1210 may calculate the objective function values of the parameter combinations using the calculated voltages and reference voltage data. The processor 1210 may select one of the calculated objective function values. The processor 1210 may compare the selected objective function value (e.g., ƒ(θbest,p)) with a best objective function value (e.g., ƒbest of operation 217 of
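The objective-function evaluation described above can be sketched as a voltage root-mean-square error. Here `simulate` is a hypothetical stand-in for the simulator 410, which produces simulated voltages from a parameter combination and reference current data.

```python
import math

def objective(theta, simulate, ref_current, ref_voltage):
    """Voltage RMSE between voltages simulated for a parameter
    combination theta and the reference voltage data."""
    v_sim = simulate(theta, ref_current)       # simulated voltages
    n = len(ref_voltage)
    return math.sqrt(
        sum((s - r) ** 2 for s, r in zip(v_sim, ref_voltage)) / n
    )
```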
The processor 1210 may determine whether a switching criterion for switching an optimization technique is satisfied, based on a count (e.g., the count c described above) accumulated while performing the first parameter optimization. For example, the processor 1210 may determine that the switching criterion is satisfied, when the accumulated count reaches a threshold value.
The processor 1210 may switch from the first optimization technique to a second optimization technique when the switching criterion is satisfied. In an example, the processor 1210 may determine one of a plurality of baseline functions, using parameter combinations determined when the switching criterion is satisfied (e.g., parameter combinations θ1, θ2, . . . , θN at itr=4 when the switching criterion is satisfied at itr=4), objective function values (e.g., ƒ(θ1), . . . , ƒ(θN)) of the parameter combinations, and the neural network 510. For example, the processor 1210 may determine the baseline function in consideration of the distribution of the parameter combinations and the objective function values through the neural network 510. The processor 1210 may select a second optimization technique based on the determined baseline function. For example, the processor 1210 may determine whether the determined baseline function corresponds to a baseline function for evaluating a performance of the second optimization technique among a plurality of optimization techniques. The processor 1210 may select the second optimization technique when the determined baseline function corresponds to a baseline function for evaluating the performance of the second optimization technique.
The processor 1210 may initialize the accumulated count when switching from the first optimization technique to the second optimization technique.
The processor 1210 may perform second parameter optimization of the battery model 110 through the second optimization technique. In an example, the processor 1210 may update parameter combinations determined when the switching criterion is satisfied through the second optimization technique. The processor 1210 may extract parameter combinations from an area other than a distribution area of parameter combinations determined when the switching criterion is satisfied, in response to a predetermined condition being satisfied (e.g., if c≥3α).
The processor 1210 may detect an optimization end event while performing the second parameter optimization. The optimization end event may include, for example, a case where itr reaches itrmax or a case where θbest corresponds to a desired parameter combination, but is not limited thereto.
When the processor 1210 detects an optimization end event, the best parameter combination determined while performing the battery model optimization method (e.g., θbest) may be determined as an optimal parameter of the battery model 110.
The description provided with reference to
Referring to
In operation 1320, the electronic device 100 may determine whether a switching criterion for switching an optimization technique is satisfied, based on a count accumulated while performing the first parameter optimization.
In operation 1330, the electronic device 100 may switch from the first optimization technique to a second optimization technique when the switching criterion is satisfied.
In operation 1340, the electronic device 100 may perform second parameter optimization of the battery model through the second optimization technique.
In operation 1350, the electronic device 100 may detect an optimization end event.
In operation 1360, the electronic device 100 may determine a best parameter combination determined while performing the battery model optimization method as an optimal parameter of the battery model, when the optimization end event is detected.
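The sequence of operations 1310 through 1360 can be outlined as a single loop. All callables below are hypothetical stand-ins under the stated assumptions (a stagnation count drives the switching criterion, and the end event covers reaching itrmax or reaching a desired combination); this is a sketch, not the disclosed implementation.

```python
def optimize(run_step, switch_needed, choose_second_technique, end_event,
             first_technique):
    """Operations 1310-1360 in outline: run the first technique,
    switch when the criterion is met, continue with the second
    technique, and return the best combination on the end event."""
    technique, c, best = first_technique, 0, None
    while not end_event(best):                    # operation 1350
        best, improved = run_step(technique, best)
        c = 0 if improved else c + 1              # accumulate the count
        if switch_needed(c):                      # operation 1320
            technique = choose_second_technique(technique)  # operation 1330
            c = 0                                 # initialize the count
    return best                                   # operation 1360
```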
The description provided with reference to
The processors, memories, electronic devices, apparatuses, electronic devices 100 and 1000, and other apparatuses, devices, and components described herein with respect to
The methods illustrated in
Instructions or software to control computing hardware, for example, one or more processors or computers, to implement the hardware components and perform the methods as described above may be written as computer programs, code segments, instructions or any combination thereof, for individually or collectively instructing or configuring the one or more processors or computers to operate as a machine or special-purpose computer to perform the operations that are performed by the hardware components and the methods as described above. In one example, the instructions or software include machine code that is directly executed by the one or more processors or computers, such as machine code produced by a compiler. In another example, the instructions or software includes higher-level code that is executed by the one or more processors or computer using an interpreter. The instructions or software may be written using any programming language based on the block diagrams and the flow charts illustrated in the drawings and the corresponding descriptions herein, which disclose algorithms for performing the operations that are performed by the hardware components and the methods as described above.
The instructions or software to control computing hardware, for example, one or more processors or computers, to implement the hardware components and perform the methods as described above, and any associated data, data files, and data structures, may be recorded, stored, or fixed in or on one or more non-transitory computer-readable storage media. Examples of a non-transitory computer-readable storage medium include read-only memory (ROM), random-access programmable read only memory (PROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), flash memory, non-volatile memory, CD-ROMs, CD-Rs, CD+Rs, CD-RWs, CD+RWs, DVD-ROMs, DVD-Rs, DVD+Rs, DVD-RWs, DVD+RWs, DVD-RAMs, BD-ROMs, BD-Rs, BD-R LTHs, BD-REs, blue-ray or optical disk storage, hard disk drive (HDD), solid state drive (SSD), flash memory, a card type memory such as multimedia card micro or a card (for example, secure digital (SD) or extreme digital (XD)), magnetic tapes, floppy disks, magneto-optical data storage devices, optical data storage devices, hard disks, solid-state disks, and any other device that is configured to store the instructions or software and any associated data, data files, and data structures in a non-transitory manner and provide the instructions or software and any associated data, data files, and data structures to one or more processors or computers so that the one or more processors or computers can execute the instructions. In one example, the instructions or software and any associated data, data files, and data structures are distributed over network-coupled computer systems so that the instructions and software and any associated data, data files, and data structures are stored, accessed, and executed in a distributed fashion by the one or more processors or computers.
While this disclosure includes specific examples, it will be apparent after an understanding of the disclosure of this application that various changes in form and details may be made in these examples without departing from the spirit and scope of the claims and their equivalents. The examples described herein are to be considered in a descriptive sense only, and not for purposes of limitation. Descriptions of features or aspects in each example are to be considered as being applicable to similar features or aspects in other examples. Suitable results may be achieved if the described techniques are performed in a different order, and/or if components in a described system, architecture, device, or circuit are combined in a different manner, and/or replaced or supplemented by other components or their equivalents.
Therefore, in addition to the above disclosure, the scope of the disclosure may also be defined by the claims and their equivalents, and all variations within the scope of the claims and their equivalents are to be construed as being included in the disclosure.
Number | Date | Country | Kind |
---|---|---|---|
10-2022-0114863 | Sep 2022 | KR | national |
10-2022-0180837 | Dec 2022 | KR | national |