This disclosure relates generally to control systems and, more particularly, to mathematical model based control systems.
Modern work machines often require complex control systems to control a wide range of operations. For example, the operation of an engine system of a modern work machine may be controlled by a complex engine control system. The engine control system may provide a series of control functions, such as closed loop combustion control, fuel control, gear control, torque control, and/or other engine controls. Physical or process models may be used to implement some functions of the engine control system. A process model may refer to a physics or mathematics based model where the state of a process is monitored by measuring process variables. A process variable may refer to a variable related to conditions of the engine under control. The engine control system may use the value of the process variable and the model to control the operation of the engine within a normal range.
Conventional process model based control systems, such as the system described in U.S. Pat. No. 6,823,675, issued to Brunell et al. on Nov. 30, 2004, often relate individual input parameters or process variables to control functions without addressing inter-correlation between individual input parameters, especially at the time of generation and/or optimization of such process models. Thus, these systems may fail to simultaneously optimize input parameter distribution requirements.
Methods and systems consistent with certain features of the disclosed systems are directed to solving one or more of the problems set forth above.
One aspect of the present disclosure includes a method for a control system. The method may include obtaining data records associated with one or more input variables and one or more output parameters, and selecting one or more input parameters from the one or more input variables. The method may also include generating a computational model indicative of interrelationships between the one or more input parameters and the one or more output parameters based on the data records, and determining desired respective statistical distributions of the one or more input parameters of the computational model.
Another aspect of the present disclosure includes a control system. The control system may include one or more input elements configured to accept respective one or more input parameters to the control system and one or more output elements configured to accept respective one or more output parameters. The control system may also include a processor configured to control one or more hardware devices using the one or more output parameters based on a control model. The control model may be created by obtaining data records associated with the one or more input parameters and the one or more output parameters, and generating a computational model indicative of interrelationships between the one or more input parameters and the one or more output parameters based on the data records. The creation steps of the control model may also include determining desired respective statistical distributions of the one or more input parameters of the computational model, and recalibrating the one or more input parameters based on the desired statistical distributions.
Another aspect of the present disclosure includes a work machine. The work machine may include an engine and an engine control system. The engine control system may include a processor configured to obtain respective values of one or more input parameters and to derive respective values of one or more output parameters based on the values of input parameters and a first computational model indicative of interrelationships between the one or more input parameters and the one or more output parameters. The processor may also be configured to control the engine using the values of the one or more output parameters.
Another aspect of the present disclosure includes a computer system. The computer system may include a database, containing data records associated with one or more input variables and one or more output parameters, and a processor. The processor may be configured to select one or more input parameters from the one or more input variables and to generate a computational model indicative of interrelationships between the one or more input parameters and the one or more output parameters based on the data records. The processor may also be configured to determine desired respective statistical distributions of the one or more input parameters of the computational model and to recalibrate the one or more input parameters based on the desired statistical distributions.
Reference will now be made in detail to exemplary embodiments, which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
As shown in
As shown in
Processor 202 may include any appropriate type of general purpose microprocessor, digital signal processor, or microcontroller. Processor 202 may be configured as a separate processor module dedicated to controlling engine 110. Alternatively, processor 202 may be configured as a shared processor module for performing other functions unrelated to engine control.
Memory module 204 may include one or more memory devices including, but not limited to, a ROM, a flash memory, a dynamic RAM, and a static RAM. Memory module 204 may be configured to store information used by processor 202. Database 206 may include any type of appropriate database containing information on characteristics of input parameters, output parameters, mathematical models, and/or any other control information. Further, I/O interfaces 208 may also be connected to various sensors or other components (not shown) to monitor and control operations of engine 110. Network interface 210 may include any appropriate type of network adaptor capable of communicating with other computer systems based on one or more communication protocols.
Engine control system 120 may include a control model reflecting relationships between input parameters to engine control system 120 and output parameters. Output parameters may be used to control different engine components.
As shown in
At the beginning of the model generation and optimization process, processor 202 may obtain data records associated with input parameters and output parameters (step 402). The data records may be previously collected during a certain time period from a test engine or from a plurality of work machines and engines. The data records may also be collected from experiments designed for collecting such data. Alternatively, the data records may be generated artificially by other related processes, such as a design process. The data records may reflect characteristics of the input parameters and output parameters, such as statistical distributions, normal ranges, and/or tolerances, etc.
Once the data records are obtained (step 402), processor 202 may pre-process the data records to clean up obvious errors and to eliminate redundancies (step 404). Processor 202 may remove approximately identical data records and/or remove data records that are out of a reasonable range and therefore not meaningful for model generation and optimization. After the data records have been pre-processed, processor 202 may then select proper input parameters by analyzing the data records (step 406).
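The pre-processing described above may be sketched as follows. This is a minimal Python illustration, not part of the original disclosure; the record layout, the `preprocess` helper, and the range table are hypothetical:

```python
import numpy as np

def preprocess(records, valid_ranges, tol=1e-9):
    """Hypothetical pre-processing (step 404): drop out-of-range records,
    then drop approximately identical records."""
    # Keep only records whose every field falls within its reasonable range
    kept = [r for r in records
            if all(lo <= r[k] <= hi for k, (lo, hi) in valid_ranges.items())]
    # Drop records that are approximately identical to an already-kept record
    unique = []
    for r in kept:
        vec = np.array([r[k] for k in sorted(r)], dtype=float)
        if all(np.linalg.norm(vec - np.array([u[k] for k in sorted(u)],
                                             dtype=float)) > tol
               for u in unique):
            unique.append(r)
    return unique
```

A record falling outside any of its ranges is discarded first, so duplicates of valid records are the only ones compared pairwise.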
The data records may include many input variables. The number of input variables may be greater than the number of the input parameters or variables used for control model 300. For example, in addition to values corresponding to input parameters or variables of gas pedal indication, gear selection, atmospheric pressure, and engine temperature, the data records may also include input variables such as fuel indication, tracking control indication, and/or other engine parameters.
In certain situations, the number of input variables may exceed the number of the data records and lead to sparse data scenarios. Some of the extra input variables may be omitted in certain mathematical models. The number of the input variables may need to be reduced to create mathematical models within practical computational time limits.
Processor 202 may select input parameters according to predetermined criteria. For example, processor 202 may choose input parameters by experimentation and/or expert opinion. Alternatively, in certain embodiments, processor 202 may select input parameters based on a Mahalanobis distance between a normal data set and an abnormal data set of the data records. The normal data set and abnormal data set may be defined by processor 202 by any proper method. For example, the normal data set may include characteristic data associated with the input parameters that produce desired output parameters. On the other hand, the abnormal data set may include any characteristic data that may be out of tolerance or may need to be avoided. The normal data set and abnormal data set may be predefined by processor 202.
Mahalanobis distance may refer to a mathematical representation that may be used to measure data profiles based on correlations between parameters in a data set. Mahalanobis distance differs from Euclidean distance in that Mahalanobis distance takes into account the correlations of the data set. The Mahalanobis distance of a data set X (e.g., a multivariate vector) may be represented as
MDi = (Xi − μx) Σ⁻¹ (Xi − μx)′  (1)
where μx is the mean of X and Σ⁻¹ is the inverse of the variance-covariance matrix of X. MDi weights the distance of a data point Xi from its mean μx such that observations that are on the same multivariate normal density contour have the same distance. Such observations may be used to identify and select correlated parameters from separate data groups having different variances.
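Equation (1) may be computed directly, for example as follows (a Python sketch; the function name is illustrative):

```python
import numpy as np

def mahalanobis_sq(X):
    """Squared Mahalanobis distance of each row of X from the sample mean,
    per Equation (1): MD_i = (X_i - mu) Sigma^{-1} (X_i - mu)'."""
    mu = X.mean(axis=0)
    inv = np.linalg.inv(np.cov(X, rowvar=False))  # inverse variance-covariance
    diff = X - mu
    # Row-wise quadratic form: sum_j sum_k diff[i,j] * inv[j,k] * diff[i,k]
    return np.einsum('ij,jk,ik->i', diff, inv, diff)
```

With the sample covariance (denominator n − 1), the distances satisfy the identity that their sum equals (n − 1) times the number of parameters, which gives a quick sanity check on an implementation.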
Processor 202 may select a desired subset of input parameters such that the Mahalanobis distance between the normal data set and the abnormal data set is maximized or optimized. A genetic algorithm may be used by processor 202 to search the input parameters for the desired subset with the purpose of maximizing the Mahalanobis distance. Processor 202 may select a candidate subset of the input parameters based on predetermined criteria and calculate a Mahalanobis distance MDnormal of the normal data set and a Mahalanobis distance MDabnormal of the abnormal data set. Processor 202 may also calculate the Mahalanobis distance between the normal data set and the abnormal data set (i.e., the deviation of the Mahalanobis distances, MDx = MDnormal − MDabnormal). Other types of deviations, however, may also be used.
Processor 202 may select the candidate subset of the input parameters if the genetic algorithm converges (i.e., the genetic algorithm finds the maximized or optimized Mahalanobis distance between the normal data set and the abnormal data set corresponding to the candidate subset). If the genetic algorithm does not converge, a different candidate subset of the input parameters may be created for further searching. This searching process may continue until the genetic algorithm converges and a desired subset of the input parameters is selected.
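The subset search may be illustrated as follows. For brevity this sketch substitutes a simplified separation measure and an exhaustive search for the genetic-algorithm search described above; all names and the scoring function are hypothetical:

```python
import itertools
import numpy as np

def md_between(normal, abnormal, cols):
    """Simplified separation score: Mahalanobis distance of the abnormal-set
    mean from the normal set, restricted to the candidate column subset."""
    N = normal[:, cols]
    inv = np.linalg.inv(np.cov(N, rowvar=False))
    d = abnormal[:, cols].mean(axis=0) - N.mean(axis=0)
    return float(d @ inv @ d)

def best_subset(normal, abnormal, k):
    """Exhaustive search over k-column subsets -- a toy stand-in for the
    genetic-algorithm search over candidate input-parameter subsets."""
    n_cols = normal.shape[1]
    return max(itertools.combinations(range(n_cols), k),
               key=lambda c: md_between(normal, abnormal, list(c)))
```

A genetic algorithm would replace the exhaustive `max` with a population of candidate subsets evolved until the score converges, which scales to many more input variables.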
After selecting the input parameters (e.g., gas pedal indication, gear selection, atmospheric pressure, and engine temperature, etc.), processor 202 may generate a computational model to build interrelationships between the input parameters and output parameters (step 408). Any appropriate type of neural network may be used to build the computational model. The types of neural network models used may include back propagation, feed forward models, cascaded neural networks, and/or hybrid neural networks, etc. The particular type or structure of the neural network used may depend on the particular application. Other types of models, such as linear system or non-linear system models, etc., may also be used.
The neural network computational model may be trained by using selected data records. For example, the neural network computational model may include a relationship between output parameters (e.g., boost control, throttle valve setting, etc.) and input parameters (e.g., gas pedal indication, gear selection, atmospheric pressure, and engine temperature, etc.). The neural network computational model may be evaluated by predetermined criteria to determine whether the training is complete. The criteria may include desired ranges of accuracy, time, and/or number of training iterations, etc.
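The training of step 408 may be sketched with a small back-propagation network. This is a toy illustration, not the disclosed implementation; the architecture and hyperparameters are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_ffnn(X, Y, hidden=8, lr=0.1, epochs=2000):
    """Train a one-hidden-layer feed-forward network by back-propagation on
    mean-squared error (a minimal stand-in for the interrelationship model)."""
    n_in, n_out = X.shape[1], Y.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, n_out))
    for _ in range(epochs):
        H = np.tanh(X @ W1)            # hidden-layer activations
        err = H @ W2 - Y               # linear output layer minus targets
        gW2 = H.T @ err / len(X)       # gradient w.r.t. output weights
        gH = err @ W2.T * (1 - H**2)   # back-propagated hidden error
        gW1 = X.T @ gH / len(X)        # gradient w.r.t. input weights
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

def predict(W1, W2, X):
    return np.tanh(X @ W1) @ W2
```

The stopping criteria described above (accuracy, time, iteration count) would replace the fixed `epochs` loop in practice.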
After the neural network has been trained (i.e., the computational model has initially been established based on the predetermined criteria), processor 202 may statistically validate the computational model (step 410). Statistical validation may refer to an analyzing process to compare outputs of the neural network computational model with actual outputs to determine the accuracy of the computational model. Part of the data records may be reserved for use in the validation process. Alternatively, processor 202 may also generate simulation or test data for use in the validation process.
Once trained and validated, the computational model may be used to determine values of the output parameters when provided with values of the input parameters. For example, processor 202 may use the computational model to determine the throttle valve setting and boost control based on input values of gas pedal indication, gear selection, atmospheric pressure, and engine temperature, etc. The values of the output parameters may then be used to control hardware devices of engine control system 120 or engine 110. Further, processor 202 may optimize the model by determining desired distributions of the input parameters based on relationships between the input parameters and desired distributions of the output parameters (step 412).
Processor 202 may analyze the relationships between desired distributions of the input parameters and desired distributions of the output parameters based on particular applications. For example, if a particular application requires higher fuel efficiency, processor 202 may use a small range for the throttle valve setting and a large range for the boost control. Processor 202 may then run a simulation of the computational model to find a desired statistical distribution for an individual input parameter (e.g., gas pedal indication, gear selection, atmospheric pressure, or engine temperature, etc.). That is, processor 202 may separately determine a distribution (e.g., mean, standard deviation, etc.) of the individual input parameter corresponding to the normal ranges of the output parameters. Processor 202 may then analyze and combine the desired distributions for all the individual input parameters to determine desired distributions and characteristics for the input parameters.
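The simulation step above may be illustrated by Monte-Carlo sampling: draw inputs from a candidate distribution and measure how often the model's outputs land inside their normal ranges. This is a sketch; encoding each candidate distribution as a `(mean, std)` pair is an assumption, not the disclosed method:

```python
import numpy as np

def in_range_fraction(model, dist, out_lo, out_hi, n=10000, seed=0):
    """Sample inputs from per-parameter (mean, std) pairs, push them through
    the model, and return the fraction of samples whose outputs all fall
    inside the normal output range [out_lo, out_hi]."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([rng.normal(m, s, n) for m, s in dist])
    Y = model(X)
    ok = np.all((Y >= out_lo) & (Y <= out_hi), axis=1)
    return float(ok.mean())
```

Sweeping the candidate means and standard deviations and keeping those with a high in-range fraction yields the per-parameter desired distributions described above.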
Alternatively, processor 202 may identify desired distributions of the input parameters simultaneously to maximize the possibility of obtaining desired outcomes. In certain embodiments, processor 202 may simultaneously determine desired distributions of the input parameters based on the zeta statistic. The zeta statistic may indicate a relationship between the input parameters, their value ranges, and the desired outcomes. The zeta statistic may be represented as
ζ = Σᵢ₌₁ⁿ Σⱼ₌₁ᵐ |Sij| (σi/x̄i)(x̄j/σj)  (2)

where x̄i represents the mean or expected value of an ith input parameter; x̄j represents the mean or expected value of a jth output parameter; σi represents the standard deviation of the ith input parameter; σj represents the standard deviation of the jth output parameter; and |Sij| represents the partial derivative or sensitivity of the jth output parameter to the ith input parameter.

Processor 202 may identify a desired distribution of the input parameters such that the zeta statistic of the neural network computational model (i.e., the control model) is maximized or optimized. An appropriate type of genetic algorithm may be used by processor 202 to search the desired distribution of input parameters with the purpose of maximizing the zeta statistic. Processor 202 may select a candidate set of input parameters with predetermined search ranges and run a simulation of the control model to calculate the zeta statistic parameters based on the input parameters, the output parameters, and the neural network computational model. Processor 202 may obtain x̄i and σi by analyzing the candidate set of input parameters, and obtain x̄j and σj by analyzing the outcomes of the simulation. Further, processor 202 may obtain |Sij| from the trained neural network as an indication of the impact of the ith input parameter on the jth output parameter.
Processor 202 may select the candidate set of input parameters if the genetic algorithm converges (i.e., the genetic algorithm finds the maximized or optimized zeta statistic of the control model corresponding to the candidate set of input parameters). If the genetic algorithm does not converge, a different candidate set of input parameters may be created by the genetic algorithm for further searching. This searching process may continue until the genetic algorithm converges and a desired set of the input parameters is identified. Processor 202 may further determine desired distributions (e.g., mean and standard deviations) of input parameters based on the desired input parameter set.
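Given the means, standard deviations, and sensitivities described above, the zeta statistic of Equation (2) may be evaluated directly (a sketch; the array shapes and argument names are assumptions):

```python
import numpy as np

def zeta(x_mean, x_std, y_mean, y_std, S):
    """Zeta statistic of Equation (2): sum over inputs i and outputs j of
    |S_ij| * (sigma_i / xbar_i) * (xbar_j / sigma_j), where S is the
    n_inputs-by-n_outputs sensitivity matrix."""
    x_mean, x_std = np.asarray(x_mean, float), np.asarray(x_std, float)
    y_mean, y_std = np.asarray(y_mean, float), np.asarray(y_std, float)
    S = np.abs(np.asarray(S, float))
    # (sigma_i/xbar_i) as a row vector, times |S|, times (xbar_j/sigma_j)
    return float((x_std / x_mean) @ S @ (y_mean / y_std))
```

A genetic-algorithm search would call `zeta` as its fitness function, with each candidate supplying the input means and standard deviations and the simulation supplying the output statistics and sensitivities.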
After control model 300 is optimized (step 412), processor 202 may recalibrate the input parameters (step 414). For example, if the desired distribution for input parameter 306 (atmospheric pressure) is large (i.e., a wide range), processor 202 may allow various values of input parameter 306 with fewer limitations. On the other hand, if the desired distribution for input parameter 306 is small (i.e., a narrow range), processor 202 may limit and/or adjust values of the input parameters such that normal output parameters may be generated. Such optimization processes may be performed in real-time by processor 202 to adapt to different requirements. For example, if higher power is required in another application, processor 202 may optimize the model according to desired distributions for higher power.
The generated and optimized computational model may be used in operation. The derived throttle valve setting and boost control indication may be provided via I/O devices 208 to control relevant hardware devices and/or subsystems of engine 110. Optionally, the control model may include a second control model used in combination with control model 300, as illustrated in
As shown in
Alternatively, control model 500 may be used as a reference model. When used as a reference model, control model 500 may be generated and optimized according to reference distributions of the input parameters (e.g., gas pedal indication, gear selection, atmospheric pressure, and engine temperature, etc.). Control model 500 may simultaneously produce output parameters (e.g., throttle valve setting, boost control indication, etc.) independently of control model 300. Processor 202 may use logic 502 to compare the output parameters from control model 300 with the output parameters from control model 500. Logic 502 may include any appropriate type of computer hardware component or software program configured to determine a difference between output parameters from control models 300 and 500. If the difference is beyond a predetermined threshold, processor 202 may determine that control model 300 has failed to react to a particular set of input parameters. Processor 202 may choose to continue the operation using previous output parameter values and discard the out-of-range output parameters, or to use output parameters from control model 500. If the number of times that control model 300 produces out-of-range output parameters exceeds a certain limit, processor 202 may determine that control model 300 has failed, and may generate and optimize a new control model to replace failed control model 300.
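The comparison performed by logic 502 may be sketched as follows. Names are illustrative, and the fallback policy shown (reuse the last known-good outputs) is one of the options described above:

```python
def reconcile(primary_out, reference_out, threshold, last_good):
    """Compare primary-model outputs (control model 300) with reference-model
    outputs (control model 500). If any output differs by more than the
    threshold, discard the primary outputs and fall back to the last
    known-good values; also report whether a disagreement occurred."""
    diff = max(abs(p - r) for p, r in zip(primary_out, reference_out))
    if diff > threshold:
        return last_good, True   # out-of-range: primary outputs discarded
    return primary_out, False    # models agree: use primary outputs
```

A supervising loop would count the `True` results and, past a limit, trigger regeneration of the primary control model as described above.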
The disclosed systems and methods may provide an efficient and optimized solution to a wide range of control systems, such as engine control systems and other work machine control systems. Complex interrelationships may be analyzed during the generation of computational models to optimize the models by identifying desired distributions of input parameters to the models to obtain desired outputs. The accuracy and efficiency of control systems may be significantly improved by using the disclosed systems and methods.
The disclosed systems and methods may also provide high reliability by using two or more computational models similarly generated and optimized. The outputs of the models may be analyzed in real-time to determine the status of the model and/or the desired outputs.
Other embodiments, features, aspects, and principles of the disclosed exemplary systems will be apparent to those skilled in the art and may be implemented in various environments and systems.
| Number | Name | Date | Kind |
|---|---|---|---|
| 3316395 | Lavin | Apr 1967 | A |
| 4136329 | Trobert | Jan 1979 | A |
| 4533900 | Muhlberger et al. | Aug 1985 | A |
| 5014220 | McMann et al. | May 1991 | A |
| 5163412 | Neu et al. | Nov 1992 | A |
| 5262941 | Saladin et al. | Nov 1993 | A |
| 5341315 | Niwa et al. | Aug 1994 | A |
| 5386373 | Keeler et al. | Jan 1995 | A |
| 5434796 | Weininger | Jul 1995 | A |
| 5539638 | Keeler et al. | Jul 1996 | A |
| 5548528 | Keeler et al. | Aug 1996 | A |
| 5561610 | Schricker et al. | Oct 1996 | A |
| 5566091 | Schricker et al. | Oct 1996 | A |
| 5585553 | Schricker | Dec 1996 | A |
| 5594637 | Eisenberg et al. | Jan 1997 | A |
| 5598076 | Neubauer et al. | Jan 1997 | A |
| 5604306 | Schricker | Feb 1997 | A |
| 5604895 | Raimi | Feb 1997 | A |
| 5608865 | Midgely et al. | Mar 1997 | A |
| 5666297 | Britt et al. | Sep 1997 | A |
| 5682317 | Keeler et al. | Oct 1997 | A |
| 5698780 | Mizutani et al. | Dec 1997 | A |
| 5727128 | Morrison | Mar 1998 | A |
| 5750887 | Schricker | May 1998 | A |
| 5752007 | Morrison | May 1998 | A |
| 5835902 | Jannarone | Nov 1998 | A |
| 5842202 | Kon | Nov 1998 | A |
| 5914890 | Sarangapani et al. | Jun 1999 | A |
| 5925089 | Fujime | Jul 1999 | A |
| 5950147 | Sarangapani et al. | Sep 1999 | A |
| 5966312 | Chen | Oct 1999 | A |
| 5987976 | Sarangapani | Nov 1999 | A |
| 6086617 | Waldon et al. | Jul 2000 | A |
| 6092016 | Sarangapani et al. | Jul 2000 | A |
| 6119074 | Sarangapani | Sep 2000 | A |
| 6145066 | Atkin | Nov 2000 | A |
| 6195648 | Simon et al. | Feb 2001 | B1 |
| 6199007 | Zavarehi et al. | Mar 2001 | B1 |
| 6208982 | Allen, Jr. et al. | Mar 2001 | B1 |
| 6223133 | Brown | Apr 2001 | B1 |
| 6236908 | Cheng et al. | May 2001 | B1 |
| 6240343 | Sarangapani et al. | May 2001 | B1 |
| 6269351 | Black | Jul 2001 | B1 |
| 6298718 | Wang | Oct 2001 | B1 |
| 6370544 | Krebs et al. | Apr 2002 | B1 |
| 6405122 | Yamaguchi | Jun 2002 | B1 |
| 6438430 | Martin et al. | Aug 2002 | B1 |
| 6442511 | Sarangapani et al. | Aug 2002 | B1 |
| 6466859 | Fujime | Oct 2002 | B1 |
| 6477660 | Sohner | Nov 2002 | B1 |
| 6513018 | Culhane | Jan 2003 | B1 |
| 6546379 | Hong et al. | Apr 2003 | B1 |
| 6584768 | Hecker et al. | Jul 2003 | B1 |
| 6594989 | Hepburn et al. | Jul 2003 | B1 |
| 6698203 | Wang | Mar 2004 | B2 |
| 6711676 | Zomaya et al. | Mar 2004 | B1 |
| 6721606 | Kaji et al. | Apr 2004 | B1 |
| 6725208 | Hartman et al. | Apr 2004 | B1 |
| 6763708 | Ting et al. | Jul 2004 | B2 |
| 6775647 | Evans et al. | Aug 2004 | B1 |
| 6785604 | Jacobson | Aug 2004 | B2 |
| 6804600 | Uluyol et al. | Oct 2004 | B1 |
| 6810442 | Lin et al. | Oct 2004 | B1 |
| 6823675 | Brunell et al. | Nov 2004 | B2 |
| 6859770 | Ramsey | Feb 2005 | B2 |
| 6859785 | Case | Feb 2005 | B2 |
| 6865883 | Gomulka | Mar 2005 | B2 |
| 6882929 | Liang et al. | Apr 2005 | B2 |
| 6895286 | Kaji et al. | May 2005 | B2 |
| 6935313 | Jacobson | Aug 2005 | B2 |
| 6941287 | Vaidyanathan et al. | Sep 2005 | B1 |
| 6952662 | Wegerich et al. | Oct 2005 | B2 |
| 6976062 | Denby et al. | Dec 2005 | B1 |
| 7000229 | Gere | Feb 2006 | B2 |
| 7024343 | El-Ratal | Apr 2006 | B2 |
| 7027953 | Klein | Apr 2006 | B2 |
| 7035834 | Jacobson | Apr 2006 | B2 |
| 7117079 | Streichsbier et al. | Oct 2006 | B2 |
| 7124047 | Zhang et al. | Oct 2006 | B2 |
| 7127892 | Akins et al. | Oct 2006 | B2 |
| 7174284 | Dolansky et al. | Feb 2007 | B2 |
| 7178328 | Solbrig | Feb 2007 | B2 |
| 7191161 | Rai et al. | Mar 2007 | B1 |
| 7194392 | Tuken et al. | Mar 2007 | B2 |
| 7213007 | Grichnik | May 2007 | B2 |
| 7356393 | Schlatre et al. | Apr 2008 | B1 |
| 7369925 | Morioka et al. | May 2008 | B2 |
| 20020014294 | Okano et al. | Feb 2002 | A1 |
| 20020016701 | Duret et al. | Feb 2002 | A1 |
| 20020042784 | Kerven et al. | Apr 2002 | A1 |
| 20020049704 | Vanderveldt et al. | Apr 2002 | A1 |
| 20020103996 | LeVasseur et al. | Aug 2002 | A1 |
| 20020198821 | Munoz | Dec 2002 | A1 |
| 20030018503 | Shulman | Jan 2003 | A1 |
| 20030055607 | Wegerich et al. | Mar 2003 | A1 |
| 20030093250 | Goebel | May 2003 | A1 |
| 20030126053 | Boswell et al. | Jul 2003 | A1 |
| 20030126103 | Chen et al. | Jul 2003 | A1 |
| 20030130855 | Babu et al. | Jul 2003 | A1 |
| 20030167354 | Peppers et al. | Sep 2003 | A1 |
| 20030187567 | Sulatisky et al. | Oct 2003 | A1 |
| 20030187584 | Harris | Oct 2003 | A1 |
| 20030200296 | Lindsey | Oct 2003 | A1 |
| 20040030420 | Ulyanov et al. | Feb 2004 | A1 |
| 20040034857 | Mangino et al. | Feb 2004 | A1 |
| 20040059518 | Rothschild | Mar 2004 | A1 |
| 20040077966 | Yamaguchi et al. | Apr 2004 | A1 |
| 20040122702 | Sabol et al. | Jun 2004 | A1 |
| 20040122703 | Walker et al. | Jun 2004 | A1 |
| 20040128058 | Andres et al. | Jul 2004 | A1 |
| 20040135677 | Asam | Jul 2004 | A1 |
| 20040138995 | Hershkowitz et al. | Jul 2004 | A1 |
| 20040153227 | Hagiwara et al. | Aug 2004 | A1 |
| 20040230404 | Messmer et al. | Nov 2004 | A1 |
| 20040267818 | Hartenstine | Dec 2004 | A1 |
| 20050047661 | Mauer | Mar 2005 | A1 |
| 20050055176 | Clarke et al. | Mar 2005 | A1 |
| 20050091093 | Bhaskaran et al. | Apr 2005 | A1 |
| 20050209943 | Ballow et al. | Sep 2005 | A1 |
| 20050210337 | Chester et al. | Sep 2005 | A1 |
| 20050240539 | Olavson | Oct 2005 | A1 |
| 20050261791 | Chen et al. | Nov 2005 | A1 |
| 20050262031 | Saidi et al. | Nov 2005 | A1 |
| 20050278227 | Esary et al. | Dec 2005 | A1 |
| 20050278432 | Feinleib et al. | Dec 2005 | A1 |
| 20060010057 | Bradway et al. | Jan 2006 | A1 |
| 20060010142 | Kim et al. | Jan 2006 | A1 |
| 20060010157 | Dumitrascu et al. | Jan 2006 | A1 |
| 20060025897 | Shostak et al. | Feb 2006 | A1 |
| 20060026270 | Sadovsky et al. | Feb 2006 | A1 |
| 20060026587 | Lemarroy et al. | Feb 2006 | A1 |
| 20060064474 | Feinleib et al. | Mar 2006 | A1 |
| 20060068973 | Kappauf et al. | Mar 2006 | A1 |
| 20060129289 | Kumar et al. | Jun 2006 | A1 |
| 20060130052 | Allen et al. | Jun 2006 | A1 |
| 20060229753 | Seskin et al. | Oct 2006 | A1 |
| 20060229769 | Grichnik et al. | Oct 2006 | A1 |
| 20060229852 | Grichnik et al. | Oct 2006 | A1 |
| 20060229854 | Grichnik et al. | Oct 2006 | A1 |
| 20060230018 | Grichnik et al. | Oct 2006 | A1 |
| 20060230097 | Grichnik et al. | Oct 2006 | A1 |
| 20060230313 | Grichnik et al. | Oct 2006 | A1 |
| 20060241923 | Xu et al. | Oct 2006 | A1 |
| 20060247798 | Subbu et al. | Nov 2006 | A1 |
| 20070061144 | Grichnik et al. | Mar 2007 | A1 |
| 20070094048 | Grichnik | Apr 2007 | A1 |
| 20070094181 | Tayebnejad et al. | Apr 2007 | A1 |
| 20070118338 | Grichnik et al. | May 2007 | A1 |
| 20070124237 | Sundararajan et al. | May 2007 | A1 |
| 20070150332 | Grichnik et al. | Jun 2007 | A1 |
| 20070168494 | Liu et al. | Jul 2007 | A1 |
| 20070179769 | Grichnik et al. | Aug 2007 | A1 |
| 20070203864 | Grichnik | Aug 2007 | A1 |
| 20080154811 | Grichnik et al. | Jun 2008 | A1 |
| Number | Date | Country |
|---|---|---|
| 0 959 414 | Nov 1999 | EP |
| 1103926 | May 2001 | EP |
| 1367248 | Dec 2003 | EP |
| 1418481 | May 2004 | EP |
| 10-332621 | Dec 1998 | JP |
| 11-351045 | Dec 1999 | JP |
| 2002-276344 | Sep 2002 | JP |
| WO97042581 | Nov 1997 | WO |
| WO02057856 | Jul 2002 | WO |
| WO2006017453 | Feb 2006 | WO |
| Number | Date | Country | |
|---|---|---|---|
| 20060229769 A1 | Oct 2006 | US |