This disclosure relates generally to process modeling techniques and, more particularly, to methods and computer systems for process modeling error correction.
Predictive modeling refers to generating a model from a given set of data records of both input parameters and output parameters and predicting actual output parameters corresponding to actual input parameters based on the model. Predictive models may be built by using various methods from data for many different families of models, such as decision trees, decision lists, linear equations, and neural networks.
The data records used to build a model are known as training data records. In certain situations, the training data records may be unable to cover the entire input space of the input parameters or the training data records may be discrete such that uniform relationships represented by a single predictive model between input parameters and output parameters may be unavailable across the entire input space and/or output space.
Techniques such as boosting and/or bagging may be used to divide the input space and/or output space by applying a large number of mathematical models. Each mathematical model may only cover a part of the input space and/or output space. For example, U.S. Pat. No. 6,546,379 (the '379 patent) issued to Hong et al. on Apr. 8, 2003, discloses a cascade boosting method for boosting predictive models for resolving the interpretability problem of previous boosting methods and mitigating the fragmentation problem when applied to decision trees.
However, such conventional techniques, while involving a large number of models, may introduce coarse transitions between the individual models. These coarse transitions may reduce the accuracy of the overall predictive model and may also cause confusion for the users of the overall predictive model.
Methods and systems consistent with certain features of the disclosed systems are directed to solving one or more of the problems set forth above.
One aspect of the present disclosure includes a method for a virtual sensor system. The method may include establishing a first process model indicative of interrelationships between a plurality of input parameters and a plurality of output parameters and establishing a second process model indicative of interrelationships between at least the plurality of input parameters and modeling errors of the first process model. The method may also include operating the first process model to generate values of the plurality of output parameters and simultaneously operating the second model to generate estimated deviations between the values of the plurality of output parameters and desired values of the plurality of output parameters. Further, the method may include compensating the values of the plurality of output parameters with the estimated deviations to generate the desired values of the plurality of output parameters.
Another aspect of the present disclosure includes a computer system. The computer system may include a database configured to store information relevant to a virtual sensor system and a processor. The processor may be configured to operate a first process model to generate values of a plurality of output parameters and to simultaneously operate a second model to generate estimated deviations between the values of the plurality of output parameters and desired values of the plurality of output parameters. The processor may also be configured to compensate the values of the plurality of output parameters with the estimated deviations to generate the desired values of the plurality of output parameters.
Another aspect of the present disclosure includes a work machine. The work machine may include a power source configured to provide power to the work machine, a control system configured to control the power source, and a virtual sensor system. The virtual sensor system may include a first process model and a second process model. Further, the virtual sensor system may be configured to operate the first process model to generate values of a plurality of sensing parameters and to simultaneously operate the second model to generate estimated deviations between the values of the plurality of sensing parameters and desired values of the plurality of sensing parameters. The virtual sensor system may also be configured to compensate the values of the plurality of sensing parameters with the estimated deviations to generate the desired values of the plurality of sensing parameters. The control system may control the power source based upon the desired values of the plurality of sensing parameters.
Another aspect of the present disclosure includes a computer-readable medium for use on a computer system. The computer-readable medium may include computer-executable instructions for performing a method. The method may include establishing a first process model indicative of interrelationships between a plurality of input parameters and a plurality of output parameters and establishing a second process model indicative of interrelationships between at least the plurality of input parameters and modeling errors of the first process model. The method may also include operating the first process model to generate values of the plurality of output parameters and simultaneously operating the second model to generate estimated deviations between the values of the plurality of output parameters and desired values of the plurality of output parameters. Further, the method may include compensating the values of the plurality of output parameters with the estimated deviations to generate the desired values of the plurality of output parameters.
Reference will now be made in detail to exemplary embodiments, which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
Work machine 100 may also include an engine 110, an engine control module (ECM) 120, physical sensors 140 and 142, and a data link 150. Engine 110 may include any appropriate type of engine or power source that generates power for work machine 100, such as an internal combustion engine or fuel cell generator. ECM 120 may include any appropriate type of engine control system configured to perform engine control functions such that engine 110 may operate properly. ECM 120 may include any number of devices, such as microprocessors or microcontrollers, memory modules, communication devices, input/output devices, storage devices, etc., to perform such control functions. Further, ECM 120 may also control other systems of work machine 100, such as transmission systems and/or hydraulic systems. Computer software instructions may be stored in or loaded to ECM 120. ECM 120 may execute the computer software instructions to perform various control functions and processes.
ECM 120 may be coupled to data link 150 to receive data from and send data to other components, such as engine 110, physical sensors 140 and 142, virtual sensor system 130, and/or any other components (not shown) of work machine 100. Data link 150 may include any appropriate type of data communication medium, such as cable, wires, wireless radio, and/or laser, etc. Physical sensor 140 may include one or more sensors provided for measuring certain parameters of the work machine operating environment. For example, physical sensor 140 may include emission sensors for measuring emissions of work machine 100, such as Nitrogen Oxides (NOx), Sulfur Dioxide (SO2), Carbon Monoxide (CO), total reduced Sulfur (TRS), etc. In particular, NOx emission sensing and reduction may be important to normal operation of engine 110. Physical sensor 142, on the other hand, may include any appropriate sensors that are used inside engine 110 or other work machine components (not shown) to provide various measured parameters about engine 110 or other components, such as temperature, speed, etc.
Virtual sensor system 130 may include any appropriate type of control system having one or more process models. The process models may be trained to generate values of sensing parameters based on a plurality of measured parameters. The sensing parameters may refer to those measurement parameters that are directly measured by a particular physical sensor. For example, a physical NOx emission sensor may measure the NOx emission level of work machine 100 and provide values of NOx emission level, the sensing parameter, to other components, such as ECM 120. Sensing parameters, however, may also include any output parameters that may be measured indirectly by physical sensors and/or calculated based on readings of physical sensors.
On the other hand, the measured parameters may refer to any parameters relevant to the sensing parameters and indicative of the state of a component or components of work machine 100, such as engine 110. For example, for the sensing parameter NOx emission level, measured parameters may include various parameters such as compression ratios, turbocharger efficiency, aftercooler characteristics, temperature values, pressure values, ambient conditions, fuel rates, and engine speeds, etc.
Further, virtual sensor system 130 may be configured as a separate control system or, alternatively, may coincide with other control systems such as ECM 120.
As shown in
Processor 202 may include any appropriate type of general purpose microprocessor, digital signal processor, or microcontroller. Processor 202 may be configured as a separate processor module dedicated to controlling engine 110. Alternatively, processor 202 may be configured as a shared processor module for performing other functions unrelated to virtual sensors.
Memory module 204 may include one or more memory devices including, but not limited to, a ROM, a flash memory, a dynamic RAM, and a static RAM. Memory module 204 may be configured to store information used by processor 202. Database 206 may include any type of appropriate database containing information on characteristics of measured parameters, sensing parameters, mathematical models, and/or any other control information.
Further, I/O interface 208 may also be connected to data link 150 to obtain data from various sensors or other components (e.g., physical sensors 140 and 142) and/or to transmit data to these components and to ECM 120. Network interface 210 may include any appropriate type of network device capable of communicating with other computer systems based on one or more communication protocols. Storage 212 may include any appropriate type of mass storage provided to store any type of information that processor 202 may need to operate. For example, storage 212 may include one or more hard disk devices, optical disk devices, or other storage devices to provide storage space.
As explained above, virtual sensor system 130 may include one or more process models with error correction capabilities to provide values of certain sensing parameters to ECM 120.
As shown in
In certain embodiments, virtual sensor system 130 may provide levels of NOx emitted from an exhaust system (not shown) of work machine 100. Input parameters 302 may include any appropriate type of data associated with NOx emission levels. For example, input parameters 302 may include parameters that control operations of various response characteristics of engine 110 and/or parameters that are associated with conditions corresponding to the operations of engine 110. For example, input parameters 302 may include fuel injection timing, compression ratios, turbocharger efficiency, aftercooler characteristics, temperature values (e.g., intake manifold temperature), pressure values (e.g., intake manifold pressure), ambient conditions (e.g., ambient humidity), fuel rates, and engine speeds, etc. Other parameters, however, may also be included. Input parameters 302 may be measured by certain physical sensors, such as physical sensor 142, or created by other control systems such as ECM 120. Virtual sensor system 130 may obtain values of input parameters 302 via an input 310 coupled to data link 150.
On the other hand, output parameters 306 may correspond to sensing parameters. For example, output parameters 306 of a NOx virtual sensor may include NOx emission level, and/or any other types of output parameters used by NOx virtual sensing application. Output parameters 306 (e.g., NOx emission level) may be sent to ECM 120 via output 320 coupled to data link 150.
After virtual sensor process model 304 is established, values of input parameters 302 may be provided to virtual sensor process model 304 to generate values of output parameters 306 based on the given values of input parameters 302 and the interrelationships between input parameters 302 and output parameters 306 established by the virtual sensor process model 304. For example, virtual sensor system 130 may include a NOx virtual sensor to provide levels of NOx emitted from an exhaust system (not shown) of work machine 100.
Virtual sensor process model 304 may include any appropriate type of mathematical or physical model indicating interrelationships between input parameters 302 and output parameters 306. For example, virtual sensor process model 304 may be a neural network based mathematical model that is trained to capture interrelationships between input parameters 302 and output parameters 306. Other types of mathematic models, such as fuzzy logic models, linear system models, and/or non-linear system models, etc., may also be used. Virtual sensor process model 304 may be trained and validated using data records collected from a particular engine application for which virtual sensor process model 304 is established. That is, virtual sensor process model 304 may be established according to particular rules corresponding to a particular type of application using the data records, and the interrelationships of virtual sensor process model 304 may also be verified by using part of the data records.
After virtual sensor process model 304 is trained and validated, virtual sensor process model 304 may be optimized to define a desired input space of input parameters 302 and/or a desired distribution of output parameters 306. The validated or optimized virtual sensor process model 304 may be used to produce corresponding values of output parameters 306 when provided with a set of values of input parameters 302. In the above example, virtual sensor process model 304 may be used to produce NOx emission level based on measured parameters, such as ambient humidity, intake manifold pressure, intake manifold temperature, fuel rate, and engine speed, etc.
The establishment and operations of virtual sensor process model 304 may be carried out by processor 202 based on computer programs stored on and/or loaded to virtual sensor system 130. Alternatively, the establishment of virtual sensor process model 304 may be realized by other computer systems, such as ECM 120 or a separate general purpose computer configured to create process models.
Processor 202 may perform a virtual sensor process model generation and optimization process to generate and optimize virtual sensor process model 304.
As shown in
The data records may also be collected from experiments designed for collecting such data. Alternatively, the data records may be generated artificially by other related processes, such as other emission modeling or analysis processes. The data records may also include training data used to build virtual sensor process model 304 and testing data used to validate virtual sensor process model 304. In addition, the data records may also include simulation data used to observe and optimize virtual sensor process model 304.
The data records may reflect characteristics of input parameters 302 and output parameters 306, such as statistical distributions, normal ranges, and/or precision tolerances, etc. Once the data records are obtained (step 402), processor 202 may pre-process the data records to clean up obvious errors and to eliminate redundancies (step 404). For example, processor 202 may remove approximately identical data records and/or remove data records that fall outside a reasonable range, so that the remaining records are meaningful for model generation and optimization. After the data records have been pre-processed, processor 202 may select proper input parameters by analyzing the data records (step 406).
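As one illustration of this pre-processing step, the sketch below (a hypothetical implementation, assuming numeric records and known per-column valid ranges) drops out-of-range rows and then removes approximately identical rows:

```python
import numpy as np

def preprocess_records(records, valid_ranges, dup_tol=1e-6):
    """Clean raw data records: drop rows outside the valid per-column
    ranges, then drop approximately identical rows (within dup_tol)."""
    records = np.asarray(records, dtype=float)
    lo, hi = valid_ranges  # per-column (min, max) arrays
    in_range = np.all((records >= lo) & (records <= hi), axis=1)
    records = records[in_range]
    # Quantize to the duplicate tolerance and keep the first row of
    # each group of near-identical records.
    keys = np.round(records / dup_tol).astype(np.int64)
    _, first_idx = np.unique(keys, axis=0, return_index=True)
    return records[np.sort(first_idx)]
```

The tolerance `dup_tol` and the valid ranges are application-specific choices, not values taken from the disclosure.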
The data records may be associated with many input variables, such as variables corresponding to fuel injection timing, compression ratios, turbocharger efficiency, aftercooler characteristics, various temperature parameters, various pressure parameters, various ambient conditions, fuel rates, and engine speeds, etc. The number of input variables may be greater than the number of a particular set of input parameters 302 used for virtual sensor process model 304, that is, input parameters 302 may be a subset of the input variables. For example, input parameters 302 may include intake manifold temperature, intake manifold pressure, ambient humidity, fuel rate, and engine speed, etc., of the input variables.
A large number of input variables may significantly increase computational time during generation and operation of the mathematical models. The number of input variables may need to be reduced to create mathematical models within practical computational time limits. Additionally, in certain situations, the number of input variables in the data records may exceed the number of the data records and lead to sparse data scenarios. Some of the extra input variables may have to be omitted in certain mathematical models such that practical mathematical models may be created based on a reduced number of variables.
Processor 202 may select input parameters 302 from the input variables according to predetermined criteria. For example, processor 202 may choose input parameters 302 by experimentation and/or expert opinions. Alternatively, in certain embodiments, processor 202 may select input parameters based on a Mahalanobis distance between a normal data set and an abnormal data set of the data records. The normal data set and abnormal data set may be defined by processor 202 using any appropriate method. For example, the normal data set may include characteristic data associated with input parameters 302 that produce desired output parameters. On the other hand, the abnormal data set may include any characteristic data that may be out of tolerance or may need to be avoided. The normal data set and abnormal data set may be predefined by processor 202.
Mahalanobis distance may refer to a mathematical representation that may be used to measure data profiles based on correlations between parameters in a data set. Mahalanobis distance differs from Euclidean distance in that Mahalanobis distance takes into account the correlations of the data set. Mahalanobis distance of a data set X (e.g., a multivariate vector) may be represented as
MDi = (Xi − μx) Σ−1 (Xi − μx)′    (1)
where μx is the mean of X and Σ−1 is the inverse variance-covariance matrix of X. MDi weights the distance of a data point Xi from its mean μx such that observations that are on the same multivariate normal density contour have the same distance. Such observations may be used to identify and select correlated parameters from separate data groups having different variances.
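Equation (1) can be computed directly with NumPy; the sketch below (an illustrative implementation, not part of the disclosure) returns MDi for every row of a data set:

```python
import numpy as np

def mahalanobis_distances(X):
    """Per equation (1): MDi = (Xi - mu_x) Sigma^-1 (Xi - mu_x)',
    the squared Mahalanobis distance of each row Xi from the mean."""
    X = np.asarray(X, dtype=float)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))  # Sigma^-1
    diffs = X - mu
    # Row-wise quadratic form computed via einsum.
    return np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
```

A known identity provides a sanity check: with the sample covariance, the distances of n records in p dimensions sum to (n − 1)p.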
Processor 202 may select input parameters 302 as a desired subset of the input variables such that the Mahalanobis distance between the normal data set and the abnormal data set is maximized or optimized. A genetic algorithm may be used by processor 202 to search the input variables for the desired subset with the purpose of maximizing the Mahalanobis distance. Processor 202 may select a candidate subset of the input variables based on predetermined criteria and calculate a Mahalanobis distance MDnormal of the normal data set and a Mahalanobis distance MDabnormal of the abnormal data set. Processor 202 may also calculate the Mahalanobis distance between the normal data set and the abnormal data set (i.e., the deviation of the Mahalanobis distance MDx = MDnormal − MDabnormal). Other types of deviations, however, may also be used.
Processor 202 may select the candidate subset of input variables if the genetic algorithm converges (i.e., the genetic algorithm finds the maximized or optimized Mahalanobis distance between the normal data set and the abnormal data set corresponding to the candidate subset). If the genetic algorithm does not converge, a different candidate subset of input variables may be created for further searching. This searching process may continue until the genetic algorithm converges and a desired subset of input variables (e.g., input parameters 302) is selected.
Optionally, the Mahalanobis distance may also be used to reduce the number of data records by choosing a subset of the data records that achieves a desired Mahalanobis distance, as explained above.
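The variable-selection search above can be sketched with a mutation-only genetic algorithm over binary variable masks (crossover is omitted for brevity). The fitness function below is a hypothetical choice, scoring the mean Mahalanobis distance of the abnormal rows from the normal set under the selected columns; the disclosure does not prescribe a particular fitness:

```python
import numpy as np

def subset_fitness(mask, normal, abnormal):
    """Hypothetical fitness: mean Mahalanobis distance of the abnormal
    rows from the normal set, using only the variables mask selects."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return -np.inf
    N, A = normal[:, cols], abnormal[:, cols]
    cov = np.cov(N, rowvar=False).reshape(cols.size, cols.size)
    cov_inv = np.linalg.pinv(cov)
    d = A - N.mean(axis=0)
    return float(np.einsum('ij,jk,ik->i', d, cov_inv, d).mean())

def ga_select_variables(normal, abnormal, pop=20, gens=30, seed=0):
    """Tiny mutation-only genetic algorithm over binary masks (1 = keep)."""
    rng = np.random.default_rng(seed)
    P = rng.integers(0, 2, size=(pop, normal.shape[1]))
    for _ in range(gens):
        fit = np.array([subset_fitness(m, normal, abnormal) for m in P])
        elite = P[np.argsort(fit)[::-1][: pop // 2]]          # selection
        children = elite[rng.integers(0, len(elite), pop - len(elite))].copy()
        children[rng.random(children.shape) < 0.1] ^= 1        # mutation
        P = np.vstack([elite, children])
    fit = np.array([subset_fitness(m, normal, abnormal) for m in P])
    return P[int(np.argmax(fit))]
```

Because the top half of each generation is carried forward unmutated, the best mask found so far is never lost, which stands in for the convergence test described above.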
After selecting input parameters 302 (e.g., intake manifold temperature, intake manifold pressure, ambient humidity, fuel rate, and engine speed, etc.), processor 202 may generate virtual sensor process model 304 to build interrelationships between input parameters 302 and output parameters 306 (step 408). In certain embodiments, virtual sensor process model 304 may correspond to a computational model, such as, for example, a computational model built on any appropriate type of neural network. The type of neural network computational model that may be used may include back propagation, feed forward models, cascaded neural networks, and/or hybrid neural networks, etc. The particular type or structure of the neural network used may depend on the particular application. Other types of computational models, such as linear system or non-linear system models, etc., may also be used.
The neural network computational model (i.e., virtual sensor process model 304) may be trained by using selected data records. For example, the neural network computational model may include a relationship between output parameters 306 (e.g., NOx emission level, etc.) and input parameters 302 (e.g., intake manifold temperature, intake manifold pressure, ambient humidity, fuel rate, and engine speed, etc.). The neural network computational model may be evaluated by predetermined criteria to determine whether the training is completed. The criteria may include desired ranges of accuracy, time, and/or number of training iterations, etc.
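The training step can be pictured with a stand-in implementation: a minimal one-hidden-layer back-propagation network in plain NumPy, assuming a single output parameter and a fixed iteration budget as the completion criterion (both assumptions are illustrative, not from the disclosure):

```python
import numpy as np

def train_process_model(X, y, hidden=8, lr=0.1, epochs=3000, seed=0):
    """Minimal back-propagation training for a one-hidden-layer network;
    returns a callable mapping input records to a predicted output."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    y2 = y.reshape(-1, 1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                  # hidden activations
        err = (h @ W2 + b2) - y2                  # prediction error
        gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)        # backprop through tanh
        gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xn: (np.tanh(Xn @ W1 + b1) @ W2 + b2).ravel()
```

In practice the iteration budget would be replaced by the accuracy, time, or iteration-count criteria described above.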
After the neural network has been trained (i.e., the computational model has initially been established based on the predetermined criteria), processor 202 may statistically validate the computational model (step 410). Statistical validation may refer to an analyzing process to compare outputs of the neural network computational model with actual or expected outputs to determine the accuracy of the computational model. Part of the data records may be reserved for use in the validation process.
Alternatively, processor 202 may also generate simulation or validation data for use in the validation process. This may be performed either independently of a validation sample or in conjunction with the sample. Statistical distributions of inputs may be determined from the data records used for modeling. A statistical simulation, such as Latin Hypercube simulation, may be used to generate hypothetical input data records. These input data records are processed by the computational model, resulting in one or more distributions of output characteristics. The distributions of the output characteristics from the computational model may be compared to distributions of output characteristics observed in a population. Statistical quality tests may be performed on the output distributions of the computational model and the observed output distributions to ensure model integrity.
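A Latin Hypercube draw is straightforward to construct; the sketch below stratifies each input dimension into n equal bins and places exactly one sample in each (the parameter names and ranges are assumed for illustration only):

```python
import numpy as np

def latin_hypercube(n, bounds, rng):
    """Draw n Latin Hypercube samples; bounds is a list of (low, high)
    per input parameter. Each dimension is split into n equal strata,
    and exactly one sample falls in each stratum."""
    dims = len(bounds)
    cells = np.stack([rng.permutation(n) for _ in range(dims)], axis=1)
    unit = (cells + rng.random((n, dims))) / n        # samples in [0, 1)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + unit * (hi - lo)

# Hypothetical use: probe a model over plausible engine input ranges.
rng = np.random.default_rng(42)
bounds = [(20.0, 90.0),    # intake manifold temperature, assumed range
          (100.0, 300.0),  # intake manifold pressure, assumed range
          (10.0, 90.0)]    # ambient humidity, assumed range
samples = latin_hypercube(500, bounds, rng)
```

The resulting hypothetical input records would then be run through the computational model to obtain output distributions for the statistical quality tests described above.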
Once trained and validated, virtual sensor process model 304 may be used to predict values of output parameters 306 when provided with values of input parameters 302. Further, processor 202 may optimize virtual sensor process model 304 by determining desired distributions of input parameters 302 based on relationships between input parameters 302 and desired distributions of output parameters 306 (step 412).
Processor 202 may analyze the relationships between desired distributions of input parameters 302 and desired distributions of output parameters 306 based on particular applications. For example, processor 202 may select desired ranges for output parameters 306 (e.g., a desired NOx emission level or one within a certain predetermined range). Processor 202 may then run a simulation of the computational model to find a desired statistical distribution for an individual input parameter (e.g., one of intake manifold temperature, intake manifold pressure, ambient humidity, fuel rate, and engine speed, etc.). That is, processor 202 may separately determine a distribution (e.g., mean, standard deviation, etc.) of the individual input parameter corresponding to the normal ranges of output parameters 306. After determining respective distributions for all individual input parameters, processor 202 may combine the desired distributions for all the individual input parameters to determine desired distributions and characteristics for overall input parameters 302.
Alternatively, processor 202 may identify desired distributions of input parameters 302 simultaneously to maximize the possibility of obtaining desired outcomes. In certain embodiments, processor 202 may simultaneously determine desired distributions of input parameters 302 based on zeta statistic. Zeta statistic may indicate a relationship between input parameters, their value ranges, and desired outcomes. Zeta statistic may be represented as
where
Under certain circumstances,
Processor 202 may identify a desired distribution of input parameters 302 such that the zeta statistic of the neural network computational model (i.e., virtual sensor process model 304) is maximized or optimized. An appropriate type of genetic algorithm may be used by processor 202 to search for the desired distribution of input parameters 302 with the purpose of maximizing the zeta statistic. Processor 202 may select a candidate set of values of input parameters 302 with predetermined search ranges and run a simulation of virtual sensor process model 304 to calculate the zeta statistic parameters based on input parameters 302, output parameters 306, and the neural network computational model. Processor 202 may obtain
Processor 202 may select the candidate set of input parameters 302 if the genetic algorithm converges (i.e., the genetic algorithm finds the maximized or optimized zeta statistic of virtual sensor process model 304 corresponding to the candidate set of input parameters 302). If the genetic algorithm does not converge, a different candidate set of values of input parameters 302 may be created by the genetic algorithm for further searching. This searching process may continue until the genetic algorithm converges and a desired set of input parameters 302 is identified. Processor 202 may further determine desired distributions (e.g., mean and standard deviations) of input parameters 302 based on the desired input parameter set. Once the desired distributions are determined, processor 202 may define a valid input space that may include any input parameter within the desired distributions (step 414).
In one embodiment, statistical distributions of certain input parameters may be impossible or impractical to control. For example, an input parameter may be associated with a physical attribute of a device, such as a dimensional attribute of an engine part, or the input parameter may be associated with a constant variable within virtual sensor process model 304 itself. These input parameters may be used in the zeta statistic calculations to search or identify desired distributions for other input parameters corresponding to constant values and/or statistical distributions of these input parameters.
Further, optionally, more than one virtual sensor process model may be established. Multiple established virtual sensor process models may be simulated by using any appropriate type of simulation method, such as statistical simulation. Output parameters 306 based on simulation of these multiple virtual sensor process models may be compared to select a most-fit virtual sensor process model based on predetermined criteria, such as smallest variance with outputs from corresponding physical sensors, etc. The selected most-fit virtual sensor process model 304 may be deployed in virtual sensor applications.
As explained above, after virtual sensor process model 304 is trained, validated, and optimized, virtual sensor process model 304 may then be used by virtual sensor system 130 to predict output parameters 306. Further, ECM 120 and virtual sensor system 130 may provide control functions to relevant components of work machine 100. For example, ECM 120 may control engine 110 according to NOx emission level provided by virtual sensor system 130, and, in particular, by virtual sensor process model 304.
However, under certain circumstances, such as unrepresentative data records, discrete data records, and/or complexity of the process model, deviations may exist between output parameters 306 of virtual sensor process model 304 and output parameters of the physical sensor that is modeled by virtual sensor process model 304. Alternatively, the deviations may exist between values of output parameters 306 of virtual sensor process model 304 and corresponding desired output parameters predetermined by other software programs or users. These deviations may be referred to as modeling errors and may need to be corrected for accuracy. Other types of modeling errors, however, may also be corrected.
As shown in
The arrangement of virtual sensor process model 304 and error compensation process model 502 may also be in parallel.
The creations and operations of both the serial arrangement and the parallel arrangement of virtual sensor system 130 may be carried out by processor 202 via executing certain computer software programs.
As shown in
After the data records are available, processor 202 may create error compensation process model 502 using the data records (step 606). Error compensation process model 502 may be trained, validated, and/or optimized by using any appropriate method. For example, error compensation process model 502 may be trained by using the same process that may be used for virtual sensor process model 304. In the serial arrangement, data records of output 504 from virtual sensor process model 304 may be included in the inputs to error compensation process model 502. On the other hand, in the parallel arrangement, data records of output 504 may be used to derive data records on modeling errors but may be unavailable for training error compensation process model 502. However, the amount of computation in the parallel arrangement may be significantly reduced and/or the speed of calculation in the parallel arrangement may be increased.
After error compensation process model 502 is created, processor 202 may operate both virtual sensor process model 304 and error compensation process model 502 (step 608). In one embodiment, both process models may be operated simultaneously. The operations may generate output 504 from virtual sensor process model 304 and output 506 from error compensation process model 502. Processor 202 may also provide output 504 and output 506 to logic 508. Processor 202 may also operate logic 508 to compensate output 504 with output 506 (i.e., model error compensation) to generate more accurate output parameters 306 (step 610).
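The compensation performed by logic 508 can be pictured with a small sketch of the parallel arrangement; the two lambda models below are hypothetical stand-ins for trained models:

```python
import numpy as np

class CompensatedVirtualSensor:
    """Parallel-arrangement sketch: the process model and the error
    model both receive the same inputs; the compensation step adds the
    estimated deviation (output 506) to the raw prediction (output 504)."""
    def __init__(self, process_model, error_model):
        self.process_model = process_model   # inputs -> raw outputs (504)
        self.error_model = error_model       # inputs -> estimated deviations (506)

    def predict(self, inputs):
        raw = self.process_model(inputs)       # output 504
        deviation = self.error_model(inputs)   # output 506
        return raw + deviation                 # compensated output parameters 306

# Hypothetical demonstration: the process model is biased low by 0.5,
# and the error model has learned to estimate exactly that deviation.
sensor = CompensatedVirtualSensor(lambda x: x ** 2 - 0.5,
                                  lambda x: np.full_like(x, 0.5))
```

Calling `sensor.predict` on a batch of inputs then yields the compensated values; in the serial arrangement, the error model would additionally receive output 504 as one of its inputs.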
Processor 202 may also present the compensated output parameters 306 to other control systems, such as ECM 120 (step 612). For example, processor 202 or virtual sensor system 130 may provide output parameters 306, such as the NOx emission level, to ECM 120. ECM 120 may obtain output parameters 306 (e.g., NOx emission level) via data link 150. After ECM 120 obtains the NOx emission level from virtual sensor system 130, ECM 120 may control engine 110 and/or other components of work machine 100 based on the NOx emission level. For example, ECM 120 may perform certain emission enhancing or minimization processes.
The disclosed methods and systems may provide efficient and accurate virtual sensor process models that can cover the entire range of input and/or output spaces. Such technology may be used in a wide range of virtual sensors, such as sensors for engines, structures, environments, and materials. In particular, the disclosed systems and methods provide practical solutions when process models are difficult to build using other techniques due to computational complexities and limitations of available data records.
The disclosed methods and systems may be used in combination with other process modeling techniques to significantly increase speed, accuracy, practicality, and/or flexibility. Other applications involving process modeling may also benefit from the disclosed methods and systems. Conventional techniques such as boosting and bagging may be replaced and/or improved by using the disclosed methods and systems.
The disclosed systems and methods may also be used by work machine manufacturers to reduce cost and increase reliability by replacing costly or failure-prone physical sensors. Reliability and flexibility may also be improved by the disclosed virtual sensor system. The disclosed virtual sensor techniques may be used to provide a wide range of parameters in components such as emission, engine, transmission, navigation, and/or control systems. Further, parts of the disclosed system or steps of the disclosed method may also be used by computer system providers to facilitate or integrate other process models. For example, certain computer design software manufacturers may integrate the disclosed methods and systems to improve the performance of various computer-based design software programs.
Other embodiments, features, aspects, and principles of the disclosed exemplary systems will be apparent to those skilled in the art and may be implemented in various environments and systems.
Number | Name | Date | Kind |
---|---|---|---|
3316395 | Lavin | Apr 1967 | A |
4136329 | Trobert | Jan 1979 | A |
4533900 | Muhlberger et al. | Aug 1985 | A |
5014220 | McMann et al. | May 1991 | A |
5163412 | Neu et al. | Nov 1992 | A |
5262941 | Saladin et al. | Nov 1993 | A |
5341315 | Niwa et al. | Aug 1994 | A |
5386373 | Keeler et al. | Jan 1995 | A |
5434796 | Weininger | Jul 1995 | A |
5539638 | Keeler et al. | Jul 1996 | A |
5548528 | Keeler et al. | Aug 1996 | A |
5561610 | Schricker et al. | Oct 1996 | A |
5566091 | Schricker et al. | Oct 1996 | A |
5585553 | Schricker | Dec 1996 | A |
5594637 | Eisenberg et al. | Jan 1997 | A |
5598076 | Neubauer et al. | Jan 1997 | A |
5604306 | Schricker | Feb 1997 | A |
5604895 | Raimi | Feb 1997 | A |
5608865 | Midgely et al. | Mar 1997 | A |
5666297 | Britt et al. | Sep 1997 | A |
5682317 | Keeler et al. | Oct 1997 | A |
5698780 | Mizutani et al. | Dec 1997 | A |
5727128 | Morrison | Mar 1998 | A |
5750887 | Schricker | May 1998 | A |
5752007 | Morrison | May 1998 | A |
5835902 | Jannarone | Nov 1998 | A |
5842202 | Kon | Nov 1998 | A |
5914890 | Sarangapani et al. | Jun 1999 | A |
5925089 | Fujime | Jul 1999 | A |
5950147 | Sarangapani et al. | Sep 1999 | A |
5966312 | Chen | Oct 1999 | A |
5987976 | Sarangapani | Nov 1999 | A |
6086617 | Waldon et al. | Jul 2000 | A |
6092016 | Sarangapani et al. | Jul 2000 | A |
6119074 | Sarangapani | Sep 2000 | A |
6145066 | Atkin | Nov 2000 | A |
6195648 | Simon et al. | Feb 2001 | B1 |
6199007 | Zavarehi et al. | Mar 2001 | B1 |
6208982 | Allen, Jr. et al. | Mar 2001 | B1 |
6223133 | Brown | Apr 2001 | B1 |
6236908 | Cheng et al. | May 2001 | B1 |
6240343 | Sarangapani et al. | May 2001 | B1 |
6269351 | Black | Jul 2001 | B1 |
6298718 | Wang | Oct 2001 | B1 |
6370544 | Krebs et al. | Apr 2002 | B1 |
6405122 | Yamaguchi | Jun 2002 | B1 |
6438430 | Martin et al. | Aug 2002 | B1 |
6442511 | Sarangapani et al. | Aug 2002 | B1 |
6477660 | Sohner | Nov 2002 | B1 |
6513018 | Culhane | Jan 2003 | B1 |
6546379 | Hong et al. | Apr 2003 | B1 |
6584768 | Hecker et al. | Jul 2003 | B1 |
6594989 | Hepburn et al. | Jul 2003 | B1 |
6698203 | Wang | Mar 2004 | B2 |
6711676 | Zomaya et al. | Mar 2004 | B1 |
6721606 | Kaji et al. | Apr 2004 | B1 |
6725208 | Hartman et al. | Apr 2004 | B1 |
6763708 | Ting et al. | Jul 2004 | B2 |
6775647 | Evans et al. | Aug 2004 | B1 |
6785604 | Jacobson | Aug 2004 | B2 |
6810442 | Lin et al. | Oct 2004 | B1 |
6823675 | Brunell et al. | Nov 2004 | B2 |
6859770 | Ramsey | Feb 2005 | B2 |
6859785 | Case | Feb 2005 | B2 |
6865883 | Gomulka | Mar 2005 | B2 |
6882929 | Liang et al. | Apr 2005 | B2 |
6895286 | Kaji et al. | May 2005 | B2 |
6935313 | Jacobson | Aug 2005 | B2 |
6941287 | Vaidyanathan et al. | Sep 2005 | B1 |
6952662 | Wegerich et al. | Oct 2005 | B2 |
6976062 | Denby et al. | Dec 2005 | B1 |
7000229 | Gere | Feb 2006 | B2 |
7024343 | El-Ratal | Apr 2006 | B2 |
7027953 | Klein | Apr 2006 | B2 |
7035834 | Jacobson | Apr 2006 | B2 |
7117079 | Streichsbier et al. | Oct 2006 | B2 |
7124047 | Zhang et al. | Oct 2006 | B2 |
7127892 | Akins et al. | Oct 2006 | B2 |
7136716 | Hsiung et al. | Nov 2006 | B2 |
7139619 | Martin et al. | Nov 2006 | B2 |
7149262 | Nayar et al. | Dec 2006 | B1 |
7161566 | Cok et al. | Jan 2007 | B2 |
7167583 | Lipson et al. | Jan 2007 | B1 |
7174284 | Dolansky et al. | Feb 2007 | B2 |
7178328 | Solbrig | Feb 2007 | B2 |
7184036 | Dimsdale et al. | Feb 2007 | B2 |
7191161 | Rai et al. | Mar 2007 | B1 |
7194392 | Tuken et al. | Mar 2007 | B2 |
7197398 | Azari | Mar 2007 | B2 |
7203629 | Ozis et al. | Apr 2007 | B2 |
7213007 | Grichnik | May 2007 | B2 |
7215430 | Kacyra et al. | May 2007 | B2 |
7218973 | Johnson et al. | May 2007 | B2 |
7244930 | Nelson et al. | Jul 2007 | B2 |
7263425 | Bleile et al. | Aug 2007 | B2 |
7272530 | Hsiung et al. | Sep 2007 | B2 |
7272575 | Vega | Sep 2007 | B2 |
7285772 | Labous et al. | Oct 2007 | B2 |
7313447 | Hsiung et al. | Dec 2007 | B2 |
7317938 | Lorenz et al. | Jan 2008 | B2 |
7319942 | Hatfield et al. | Jan 2008 | B2 |
7324867 | Dinauer et al. | Jan 2008 | B2 |
7356377 | Schwarm | Apr 2008 | B2 |
7356393 | Schlatre et al. | Apr 2008 | B1 |
7363319 | Cappellini | Apr 2008 | B2 |
7366244 | Gebara et al. | Apr 2008 | B2 |
7369925 | Morioka et al. | May 2008 | B2 |
7407799 | Balagadde et al. | Aug 2008 | B2 |
7415312 | Barnett et al. | Aug 2008 | B2 |
7424069 | Nicholls et al. | Sep 2008 | B1 |
7444190 | Pflugl et al. | Oct 2008 | B2 |
20020014294 | Okano et al. | Feb 2002 | A1 |
20020016701 | Duret et al. | Feb 2002 | A1 |
20020042784 | Kerven et al. | Apr 2002 | A1 |
20020049704 | Vanderveldt et al. | Apr 2002 | A1 |
20020103996 | LeVasseur et al. | Aug 2002 | A1 |
20020198821 | Munoz | Dec 2002 | A1 |
20030018503 | Shulman | Jan 2003 | A1 |
20030055607 | Wegerich et al. | Mar 2003 | A1 |
20030093250 | Goebel | May 2003 | A1 |
20030126053 | Boswell et al. | Jul 2003 | A1 |
20030126103 | Chen et al. | Jul 2003 | A1 |
20030130855 | Babu et al. | Jul 2003 | A1 |
20030167354 | Peppers et al. | Sep 2003 | A1 |
20030187567 | Sulatisky et al. | Oct 2003 | A1 |
20030187584 | Harris | Oct 2003 | A1 |
20030200296 | Lindsey | Oct 2003 | A1 |
20040030420 | Ulyanov et al. | Feb 2004 | A1 |
20040034857 | Mangino et al. | Feb 2004 | A1 |
20040059518 | Rothschild | Mar 2004 | A1 |
20040077966 | Yamaguchi et al. | Apr 2004 | A1 |
20040122702 | Sabol et al. | Jun 2004 | A1 |
20040122703 | Walker et al. | Jun 2004 | A1 |
20040128058 | Andres et al. | Jul 2004 | A1 |
20040135677 | Asam | Jul 2004 | A1 |
20040138995 | Hershkowitz et al. | Jul 2004 | A1 |
20040153227 | Hagiwara et al. | Aug 2004 | A1 |
20040230404 | Messmer et al. | Nov 2004 | A1 |
20040267818 | Hartenstine | Dec 2004 | A1 |
20050047661 | Mauer | Mar 2005 | A1 |
20050055176 | Clarke et al. | Mar 2005 | A1 |
20050091093 | Bhaskaran et al. | Apr 2005 | A1 |
20050209943 | Ballow et al. | Sep 2005 | A1 |
20050210337 | Chester et al. | Sep 2005 | A1 |
20050240539 | Olavson | Oct 2005 | A1 |
20050261791 | Chen et al. | Nov 2005 | A1 |
20050262031 | Saidi et al. | Nov 2005 | A1 |
20050278227 | Esary et al. | Dec 2005 | A1 |
20050278432 | Feinleib et al. | Dec 2005 | A1 |
20060010057 | Bradway et al. | Jan 2006 | A1 |
20060010142 | Kim et al. | Jan 2006 | A1 |
20060010157 | Dumitrascu et al. | Jan 2006 | A1 |
20060025897 | Shostak et al. | Feb 2006 | A1 |
20060026270 | Sadovsky et al. | Feb 2006 | A1 |
20060026587 | Lemarroy et al. | Feb 2006 | A1 |
20060064474 | Feinleib et al. | Mar 2006 | A1 |
20060068973 | Kappauf et al. | Mar 2006 | A1 |
20060129289 | Kumar et al. | Jun 2006 | A1 |
20060130052 | Allen et al. | Jun 2006 | A1 |
20060229753 | Seskin et al. | Oct 2006 | A1 |
20060229769 | Grichnik et al. | Oct 2006 | A1 |
20060229852 | Grichnik et al. | Oct 2006 | A1 |
20060229854 | Grichnik et al. | Oct 2006 | A1 |
20060230018 | Grichnik et al. | Oct 2006 | A1 |
20060230097 | Grichnik et al. | Oct 2006 | A1 |
20060230313 | Grichnik et al. | Oct 2006 | A1 |
20060241923 | Xu et al. | Oct 2006 | A1 |
20060247798 | Subbu et al. | Nov 2006 | A1 |
20070061144 | Grichnik et al. | Mar 2007 | A1 |
20070094048 | Grichnik | Apr 2007 | A1 |
20070094181 | Tayebnejad et al. | Apr 2007 | A1 |
20070118338 | Grichnik et al. | May 2007 | A1 |
20070124237 | Sundararajan et al. | May 2007 | A1 |
20070150332 | Grichnik et al. | Jun 2007 | A1 |
20070168494 | Liu et al. | Jul 2007 | A1 |
20070179769 | Grichnik et al. | Aug 2007 | A1 |
20070203864 | Grichnik | Aug 2007 | A1 |
20080154811 | Grichnik et al. | Jun 2008 | A1 |
Number | Date | Country |
---|---|---|
1103926 | May 2001 | EP |
1367248 | Dec 2003 | EP |
1418481 | May 2004 | EP |
10-332621 | Dec 1998 | JP |
11-351045 | Dec 1999 | JP |
2002-276344 | Sep 2002 | JP |
WO9742581 | Nov 1997 | WO |
WO02057856 | Jul 2002 | WO |
WO2006017453 | Feb 2006 | WO |
Number | Date | Country
---|---|---
20070203864 A1 | Aug 2007 | US