TITLE: Statistical Adjustment, Calibration, and Uncertainty Quantification of Complex Computer Models
STUDENT: Huan Yan
ADVISOR: Roshan Vengazhiyil
SUMMARY:
This thesis consists of three chapters, covering topics ranging from the adjustment, calibration, and uncertainty quantification of computer models to their applications in cardiac cell modeling and uncertainty propagation in a machining process. The first chapter systematically develops an engineering-driven statistical adjustment and calibration framework, the second chapter is devoted to the calibration of a potassium current model in a cardiac cell, and the third chapter proposes a novel emulator for approximating complex computer models in the analysis of uncertainty propagation.
Engineering model development involves several simplifying assumptions made for the purpose of mathematical tractability, which are often not realistic in practice. This leads to discrepancies in the model predictions. A commonly used statistical approach to overcome this problem is to build a statistical model for the discrepancies between the engineering model and the observed data. In contrast, an engineering approach would be to find the causes of the discrepancy and fix the engineering model using first principles. However, the engineering approach is time-consuming, whereas the statistical approach is fast. The drawback of the statistical approach is that it treats the engineering model as a black box, and therefore the statistically adjusted models lack physical interpretability. In the first chapter, we propose a new framework for model calibration and statistical adjustment. It tries to open up the black box using simple main-effects analysis and graphical plots, and introduces statistical models inside the engineering model. This approach leads to simpler adjustment models that are physically more interpretable. The approach is illustrated using a model for predicting the cutting forces in a laser-assisted mechanical micromachining process and a model for predicting the outlet air temperature in a fluidized-bed process.
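To make the contrast concrete, the following is a minimal Python sketch of the conventional black-box adjustment that this chapter moves away from. The engineering model eng_model, the synthetic data, and all numerical values are illustrative assumptions, not taken from the thesis; a Gaussian process (via scikit-learn) is fitted to the model-data discrepancy so that the adjusted prediction is the engineering model plus the fitted correction.

    # Sketch: black-box statistical adjustment of an engineering model.
    # The engineering model eng_model(x) is treated as given; a Gaussian
    # process is fitted to the residuals (observed - predicted) so that
    # adjusted predictions are eng_model(x) + delta_hat(x).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def eng_model(x):
        # placeholder first-principles model (illustrative only)
        return 2.0 * x + 0.5

    rng = np.random.default_rng(0)
    x_obs = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
    y_obs = 2.0 * x_obs.ravel() + np.sin(4.0 * x_obs.ravel()) + rng.normal(0, 0.05, 20)

    # Fit a GP to the model-data discrepancy.
    residual = y_obs - eng_model(x_obs.ravel())
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp.fit(x_obs, residual)

    # Adjusted prediction at new inputs: engineering model + GP correction.
    x_new = np.array([[0.25], [0.75]])
    delta_hat, delta_sd = gp.predict(x_new, return_std=True)
    y_adj = eng_model(x_new.ravel()) + delta_hat

Because the correction is attached to the output of the engineering model as a whole, it carries no physical meaning; the framework proposed in the chapter instead places statistical models inside the engineering model, where the discrepancy originates.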
The second chapter studies the calibration of a complex computer model that describes the potassium currents in a cardiac cell. The computer model is expensive to evaluate and contains 24 unknown calibration parameters, which makes the problem very challenging for traditional kriging-based calibration methods. We propose physics-driven strategies for the approximation and calibration of this large, complex model. Another difficulty with this problem is the presence of large cell-to-cell variation, which is modeled through random effects. We propose approximate Bayesian methods for the identification and estimation of the parameters in this complex nonlinear mixed-effects statistical model.
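The following toy sketch illustrates, under heavy simplification, the kind of nonlinear mixed-effects calibration described above. Here current_model is a hypothetical one-parameter stand-in for the potassium-current simulator, the cell-to-cell random effect is marginalized by simple Monte Carlo averaging, and a random-walk Metropolis step samples the calibration parameter; none of these modeling choices are taken from the thesis itself.

    # Sketch: approximate Bayesian calibration with cell-to-cell random effects.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 1.0, 25)

    def current_model(t, theta, b):
        # illustrative surrogate for the expensive potassium-current simulator
        return theta * (1.0 - np.exp(-(1.0 + b) * t))

    # Synthetic recordings from several cells with random effects b_i ~ N(0, 0.1^2).
    theta_true, sigma_b, sigma_e = 2.0, 0.1, 0.05
    cells = [current_model(t, theta_true, rng.normal(0, sigma_b))
             + rng.normal(0, sigma_e, t.size) for _ in range(5)]

    def log_lik(theta, n_mc=200):
        # Marginalize the per-cell random effect by Monte Carlo averaging.
        total = 0.0
        for y in cells:
            b = rng.normal(0, sigma_b, n_mc)
            resid = y[None, :] - np.array([current_model(t, theta, bi) for bi in b])
            dens = np.exp(-0.5 * np.sum(resid**2, axis=1) / sigma_e**2)
            total += np.log(dens.mean() + 1e-300)
        return total

    # Random-walk Metropolis over the calibration parameter theta (flat prior).
    theta, samples = 1.0, []
    ll = log_lik(theta)
    for _ in range(2000):
        prop = theta + rng.normal(0, 0.05)
        ll_prop = log_lik(prop)
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        samples.append(theta)

With 24 calibration parameters and an expensive simulator, this naive approach is infeasible, which is what motivates the physics-driven approximation and approximate Bayesian strategies developed in the chapter.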
Monte Carlo and quasi-Monte Carlo methods can be slow if the computer model is computationally expensive. An easy-to-evaluate metamodel is usually used in place of the computer model to improve computational efficiency. However, traditional metamodels can perform poorly for some complex systems, e.g., the solid end milling process. In the third chapter, we develop a new emulator in which a base function captures the general trend of the system and a discrepancy term accounts for the remaining characteristics. We also propose an optimal experimental design based on the local input space for fitting the emulator. We call the proposed emulator the local base emulator. In prediction, the simulation outputs with the parameters at their nominal values are used to evaluate the base function, and a linear function and a Gaussian process are used to model the discrepancy. Through the solid end milling example, we show that the local base emulator is an efficient and accurate technique for uncertainty analysis and has clear advantages over other traditional tools.
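A minimal sketch of the base-plus-discrepancy structure described above is given below. The simulator, the nominal parameter values, and the small local design are all hypothetical placeholders rather than the thesis's solid end milling model; the discrepancy is fitted with a linear model here (a Gaussian process could be substituted), and the resulting emulator is pushed through a Monte Carlo loop for uncertainty propagation.

    # Sketch: "base + discrepancy" emulator for uncertainty propagation.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    def simulator(x, p):
        # stand-in for an expensive simulation code (illustrative only)
        return np.sin(3.0 * x) * (1.0 + 0.5 * p[0]) + 0.2 * p[1] * x

    x_grid = np.linspace(0.0, 1.0, 50)
    p_nominal = np.array([0.0, 0.0])
    base = simulator(x_grid, p_nominal)          # base function at nominal values

    # Small local design around the nominal parameters to fit the discrepancy.
    rng = np.random.default_rng(2)
    design = rng.uniform(-0.1, 0.1, size=(12, 2))
    disc = np.array([simulator(x_grid, p) - base for p in design])

    # One linear discrepancy model per output location (a GP could be used instead).
    models = [LinearRegression().fit(design, disc[:, j]) for j in range(x_grid.size)]

    def emulate(p):
        # emulator prediction: base function + fitted discrepancy at parameter p
        return base + np.array([m.predict(p.reshape(1, -1))[0] for m in models])

    # Uncertainty propagation: push Monte Carlo parameter samples through the emulator.
    p_samples = rng.normal(0.0, 0.05, size=(1000, 2))
    outputs = np.array([emulate(p) for p in p_samples])
    mean_out, sd_out = outputs.mean(axis=0), outputs.std(axis=0)

The key design idea is that the base function already carries the dominant behavior of the system, so the discrepancy surface is much simpler than the full response and can be fitted accurately from a small, locally designed set of simulation runs.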