Randomization

Part of the variability in the data is not captured by the deterministic model. We therefore treat the errors as random variables, which are naturally defined in a probabilistic framework. The lack of knowledge about the input parameters, which prescribe the constitutive laws of the system, is also represented probabilistically (Smith 2013, chapter 1; Le Maître and Knio 2010, chapter 1). Thus, in this new setup, both the parameters and the errors of the model are random variables.
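To fix notation (a generic sketch only; the specific deterministic model of the protocol is not restated in this section), write the deterministic model as $y(t) = f(t; \theta)$. Its randomized counterpart is

$$Y(t) = f(t; \Theta) + E(t),$$

where $\Theta$ collects the random input parameters and $E(t)$ is the random error term, both defined on a common probability space.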

The field of uncertainty quantification studies the impact of random uncertainties on models (Smith 2013). This quantification is necessary to evaluate the discrepancies between the model predictions and the actual system behavior. Inverse uncertainty quantification deals with inferring the probability distributions of the parameters from the data; these distributions are not, in general, independent. Forward uncertainty quantification extracts the main statistical content of the model output once the probability distributions of the parameters are fixed (Le Maître and Knio 2010; Xiu 2010). The various stages of the probabilistic modeling process are schematically illustrated in Fig. 2. Inverse uncertainty quantification is not an easy task; here we rely on the Bayesian bootstrap technique. Forward uncertainty quantification is conducted via Monte Carlo simulation, as sketched below.
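The following Python sketch illustrates how the two stages can be combined. It is a minimal example under illustrative assumptions: the model f, the synthetic data, the initial guess, and the number of replicates B are placeholders, not the protocol's actual choices. Inverse uncertainty quantification is done with the Bayesian bootstrap (random Dirichlet weights on the observations, one weighted fit per replicate), and the resulting joint parameter samples are then propagated through the model by Monte Carlo.

```python
# Minimal sketch: Bayesian bootstrap for inverse UQ, Monte Carlo for forward UQ.
# The model f and the data are hypothetical placeholders.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

def f(t, theta):
    # Hypothetical deterministic model; replace with the system's constitutive law.
    a, b = theta
    return a * np.exp(-b * t)

# Hypothetical observed data.
t_data = np.linspace(0.0, 5.0, 20)
y_data = f(t_data, (2.0, 0.7)) + rng.normal(scale=0.1, size=t_data.size)

# --- Inverse UQ: Bayesian bootstrap ---
B = 1000                                   # number of bootstrap replicates (assumed)
n = t_data.size
theta_samples = np.empty((B, 2))
for b in range(B):
    w = rng.dirichlet(np.ones(n))          # random weights on the observations
    fit = least_squares(
        lambda th: np.sqrt(w) * (f(t_data, th) - y_data),  # weighted residuals
        x0=(1.0, 1.0),
    )
    theta_samples[b] = fit.x               # one draw of the joint parameter vector

# --- Forward UQ: Monte Carlo propagation of the parameter samples ---
t_new = np.linspace(0.0, 8.0, 50)
outputs = np.array([f(t_new, th) for th in theta_samples])
mean_pred = outputs.mean(axis=0)                        # pointwise mean of the output
lo, hi = np.percentile(outputs, [2.5, 97.5], axis=0)    # 95% probability band
```

Because each replicate fits all parameters jointly, the sample of fitted vectors retains the dependence between the parameter distributions noted above.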

Fig. 2. Schematic illustration of the various stages of the probabilistic modeling process.
