After following this course, the student is able to:
- numerically solve elliptic PDEs with random parameters (Forward problems)
- infer system parameters via Bayesian approaches (Inverse problems)
- use sampling techniques to estimate characteristic outputs of interest (Uncertainty propagation)
- build data-driven surrogate models for multi-query simulations (Acceleration)
- quantify uncertainties in the research of their own discipline (Interdisciplinary understanding)
Uncertainty quantification aims at synthesizing probability, statistics, model development, mathematical and numerical analysis, large-scale simulations, experiments, and domain sciences into a computational framework for quantifying input and response uncertainties, so that predictions can be made with quantified and reduced uncertainty. More recently, data-driven modeling has benefited from the powerful tools of machine learning and offers new perspectives for physics-based simulation science, laying the foundation of physics-informed machine learning.
In this course, we will discuss both the forward propagation of uncertainties from inputs to responses and the inverse estimation of system parameters through Bayesian frameworks. To accelerate the multi-query simulations required by uncertainty quantification, the major surrogate modeling techniques will be introduced as well. This course covers the basics of uncertainty quantification and is closely connected to the research frontier of computational science. Through this course, students are expected to acquire the competence to independently solve uncertainty quantification problems in their own research disciplines.
Main topics include:
- Fundamentals of probability and statistics
- Numerical discretization for PDEs with random parameters
- Sampling techniques: Monte Carlo, importance sampling, and subset simulation
- Bayesian inference and its applications
- Gaussian processes for surrogate modeling
- Basics of projection-based reduced order modeling
- Multi-fidelity surrogate modeling
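As a taste of the sampling techniques listed above, plain Monte Carlo estimation of an expectation can be sketched in a few lines of Python (an illustrative snippet, not part of the course materials; the output function `f` and the input distribution are hypothetical choices):

```python
import numpy as np

# Plain Monte Carlo estimation of E[f(X)] for a random input X ~ N(0, 1),
# together with the standard error estimate whose slow O(n^{-1/2}) decay
# motivates variance-reduction techniques such as importance sampling
# and subset simulation.
rng = np.random.default_rng(0)

def f(x):
    # Hypothetical scalar output of interest.
    return np.sin(x) ** 2

n = 100_000
samples = rng.standard_normal(n)
values = f(samples)

estimate = values.mean()                     # Monte Carlo estimate of E[f(X)]
std_error = values.std(ddof=1) / np.sqrt(n)  # statistical error, O(n^{-1/2})
```

For this particular `f`, the expectation has the closed form (1 - e^{-2})/2, so the Monte Carlo estimate can be checked against the exact value.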
Assessment: Oral Exam (65%) and Final Project Report (35%)
Assumed previous knowledge
- Solid knowledge in
- linear algebra, e.g., via AM module 1 202001325, CS module 3 202001205, BMT module 3 202001203, EE module 4 202001209, ME module 4 202001210, or their equivalent,
- vector calculus, e.g., via AM module 3 202001229, BMT module 5 202001225, EE module 3 202001231, ME module 5 202001228, or their equivalent, and
- probability theory, e.g., via AM module 4 202001344, CS module 4 202001233, EE module 8 202001235, or their equivalent.
- Programming experience in MATLAB or Python.
- Working knowledge of numerical analysis, e.g., via Numerical Mathematics (202001356) or its equivalent.
- Knowledge of machine learning, e.g., via Machine Learning I (201600070) or Deep learning – From Theory to Practice (201800177).
- Knowledge of numerical methods for PDEs, e.g., via Numerical Techniques for PDE (191551150) or Fundamentals of Numerical Methods (201900074).
Study programmes:
- Master Applied Mathematics
- Master Biomedical Engineering
- Master Electrical Engineering
- Master Mechanical Engineering
- Master Technical Medicine

Required materials:
- Lecture notes, available digitally.
- Uncertainty Quantification: Theory, Implementation, and Applications - R.C. Smith. ISBN 9781611973211.
- Gaussian Processes for Machine Learning - C.E. Rasmussen and C.K.I. Williams. ISBN 9780262182539.
- Reduced Basis Methods for Partial Differential Equations: An Introduction - A. Quarteroni, A. Manzoni and F. Negri. ISBN 9783319154305.