Inference Design for the Uncertainty Quantification of Extreme-Scale Simulations

Ludger Paehler, Technical University of Munich

The uncertainty quantification of state-of-the-art numerical solvers places significant computational demands on users, even on nascent (pre-)exascale systems. Because Bayesian inference lies at its core, significant computational resources have to be committed to the inference routine, which tends to be computationally intractable for large-scale fluid mechanics models, where a single sample can require up to 600k CPUh. This forces us to exploit potential model hierarchies to the fullest extent possible. Yet multifidelity methods still tend to rely on random sampling at their core, wasting resources where we could sample more effectively. Rapid advances in fields adjacent to uncertainty quantification, such as machine-learning-based design, experimental design, combinatorial optimization, sequential decision making, and reinforcement learning at scale, enable us to pose this as the design of an inference routine with intelligent sampling agents at its core. These agents gain an understanding of the task and problem structure in their attention-based policy networks, enabling inference and uncertainty quantification at extreme scales.

In this talk, I will present an approach to this problem which, based on a graph network representation of the inference routine in combination with a Transformer-based placement network, orchestrates and designs the inference routine while amortizing the cost of the large-scale reinforcement learning training across later inference studies and Transformer trainings, in which it acts as an active learning agent.
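
To make the idea of an attention-based sampling policy concrete, the sketch below shows a toy placement network in PyTorch. It is a hypothetical illustration, not the implementation presented in the talk: the class name PlacementPolicy, the node features (fidelity, cost, variance, sample count), and all dimensions are assumptions chosen only to show how a Transformer encoder could score the nodes of an inference-routine graph and decide where the next sample is spent.

    import torch
    import torch.nn as nn

    # Hypothetical sketch: an attention-based policy that scores candidate
    # sampling placements in a multifidelity inference routine. Each node of
    # the routine's graph representation is encoded as a small feature vector.
    class PlacementPolicy(nn.Module):
        def __init__(self, node_dim: int = 4, d_model: int = 32, n_heads: int = 4):
            super().__init__()
            self.embed = nn.Linear(node_dim, d_model)
            layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
            self.score = nn.Linear(d_model, 1)

        def forward(self, nodes: torch.Tensor) -> torch.Tensor:
            # nodes: (batch, n_nodes, node_dim) -> probability of drawing the
            # next sample from each node (i.e. each model in the hierarchy).
            h = self.encoder(self.embed(nodes))
            return torch.softmax(self.score(h).squeeze(-1), dim=-1)

    # Example with three fidelities; features = [fidelity, cost, variance, n_samples].
    nodes = torch.tensor([[[0.0, 0.01, 0.9, 10.0],
                           [0.5, 0.10, 0.5, 4.0],
                           [1.0, 1.00, 0.2, 1.0]]])
    probs = PlacementPolicy()(nodes)          # where to spend the next sample
    next_model = torch.argmax(probs, dim=-1)  # in practice trained via reinforcement learning

In the approach described in the talk, such a policy would not be trained from scratch for each study; the cost of the reinforcement learning training is amortized across later inference studies, where the network also serves as an active learning agent.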

Bluejeans Link: https://bluejeans.com/797313739/6893