Outer-loop problems arising in scientific applications (such as optimization, uncertainty quantification, and inverse problems) require repeated evaluation of computationally intensive numerical models, such as those arising from the discretization and solution of ordinary and partial differential equations. The cost of these evaluations makes direct use of the full model prohibitive, so efficient, accurate surrogates are key to solving these problems in practice. In this talk I will investigate how informed subspaces of the input and output spaces can be used to construct parsimonious encoder-decoder neural networks whose weight dimensions can be made independent of the discretization dimension. These compact representations require relatively little data to train and outperform conventional data-driven approaches, which require large training data sets.
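To illustrate the dimension-independence claim, here is a minimal numpy sketch of such an encoder-decoder surrogate. The bases V and W stand in for "informed" input and output subspaces (e.g., obtained from POD or active-subspace analysis; here they are just random orthonormal columns), and the small dense core network is hypothetical, not the speaker's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

N_in, N_out = 10_000, 10_000   # discretization dimensions (can grow)
r, s = 20, 15                  # informed subspace dimensions (fixed)

# Stand-ins for informed bases; in practice these would come from
# e.g. POD or derivative-informed subspace analysis.
V, _ = np.linalg.qr(rng.standard_normal((N_in, r)))   # input subspace basis
W, _ = np.linalg.qr(rng.standard_normal((N_out, s)))  # output subspace basis

# Small dense network acting only on reduced coordinates.
h = 64
W1 = rng.standard_normal((r, h)) * 0.1
W2 = rng.standard_normal((h, s)) * 0.1

def surrogate(u):
    z = V.T @ u                 # encode: R^{N_in} -> R^r
    a = np.tanh(z @ W1) @ W2    # trainable core: R^r -> R^s
    return W @ a                # decode: R^s -> R^{N_out}

# The trainable weight count depends only on (r, h, s),
# not on the discretization sizes N_in and N_out.
n_weights = W1.size + W2.size
print(n_weights)  # 2240, unchanged if N_in or N_out is refined
```

Only the core weights (W1, W2) are trained; the encoder and decoder are fixed projections, which is why the trainable parameter count stays constant under mesh refinement.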
Please use this link to attend the virtual seminar: