Modern Challenges in Inverse Problems: A Biased View

Matthias Chung, Virginia Tech
Abstract

Emerging fields such as data analytics, machine learning, and uncertainty quantification rely heavily on efficient computational methods for solving inverse problems. With growing model complexity and ever-increasing data volumes, state-of-the-art inference methods have exceeded their limits of applicability, and novel methods are urgently needed. In this talk, we discuss modern challenges in inverse problems and introduce novel approaches to overcome them. For instance, we discuss massive least squares problems, where the size of the forward model exceeds the storage capacity of computer memory, or the data is simply not available all at once. Here, randomized methods may be used to approximate solutions. We introduce sampled limited-memory approaches, where an approximation of the global curvature of the underlying least squares problem is used to speed up initial convergence while automatically addressing potential ill-posedness. We further discuss how deep neural networks may benefit inversion processes and how uncertainty quantification may be improved by incorporating surrogate data. Numerical experiments such as super-resolution, tomographic reconstruction, and deblurring will illustrate the presented approaches.
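The abstract does not spell out the randomized methods it alludes to. As a hedged illustration of the general idea only (not the speaker's specific algorithm), the following minimal "sketch-and-solve" example approximates the solution of a large least squares problem by solving a small row-sampled subproblem; all names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sketch_and_solve(A, b, m):
    """Approximate argmin_x ||Ax - b||_2 by solving a subproblem
    built from m uniformly sampled rows of (A, b). This stands in
    for settings where the full A never fits in memory at once."""
    idx = rng.choice(A.shape[0], size=m, replace=False)
    x_hat, *_ = np.linalg.lstsq(A[idx], b[idx], rcond=None)
    return x_hat

# Synthetic massive (tall) least squares problem with mild noise.
n, d = 10_000, 5
x_true = rng.standard_normal(d)
A = rng.standard_normal((n, d))
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Solve using only 200 of the 10,000 rows.
x_hat = sketch_and_solve(A, b, m=200)
```

In practice, sampling schemes, sketching matrices, and regularization for ill-posedness vary widely; this sketch only shows why a small random subproblem can already recover a good approximate solution.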

Bluejeans Link: