Reverse-differentiation of shared-memory parallel code

Jan Huckelheim
Seminar

Adjoint derivatives reveal the sensitivity of a computer program's output to changes in its inputs. These derivatives are useful, for example, in gradient-based optimization. Algorithmic differentiation (AD) is an established method for transforming a given computation into its corresponding adjoint computation. A key challenge in this process is the efficiency of the resulting adjoint computation. This becomes especially pressing with the increasing use of shared-memory parallelism on multi- and many-core architectures, for which AD support is currently insufficient.

This talk will give an overview of challenges and solutions for the differentiation of shared-memory-parallel code, using two examples: an unstructured-mesh CFD solver, and a structured-mesh stencil kernel, both parallelized with OpenMP. The methods shown in this talk result in AD-generated adjoint solvers that scale as well as their underlying original solvers on CPUs and a Knights Corner (KNC) Xeon Phi.
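A core difficulty hinted at above can be sketched in a few lines (an illustrative example under my own assumptions, not code from the talk): in a parallel stencil loop each iteration only reads the shared input array, so the forward loop is race-free; in the adjoint, those reads become increments of the sensitivity array, and neighbouring iterations then write to the same entry. One common remedy is to protect the increments, e.g. with OpenMP atomics, though this can cost performance.

```c
#define N 1000

/* Forward 1-D stencil: each iteration READS x[i] and x[i+1].
   Concurrent reads are safe, so this parallelizes trivially. */
void stencil(const double *x, double *y, int n) {
    #pragma omp parallel for
    for (int i = 0; i < n - 1; i++)
        y[i] = x[i] + x[i + 1];
}

/* Adjoint of the stencil: every read of x becomes an INCREMENT
   of xb. Iterations i and i+1 both update xb[i+1], which is a
   data race unless the updates are made atomic. */
void stencil_adj(double *xb, const double *yb, int n) {
    #pragma omp parallel for
    for (int i = 0; i < n - 1; i++) {
        #pragma omp atomic
        xb[i]     += yb[i];
        #pragma omp atomic
        xb[i + 1] += yb[i];
    }
}

/* Self-check with yb = 1 everywhere: boundary entries of xb
   receive one contribution, interior entries receive two. */
int check_stencil_adjoint(void) {
    static double xb[N], yb[N];
    for (int i = 0; i < N; i++) { xb[i] = 0.0; yb[i] = 1.0; }
    stencil_adj(xb, yb, N);
    return xb[0] == 1.0 && xb[5] == 2.0 && xb[N - 1] == 1.0;
}
```

The code compiles and runs serially without OpenMP (the pragmas are then ignored); with OpenMP enabled, the atomics make the adjoint loop correct at the price of serialized updates, which is one reason scaling the adjoint as well as the primal is nontrivial.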