When Do We Stop SGD?

Vivak Patel, University of Wisconsin–Madison

In this talk, we will motivate the need for stopping criteria for stochastic gradient descent (SGD); subsequently, we will highlight and address the four main challenges in the rigorous development of such criteria: strong convergence, detectability, controlling false positive rates, and controlling false negative rates. We note that the result on strong convergence is currently the most general result of its kind for SGD on nonconvex functions, and its proof employs techniques that may be of independent interest to those who study stochastic algorithms.
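For context, the sketch below shows the kind of ad hoc stopping heuristic that motivates the need for rigorous criteria: halt SGD when a moving average of squared stochastic gradient norms falls below a tolerance. Everything here is an illustrative assumption (the toy least-squares problem, the tolerance tol, the averaging weight beta), not the criteria developed in the talk; the noise in the stochastic gradients is exactly what makes such a rule prone to stopping too early (a false positive) or never triggering (a false negative).

```python
import numpy as np

# Illustrative only: a naive stopping heuristic for SGD, NOT the
# rigorously justified criteria from the talk. All parameters below
# (step, beta, tol) are assumptions chosen for this toy example.

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize E[0.5 * (a^T x - b)^2]
# over rows (a, b) of a synthetic dataset.
A = rng.normal(size=(1000, 5))
x_true = rng.normal(size=5)
b = A @ x_true + 0.1 * rng.normal(size=1000)

def stochastic_grad(x):
    """Gradient of 0.5 * (a_i^T x - b_i)^2 at a uniformly sampled i."""
    i = rng.integers(len(A))
    return (A[i] @ x - b[i]) * A[i]

x = np.zeros(5)
step = 0.01   # constant step size
beta = 0.99   # weight for the exponential moving average
tol = 5e-2    # stopping tolerance on the averaged squared norm
ema = None    # exponential moving average of ||g||^2

for t in range(100_000):
    g = stochastic_grad(x)
    sq = g @ g
    ema = sq if ema is None else beta * ema + (1 - beta) * sq
    # Naive rule: stop once the averaged squared gradient norm is small.
    # Because g is noisy, this can fire spuriously (false positive) or,
    # if the noise floor exceeds tol, never fire at all (false negative).
    if ema < tol:
        print(f"stopped at iteration {t}, ema ||g||^2 = {ema:.2e}")
        break
    x -= step * g
else:
    print(f"budget exhausted without stopping, ema ||g||^2 = {ema:.2e}")
```

Either outcome of running this script, an early stop or an exhausted budget, illustrates why detectability and control of false positive and false negative rates must be analyzed rather than tuned by hand.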

Please use this link to attend the virtual seminar:

https://bluejeans.com/820394850