
We consider the effects of neural network architecture in the setting of continual learning. Using dynamic programming, we solve a bilevel optimization problem to determine the optimal architecture for the current and previous task data, followed by the optimal weights for the network.
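The bilevel structure described above can be illustrated in miniature: an outer level searches over a model-capacity choice (a stand-in for architecture), and an inner level solves for the optimal weights on the union of previous- and current-task data. The sketch below is purely illustrative and is not the speaker's method: it uses polynomial degree as a toy "architecture," synthetic regression tasks, and a simple complexity penalty, all of which are assumptions.

```python
# Hedged sketch of a bilevel model-selection loop (illustrative only).
# Outer level: choose "architecture" (polynomial degree).
# Inner level: solve for optimal weights in closed form by least squares
# on the combined previous-task and current-task data.

def design(xs, degree):
    # Polynomial feature matrix: one row per sample, columns 1, x, x^2, ...
    return [[x ** p for p in range(degree + 1)] for x in xs]

def solve_normal_equations(A, y):
    # Solve A^T A w = A^T y by Gauss-Jordan elimination.
    # The normal matrix here is positive definite, so no pivoting is needed.
    n = len(A[0])
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(n)]
         for i in range(n)]
    b = [sum(A[k][i] * y[k] for k in range(len(A))) for i in range(n)]
    for i in range(n):
        piv = M[i][i]
        for j in range(i, n):
            M[i][j] /= piv
        b[i] /= piv
        for r in range(n):
            if r != i:
                f = M[r][i]
                for j in range(i, n):
                    M[r][j] -= f * M[i][j]
                b[r] -= f * b[i]
    return b

def inner_loss(xs, ys, degree):
    # Inner problem: best weights for this architecture, training SSE.
    A = design(xs, degree)
    w = solve_normal_equations(A, ys)
    preds = [sum(w[j] * row[j] for j in range(len(w))) for row in A]
    return sum((p - t) ** 2 for p, t in zip(preds, ys))

# Toy tasks (hypothetical data): previous task y = x, current task y = x^2.
xs = [i / 10 for i in range(-10, 11)]
data = [(x, x) for x in xs] + [(x, x * x) for x in xs]
all_x = [p[0] for p in data]
all_y = [p[1] for p in data]

# Outer problem: pick the degree minimizing inner loss plus a crude
# complexity penalty (a stand-in for the architecture search).
best = min(range(1, 5), key=lambda d: inner_loss(all_x, all_y, d) + 0.5 * d)
print("selected degree:", best)
```

Here the inner problem has a closed-form solution; in the neural-network setting of the talk, the inner weight optimization would itself be iterative, which is what makes the bilevel formulation nontrivial.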
Bio: Allyson Hahn is a mathematics PhD candidate at Northern Illinois University, where she studies geometric function theory on quasiconformally homogeneous domains. She will be defending her dissertation in October 2025. In Summer 2024 and Summer 2025, she served as a PhD research aide under the advisement of Dr. Krishnan Raghavan in the Mathematics and Computer Science Division at Argonne National Laboratory. Additionally, Allyson is a part-time adjunct faculty member in the mathematics department at North Central College.
See all upcoming talks at https://www.anl.gov/mcs/lans-seminars