Hyperparameter Optimization

Prasanna Balaprakash, Argonne National Laboratory
Webinar | Beginner
AI for Science training

Scientific data sets are diverse and often require data-set-specific deep neural network (DNN) models. However, designing a high-performing DNN architecture for a given data set is an expert-driven, time-consuming, trial-and-error manual task. To address this, we have developed DeepHyper, a software package that uses scalable neural architecture and hyperparameter search to automate the design and development of DNN models for scientific and engineering applications. In this session, trainees will learn how to use DeepHyper for the scalable, automated development of deep neural networks.
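As background for what hyperparameter search means, the sketch below implements the simplest possible strategy, random search, in pure Python. It does not use DeepHyper's API; the objective function is a stand-in for an expensive DNN training run, and the hyperparameter names and ranges are illustrative assumptions. DeepHyper replaces this brute-force loop with scalable, model-based search over the same kind of configuration space.

```python
import random

def objective(lr, units):
    """Hypothetical score (e.g., validation accuracy) for a configuration.

    In practice this would train and evaluate a DNN; the quadratic form
    here is purely illustrative, peaking at lr=0.01, units=64.
    """
    return -((lr - 0.01) ** 2) * 1e4 - ((units - 64) ** 2) * 1e-3

def random_search(n_trials, seed=0):
    """Evaluate n_trials random configurations; return the best one found."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {
            "lr": 10 ** rng.uniform(-4, -1),   # log-uniform learning rate
            "units": rng.randrange(16, 129),   # hidden-layer width
        }
        score = objective(cfg["lr"], cfg["units"])
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

best_cfg, best_score = random_search(200)
```

Each trial is independent, which is why this kind of search parallelizes well on HPC resources; the gain from tools like DeepHyper comes from choosing the next configurations intelligently rather than uniformly at random.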

Time: February 10, 3-5 p.m. US CT

This session is part of the ALCF AI for Science Training Series.

About the Speakers

Prasanna Balaprakash is a computer scientist in the Mathematics and Computer Science Division with a joint appointment in the Leadership Computing Facility at Argonne National Laboratory. His research interests span artificial intelligence, machine learning, optimization, and high-performance computing. Currently, his research focuses on the development of scalable, data-efficient machine learning methods for scientific applications. He is a recipient of the U.S. Department of Energy's 2018 Early Career Award. He is the machine-learning team lead and data-understanding team co-lead in RAPIDS, the SciDAC Computer Science Institute. Prior to Argonne, he served as Chief Technology Officer at Mentis Sprl, a machine learning startup in Brussels, Belgium. He received his PhD from CoDE-IRIDIA (AI Lab), Université Libre de Bruxelles, Brussels, Belgium, where he was a recipient of Marie Curie and F.R.S.-FNRS Aspirant fellowships.