Multiobjective Hyperparameter Optimization for Deep Learning Interatomic Potential Training Using NSGA-II

Publication Type: Conference Paper
Book Title: International Conference on Parallel Processing Workshops (ICPP-W2023)
Page Numbers: 1–8
Publisher Location: New York, United States of America
Conference Name: The Third International Workshop on Parallel and Distributed Algorithms for Decision Sciences (PDADS)
Conference Location: Salt Lake City, Utah, United States of America
Conference Sponsor: Oak Ridge National Laboratory

Deep neural network (DNN) potentials are an emerging tool for simulating dynamical atomistic systems, promising quantum mechanical accuracy at speedups of 10,000×. As with other DNN methods, the hyperparameters used during training can make a substantial difference in model accuracy, and the optimal settings vary with the dataset. To enable rapid hyperparameter tuning for DNN potential training, we developed a scalable multiobjective evolutionary optimization algorithm for supercomputers and tested it on the Summit system at the Oak Ridge Leadership Computing Facility (OLCF). A multiobjective approach is required because the potential is defined by two coupled learned quantities, the energy and the force. Using a large-scale implementation of the NSGA-II algorithm adapted for training DNN potentials, we discovered several Pareto-optimal hyperparameter combinations, including the best choices of activation function, learning-rate scaling scheme, and pairing of the two radial cutoffs used in the three-dimensional descriptor function.
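The search the abstract describes pairs NSGA-II's two core mechanisms, fast non-dominated sorting and crowding-distance selection, with a two-objective fitness (energy error, force error) obtained by training a candidate DNN potential. Below is a minimal, self-contained Python sketch of that selection machinery on a toy problem. Everything in it is an assumption for illustration: the search space, the cutoff ranges, and the evaluate() stand-in are not the paper's actual objectives or hyperparameter grid, and for brevity the loop uses mutation only, whereas full NSGA-II also applies crossover. The paper's implementation is a large-scale parallel code run on Summit, not this script.

```python
import random

# Illustrative search space; not the paper's actual hyperparameter ranges.
ACTIVATIONS = ["tanh", "silu", "gelu"]
LR_SCHEMES = ["constant", "exponential", "cosine"]

def random_individual():
    return {
        "activation": random.choice(ACTIVATIONS),
        "lr_scheme": random.choice(LR_SCHEMES),
        "cutoff_a": random.uniform(2.0, 8.0),  # radial cutoffs (assumed ranges)
        "cutoff_b": random.uniform(2.0, 8.0),
    }

def evaluate(ind):
    """Stand-in for training a DNN potential and measuring validation
    (energy error, force error). A real run would launch a training job
    per candidate; those evaluations parallelize trivially across nodes."""
    e = abs(ind["cutoff_a"] - 6.0) + (0.1 if ind["activation"] == "silu" else 0.3)
    f = abs(ind["cutoff_b"] - 4.0) + (0.1 if ind["lr_scheme"] == "cosine" else 0.3)
    return (e, f)

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_sort(fits):
    """Fast non-dominated sort; returns fronts of indices, front 0 = Pareto set."""
    dominated, n_dom, fronts = [set() for _ in fits], [0] * len(fits), [[]]
    for i, fi in enumerate(fits):
        for j, fj in enumerate(fits):
            if dominates(fi, fj):
                dominated[i].add(j)
            elif dominates(fj, fi):
                n_dom[i] += 1
        if n_dom[i] == 0:
            fronts[0].append(i)
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated[i]:
                n_dom[j] -= 1
                if n_dom[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

def crowding(front, fits):
    """Crowding distance within a front, used to keep the front spread out."""
    dist = {i: 0.0 for i in front}
    for m in range(len(fits[0])):
        order = sorted(front, key=lambda i: fits[i][m])
        dist[order[0]] = dist[order[-1]] = float("inf")  # keep boundary points
        span = (fits[order[-1]][m] - fits[order[0]][m]) or 1.0
        for lo, cur, hi in zip(order, order[1:], order[2:]):
            dist[cur] += (fits[hi][m] - fits[lo][m]) / span
    return dist

def mutate(ind):
    child, key = dict(ind), random.choice(list(ind))
    if key == "activation":
        child[key] = random.choice(ACTIVATIONS)
    elif key == "lr_scheme":
        child[key] = random.choice(LR_SCHEMES)
    else:  # perturb a cutoff, clamped to the assumed range
        child[key] = min(max(child[key] + random.gauss(0.0, 0.5), 2.0), 8.0)
    return child

POP, GENS = 24, 30
pop = [random_individual() for _ in range(POP)]
for _ in range(GENS):
    combined = pop + [mutate(random.choice(pop)) for _ in range(POP)]
    fits = [evaluate(ind) for ind in combined]
    survivors = []
    for front in nondominated_sort(fits):  # fill survivors front by front
        if len(survivors) + len(front) <= POP:
            survivors += front
        else:  # tie-break the last admitted front by crowding distance
            d = crowding(front, fits)
            survivors += sorted(front, key=d.get, reverse=True)[: POP - len(survivors)]
            break
    pop = [combined[i] for i in survivors]

for i in nondominated_sort([evaluate(p) for p in pop])[0]:
    print(pop[i], "->", evaluate(pop[i]))
```

In a real run, each evaluate() call is a full training job, so the population's evaluations dominate the cost and can be distributed across compute nodes; the sorting and selection steps shown here are negligible by comparison, which makes the algorithm a natural fit for supercomputer-scale tuning.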