
Evolutionary NAS in Light of Model Stability for Accurate Continual Learning

by Xiaocong Du, Zheng Li, Jingbo Sun, Frank Y Liu, Yu Cao
Publication Type
Conference Paper
Book Title
Proceedings of 2021 International Joint Conference on Neural Networks (IJCNN)
Publication Date
2021
Publisher Location
New Jersey, United States of America
Conference Name
International Joint Conference on Neural Networks (IJCNN)
Conference Location
Virtual, Tennessee, United States of America
Conference Sponsor
IEEE
Conference Date
-

Continual learning, the capability to learn new knowledge from streaming data without forgetting previously acquired knowledge, is a critical requirement for dynamic learning systems, especially for emerging edge devices such as self-driving cars and drones. However, continual learning still faces the problem of catastrophic forgetting. Previous work illustrates that model performance in continual learning depends not only on the learning algorithm but also strongly on the inherited model, i.e., the model from which continual learning starts. The better the stability of the inherited model, the less catastrophic forgetting occurs; the inherited model should therefore be carefully selected. Inspired by this finding, we develop an evolutionary neural architecture search (ENAS) algorithm that emphasizes the Stability of the inherited model, namely ENAS-S. ENAS-S aims to find optimal architectures for accurate continual learning on edge devices. On CIFAR-10 and CIFAR-100, we show that ENAS-S discovers competitive architectures with lower catastrophic forgetting and smaller model size when learning from a data stream, compared with handcrafted DNNs.
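
To make the general idea concrete, below is a minimal sketch (not from the paper) of an evolutionary architecture search whose fitness combines task accuracy with a stability score. The search-space encoding, the `evaluate` stub, and the `alpha` weighting are all illustrative assumptions; the paper's actual operators, encoding, and stability metric may differ.

```python
import random

# Assumed toy search space: an architecture is a list of per-block
# (channel width, depth) choices. Purely illustrative, not the paper's encoding.
WIDTHS = [16, 32, 64]
DEPTHS = [1, 2, 3]
N_BLOCKS = 4

def random_arch():
    return [(random.choice(WIDTHS), random.choice(DEPTHS)) for _ in range(N_BLOCKS)]

def mutate(arch):
    # Re-sample the choices of one randomly picked block.
    child = list(arch)
    i = random.randrange(N_BLOCKS)
    child[i] = (random.choice(WIDTHS), random.choice(DEPTHS))
    return child

def evaluate(arch):
    """Stand-in for training on the base task and measuring (a) accuracy and
    (b) stability, e.g. accuracy retained after a short continual-learning
    probe. Both are mocked here with deterministic pseudo-scores."""
    rng = random.Random(hash(tuple(arch)))
    accuracy = rng.uniform(0.6, 0.95)
    stability = rng.uniform(0.5, 0.9)  # fraction of accuracy retained (mocked)
    return accuracy, stability

def fitness(arch, alpha=0.5):
    # Weighted combination of accuracy and stability; alpha is an assumed knob.
    acc, stab = evaluate(arch)
    return (1 - alpha) * acc + alpha * stab

def evolve(pop_size=20, generations=10, alpha=0.5):
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda a: fitness(a, alpha), reverse=True)
        parents = population[: pop_size // 2]  # truncation selection
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda a: fitness(a, alpha))

if __name__ == "__main__":
    best = evolve()
    print("best architecture:", best, "fitness:", round(fitness(best), 3))
```

Truncation selection with random mutation is just one common evolutionary scheme; the key point the sketch illustrates is that the stability term enters the fitness directly, so architectures prone to forgetting are selected against during the search.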