Abstract
Continual learning, the capability to learn new knowledge from streaming data without forgetting previously acquired knowledge, is a critical requirement for dynamic learning systems, especially for emerging edge devices such as self-driving cars and drones. However, continual learning still faces the catastrophic forgetting problem. Previous work illustrates that model performance on continual learning depends not only on the learning algorithm but also strongly on the inherited model, i.e., the model from which continual learning starts. The better the stability of the inherited model, the less catastrophic forgetting occurs; thus, the inherited model should be carefully selected. Inspired by this finding, we develop an evolutionary neural architecture search (ENAS) algorithm that emphasizes the Stability of the inherited model, namely ENAS-S. ENAS-S aims to find optimal architectures for accurate continual learning on edge devices. On CIFAR-10 and CIFAR-100, we show that ENAS-S achieves competitive architectures with lower catastrophic forgetting and smaller model size when learning from a data stream, as compared with handcrafted DNNs.
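To make the high-level idea concrete, the following is a minimal, hypothetical sketch of an evolutionary search whose fitness rewards accuracy while penalizing forgetting and model size, in the spirit of the stability-aware objective the abstract describes. All names, the architecture encoding, and the toy scoring functions are illustrative assumptions, not the paper's actual method.

```python
import random

random.seed(0)

# Hypothetical search space: an architecture is a list of layer widths.
WIDTHS = [16, 32, 64, 128]

def random_arch(depth=4):
    return [random.choice(WIDTHS) for _ in range(depth)]

def mutate(arch):
    # Point mutation: resample one layer's width.
    child = arch[:]
    i = random.randrange(len(child))
    child[i] = random.choice(WIDTHS)
    return child

def toy_accuracy(arch):
    # Stand-in for validation accuracy after continual learning (not a real metric).
    return sum(arch) / (len(arch) * max(WIDTHS))

def toy_forgetting(arch):
    # Stand-in for a forgetting/stability measure (pure illustration).
    return 1.0 - min(arch) / max(WIDTHS)

def toy_size_penalty(arch):
    # Stand-in for model-size cost on an edge device.
    return sum(arch) / (len(arch) * max(WIDTHS))

def fitness(arch, lam=0.5, mu=0.2):
    # Stability-aware fitness: reward accuracy, penalize forgetting and size.
    return toy_accuracy(arch) - lam * toy_forgetting(arch) - mu * toy_size_penalty(arch)

def evolve(pop_size=8, generations=10):
    # Simple (mu + lambda)-style evolution: keep the fitter half, refill by mutation.
    pop = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [
            mutate(random.choice(survivors))
            for _ in range(pop_size - len(survivors))
        ]
    return max(pop, key=fitness)

best = evolve()
```

In a real ENAS-style system, the toy scoring functions would be replaced by training each candidate on a task stream and measuring its accuracy and forgetting directly.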