Evolutionary Search for Complete Neural Network Architectures With Partial Weight Sharing

Neural architecture search (NAS) provides an automated solution for designing network architectures. Unfortunately, directly searching for complete task-dependent network architectures is laborious, since training and evaluating complete neural architectures over a large search space is computationally prohibitive. Recently, one-shot NAS (OSNAS) has attracted great attention in the NAS community because it significantly speeds up candidate architecture evaluation through weight sharing. However, the full weight sharing training paradigm in OSNAS may cause strong interference across candidate architectures and mislead the architecture search. To alleviate this problem, we propose a partial weight sharing OSNAS framework that directly evolves complete neural network architectures. In particular, we propose a novel node representation scheme that randomly activates a subset of the nodes of the one-shot model in each generation, reducing the weight coupling within the one-shot model. During the evolutionary search, a tailored crossover operator randomly samples nodes from two parent individuals or from a single parent to construct new candidate architectures, thus effectively constraining the degree of weight sharing. Furthermore, we introduce a new mutation operator that replaces chosen nodes of the one-shot model with randomly generated nodes to enhance exploration. Finally, we encode a set of pyramidal convolution operations in the search space, enabling the evolved neural networks to capture different levels of detail in the images. The proposed method is evaluated against 26 state-of-the-art algorithms on ten image classification tasks, including the CIFAR series, CINIC10, ImageNet, and the MedMNIST series. The experimental results demonstrate that the proposed method finds, at a much lower computational cost, neural architectures whose classification accuracy is comparable to that of state-of-the-art designs.
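To make the search operators concrete, the sketch below shows one plausible reading of the node-level crossover and mutation described above, assuming each candidate architecture is encoded as a fixed-length list of nodes (here, operation identifiers). The operation set, architecture length, and probabilities are illustrative assumptions rather than the paper's exact implementation.

```python
import random

# Hypothetical operation set; the paper's search space also encodes
# pyramidal convolutions, sketched here as a plain string identifier.
OPS = ["conv3x3", "conv5x5", "pyconv", "sep_conv", "skip", "max_pool"]
NUM_NODES = 8  # assumed fixed number of nodes per architecture


def random_architecture(rng: random.Random) -> list[str]:
    """Sample a complete candidate architecture as a list of nodes."""
    return [rng.choice(OPS) for _ in range(NUM_NODES)]


def crossover(parent_a: list[str], parent_b: list[str],
              rng: random.Random, p_single_parent: float = 0.2) -> list[str]:
    """Node-level crossover: each node of the child is drawn from the two
    parents, or the child is copied from a single parent, which limits how
    many candidate architectures share any given node's weights."""
    if rng.random() < p_single_parent:
        return list(rng.choice([parent_a, parent_b]))
    return [rng.choice([a, b]) for a, b in zip(parent_a, parent_b)]


def mutate(arch: list[str], rng: random.Random,
           p_mutate: float = 0.1) -> list[str]:
    """Mutation: replace chosen nodes with freshly generated ones to
    enhance the exploratory capability of the search."""
    return [rng.choice(OPS) if rng.random() < p_mutate else op for op in arch]


if __name__ == "__main__":
    rng = random.Random(0)
    pa, pb = random_architecture(rng), random_architecture(rng)
    child = mutate(crossover(pa, pb, rng), rng)
    print(pa, pb, child, sep="\n")
```

In this reading, partial weight sharing arises because a child inherits each node, and hence that node's weights in the one-shot model, from at most two parents, rather than all candidates sharing all weights.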