Single Path One-Shot (SPOS): Efficient Neural Architecture Search with Simplified Supernet

01/08/2024

Episode Synopsis

The paper introduces Single Path One-Shot (SPOS), a Neural Architecture Search (NAS) method that decouples supernet training from architecture search. The supernet is simplified so that each training iteration activates only a single path, sampled uniformly at random from the candidate operations in each layer; the trained supernet then serves as a shared-weight proxy for evaluating candidate architectures. The same framework extends to channel (width) search and mixed-precision quantization, leading to architectures that are both accurate and resource-efficient.
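As a rough illustration of the training stage, the sketch below is a PyTorch-style toy; names such as ChoiceBlock, SinglePathSupernet, and train_step are invented for illustration and are not taken from the paper's released code. Each step samples one candidate operation per layer uniformly at random, so only that single path is executed and updated.

```python
import random
import torch.nn as nn

class ChoiceBlock(nn.Module):
    """One supernet layer holding several candidate operations ("choices")."""
    def __init__(self, choices):
        super().__init__()
        self.choices = nn.ModuleList(choices)

    def forward(self, x, idx):
        # Only the sampled choice runs, so per-step compute and memory match
        # an ordinary single-path network rather than the whole supernet.
        return self.choices[idx](x)

class SinglePathSupernet(nn.Module):
    def __init__(self, blocks):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)

    def sample_path(self):
        # Uniform sampling: every candidate in every block is equally likely.
        return [random.randrange(len(b.choices)) for b in self.blocks]

    def forward(self, x, path):
        for block, idx in zip(self.blocks, path):
            x = block(x, idx)
        return x

def train_step(supernet, optimizer, loss_fn, images, labels):
    path = supernet.sample_path()   # resample a fresh architecture each step
    loss = loss_fn(supernet(images, path), labels)
    optimizer.zero_grad()
    loss.backward()                 # only the sampled path's weights receive gradients
    optimizer.step()
    return loss.item()
```

Because every weight is trained only when its path is sampled, and sampling is uniform, all candidate operations see roughly equal amounts of training, which is what makes the shared weights usable for comparing architectures later.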

SPOS addresses a limitation of earlier weight-sharing NAS methods, whose coupled training and search can bias the supernet weights, by training the single-path supernet first and then running an evolutionary algorithm over the trained supernet to find architectures that meet accuracy and resource constraints, with the same machinery applied to channel search and mixed-precision quantization. In the reported experiments, the discovered architectures match or exceed prior methods in accuracy at comparable complexity and resource cost, and the accuracy an architecture obtains with inherited supernet weights correlates well with its performance when trained individually, which is what makes the supernet an efficient, reliable proxy during search.
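The search stage can be pictured as a plain evolutionary loop over path encodings, as in the schematic sketch below. Here accuracy_fn and flops_fn are assumed placeholders for supernet-based validation accuracy and a resource estimator, and the population size, generation count, and mutation probability are illustrative rather than the paper's settings.

```python
import random

def evolutionary_search(accuracy_fn, num_blocks, num_choices,
                        population_size=50, generations=20, mutation_prob=0.1,
                        flops_fn=None, flops_limit=None):
    """Search over path encodings (one choice index per block).

    accuracy_fn(path) should return the supernet-evaluated validation accuracy;
    flops_fn(path) is an optional resource estimate checked against flops_limit.
    """
    def random_path():
        return [random.randrange(num_choices) for _ in range(num_blocks)]

    def feasible(path):
        return (flops_fn is None or flops_limit is None
                or flops_fn(path) <= flops_limit)

    # Initial population: random feasible paths, scored with the shared supernet weights.
    population = [p for p in (random_path() for _ in range(4 * population_size))
                  if feasible(p)][:population_size]
    scored = [(accuracy_fn(p), p) for p in population]

    for _ in range(generations):
        scored.sort(key=lambda t: t[0], reverse=True)
        scored = scored[:population_size]                      # keep the fittest
        parents = [p for _, p in scored[:population_size // 2]]
        children = []
        while len(children) < population_size:
            a, b = random.sample(parents, 2)
            # Crossover: each block's choice comes from one of the two parents.
            child = [random.choice(pair) for pair in zip(a, b)]
            # Mutation: occasionally resample a block's choice uniformly.
            child = [random.randrange(num_choices) if random.random() < mutation_prob else c
                     for c in child]
            if feasible(child):
                children.append(child)
        scored += [(accuracy_fn(c), c) for c in children]

    # The best path found is typically retrained from scratch afterwards.
    return max(scored, key=lambda t: t[0])[1]
```

In practice, accuracy_fn would fix the candidate path, run the trained supernet on a held-out validation split, and return the resulting accuracy; no per-candidate training is needed, which is what keeps the search cheap.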

Read full paper: https://arxiv.org/abs/1904.00420

Tags: Deep Learning, Optimization, Machine Learning
