Neural Architecture Search (NAS) Overview
automl.org/nas-overview/
Relevant to AI safety discussions about automated capability improvement and recursive self-improvement; NAS is an example of AI systems assisting in the design of more capable AI architectures, raising questions about automation of AI development pipelines.
Metadata
Importance: 35/100 · documentation · educational
Summary
An overview of Neural Architecture Search (NAS), a subfield of AutoML that automates the design of neural network architectures. It covers the key methods, search spaces, and optimization strategies used to automatically discover high-performing architectures, reducing the need for manual human design.
Key Points
- NAS automates the discovery of optimal neural network architectures, replacing labor-intensive manual design by human experts.
- Key components include defining a search space, a search strategy (e.g., reinforcement learning, evolutionary algorithms, gradient-based methods), and a performance estimation strategy; see the sketch after this list.
- NAS methods can find architectures that outperform manually designed ones on benchmarks such as image classification and language modeling.
- Computational cost is a major challenge: early NAS methods required thousands of GPU hours, though newer approaches such as weight sharing have reduced this significantly.
- NAS represents a form of automated capability improvement relevant to AI safety discussions about recursive self-improvement and autonomous AI development.
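To make the three components above concrete, here is a minimal, self-contained sketch of a NAS loop using plain random search as the search strategy. Everything in it is an illustrative assumption rather than part of the original page: the toy search space (`OPERATIONS`, `NUM_LAYERS`), the sampling routine, and the placeholder `estimate_performance` function, which in practice would be replaced by (possibly truncated) training of the candidate or a shared-weights proxy.

```python
import random

# Hypothetical cell-style search space: each layer picks one operation and
# one earlier node to connect to. Real NAS search spaces (e.g., NASNet or
# DARTS cells) are far larger and more structured.
OPERATIONS = ["conv3x3", "conv5x5", "max_pool", "skip_connect"]
NUM_LAYERS = 4

def sample_architecture():
    """Search strategy: plain random search over the space."""
    return [
        {"op": random.choice(OPERATIONS),
         "input": random.randrange(max(1, layer))}  # index of an earlier node
        for layer in range(NUM_LAYERS)
    ]

def estimate_performance(arch):
    """Performance estimation strategy: a stand-in for training the candidate
    (or a cheaper proxy such as few-epoch training, learning-curve
    extrapolation, or evaluation with shared weights)."""
    return random.random()  # placeholder validation score in [0, 1]

def run_nas(num_trials=100):
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture()        # 1. sample from the search space
        score = estimate_performance(arch)  # 2. estimate its quality
        if score > best_score:              # 3. keep the incumbent
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = run_nas()
    print(f"best score {score:.3f} for architecture {arch}")
```

Swapping `sample_architecture` for an evolutionary or reinforcement-learning proposer, and `estimate_performance` for a real (or weight-sharing) evaluation, recovers the more realistic variants described above.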
Cited by 2 pages
| Page | Type | Quality |
|---|---|---|
| Self-Improvement and Recursive Enhancement | Capability | 69.0 |
| Novel / Unknown Approaches | Capability | 53.0 |
Cached Content Preview
HTTP 200 · Fetched Apr 9, 2026 · 5 KB
AutoML | Neural Architecture Search
Neural Architecture Search
Neural Architecture Search (NAS) automates the design of neural network architectures. NAS approaches optimize the topology of a network, including how nodes are connected and which operators are chosen at each node. User-defined optimization metrics, such as accuracy, model size, or inference time, can thereby guide the search toward an architecture suited to a specific application. Because the search space is extremely large, traditional evolutionary or reinforcement-learning-based AutoML algorithms tend to be computationally expensive. Recent research has therefore focused on more efficient approaches to NAS. In particular, recently developed gradient-based and multi-fidelity methods have provided a promising path and boosted research in these directions. Our group has been very active in developing state-of-the-art NAS methods and has been at the forefront of driving NAS research forward. Below, we summarize a few important recent works from our group.
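As an illustration of the gradient-based direction mentioned above, the sketch below shows the continuous relaxation at the heart of differentiable NAS methods such as DARTS: each edge of a cell computes a softmax-weighted mixture of candidate operations, so the discrete operation choice becomes differentiable in the architecture parameters. The candidate operation list and class name here are assumptions for illustration; the full method additionally alternates optimizing the architecture parameters on validation data with optimizing the network weights on training data, then discretizes by taking an argmax at the end.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One edge of a cell: a softmax-weighted sum over candidate operations
    (a DARTS-style continuous relaxation of a discrete choice)."""
    def __init__(self, channels: int):
        super().__init__()
        # Hypothetical candidate set; real DARTS cells use separable and
        # dilated convolutions, pooling, skip connections, and a zero op.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Identity(),  # skip connection
        ])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

if __name__ == "__main__":
    edge = MixedOp(channels=8)
    x = torch.randn(1, 8, 16, 16)
    edge(x).sum().backward()   # gradients flow into the architecture logits
    print(edge.alpha.grad)     # so alpha can be trained by gradient descent
    print(edge.ops[int(edge.alpha.argmax())])  # final discretization step
```

Because every candidate operation is evaluated on every edge, this relaxation trades the thousands of GPU hours of early RL- and evolution-based NAS for a single (memory-hungry) supernet training run.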
Selected NAS Papers
Literature Overview
NAS is one of the fastest-growing subfields of AutoML, and the number of papers is increasing rapidly. For a comprehensive overview of recent trends, we recommend the following sources:
NAS survey paper [JMLR 2020]
A book chapter on NAS from our open-access book, “AutoML: Methods, Systems, Challenges”
A continuously updated page with a comprehensive NAS literature overview
A GitHub repo tracking recent work at the intersection of NAS and Transformers: awesome-transformer-search
One-Shot NAS Methods
Understanding and Robustifying Differentiable Architecture Search [ICLR 2020, Oral]
Meta Learning of Neural Architectures
MetaNAS: Meta-Learning of Neural Architectures for Few-Shot Learning [CVPR 2020]
Neural Ensemble Search
Neural Ensemble Search for Uncertainty Estimation and Dataset Shift [NeurIPS 2021]
Multi-headed Neural Ensemble Search [ICML 2021, UDL Workshop]
Joint NAS and Hyperparameter Optimization
Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search [ICML 2018, AutoML Workshop]
Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization [ICML 2021, AutoML Workshop]
Multi-Objective NAS
LEMONADE: Efficient Multi-objective Neural Architecture Search via Lamarckian Evolution [ICLR 2019]
Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization [ICML 2021, AutoML Workshop]
Application-Specific NAS
Neural Architecture Search for Dense Prediction Tasks in Computer Vision
Large-Scale Studies of NAS Methods
How Powerful are Performance Predictors in Neural Architecture Search? [NeurIPS 2021]
NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy
... (truncated, 5 KB total)
Resource ID: d01d8824d9b6171b | Stable ID: sid_U6WxiBzveM