3260 papers • 126 benchmarks • 313 datasets
Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS automates the process a human would otherwise carry out by hand, iteratively tweaking a network and learning what works well, in order to discover new and often more complex architectures.
(Image credit: Neural Architecture Search with Reinforcement Learning)
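At its simplest, NAS can be framed as sampling candidate architectures from a predefined search space, evaluating each candidate, and keeping the best one found within a search budget. The sketch below illustrates this with plain random search; the search space, the synthetic data, and the short proxy training run are illustrative assumptions rather than any specific published method.

```python
# Minimal random-search NAS sketch (illustrative only; assumes PyTorch is
# installed and uses synthetic data as a stand-in for a real task).
import random
import torch
import torch.nn as nn

SEARCH_SPACE = {
    "num_layers": [1, 2, 3],
    "hidden_units": [16, 32, 64],
    "activation": [nn.ReLU, nn.Tanh],
}

def sample_architecture():
    # Draw one candidate configuration from the search space.
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def build_model(arch, in_dim=8, out_dim=2):
    # Turn a sampled configuration into a concrete network.
    layers, dim = [], in_dim
    for _ in range(arch["num_layers"]):
        layers += [nn.Linear(dim, arch["hidden_units"]), arch["activation"]()]
        dim = arch["hidden_units"]
    layers.append(nn.Linear(dim, out_dim))
    return nn.Sequential(*layers)

def evaluate(model, x, y, epochs=20):
    # Short proxy training run; returns training accuracy as the search score.
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

# Synthetic dataset standing in for the target task.
x = torch.randn(256, 8)
y = (x.sum(dim=1) > 0).long()

best_arch, best_score = None, -1.0
for _ in range(10):  # search budget: 10 candidate architectures
    arch = sample_architecture()
    score = evaluate(build_model(arch), x, y)
    if score > best_score:
        best_arch, best_score = arch, score

print("Best architecture:", best_arch, "accuracy:", round(best_score, 3))
```

More sophisticated NAS methods replace the random sampler with a learned search strategy (e.g. a reinforcement-learning controller, evolutionary search, or differentiable relaxation), but the sample-evaluate-select loop above is the common skeleton.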
These leaderboards are used to track progress in Neural Architecture Search.
Use these libraries to find Neural Architecture Search models and implementations.