An open source AutoML toolkit that automates the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyperparameter tuning.
NNI automates feature engineering, neural architecture search, hyperparameter tuning, and model compression for deep learning. Find the latest features, API, examples and tutorials in our official documentation (简体中文版点这里).
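At its core, hyperparameter tuning works as a loop: a tuner proposes a configuration from a search space, the trial code trains a model with it and reports a metric back, and the tuner uses the results to pick the next configuration. A minimal, self-contained random-search sketch of that loop (a generic illustration, not NNI's actual API; the search space and the fake `trial` function below are hypothetical):

```python
import math
import random

# Hypothetical search space, written in the JSON style NNI uses:
# each entry names a sampling distribution (_type) and its values (_value).
SEARCH_SPACE = {
    "lr": {"_type": "loguniform", "_value": [1e-4, 1e-1]},
    "batch_size": {"_type": "choice", "_value": [16, 32, 64]},
}

def sample(space, rng):
    """Draw one configuration from the search space (a random-search 'tuner')."""
    config = {}
    for name, spec in space.items():
        kind, values = spec["_type"], spec["_value"]
        if kind == "choice":
            config[name] = rng.choice(values)
        elif kind == "loguniform":
            lo, hi = values
            config[name] = math.exp(rng.uniform(math.log(lo), math.log(hi)))
        else:
            raise ValueError(f"unsupported _type: {kind}")
    return config

def trial(config):
    """Stand-in for real training: returns a fake 'accuracy' for a config."""
    return 1.0 - abs(math.log10(config["lr"]) + 2) / 4  # peaks near lr = 1e-2

def tune(space, n_trials=20, seed=0):
    """Run n_trials trials and return the best (metric, config) pair."""
    rng = random.Random(seed)
    results = [(trial(c), c) for c in (sample(space, rng) for _ in range(n_trials))]
    return max(results, key=lambda r: r[0])

best_metric, best_config = tune(SEARCH_SPACE)
```

In a real NNI experiment the tuner (e.g. TPE instead of random search) runs in the NNI manager, and each trial is a separate training process dispatched to a training service.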
See the NNI installation guide to install from pip, or build from source.
To install the current release:
$ pip install nni
To update NNI to the latest version, add the --upgrade flag to the above command:
$ pip install --upgrade nni
NNI capabilities at a glance (the full matrix of links is in the documentation):

- Algorithms: Hyperparameter Tuning, Neural Architecture Search, Model Compression
- Supports: Supported Frameworks, Training Services, Tutorials
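Of these capabilities, model compression is the easiest to illustrate in a few lines. One common technique is one-shot magnitude pruning: zero out the smallest-magnitude weights. A toy sketch on a plain Python list (illustration only; NNI's pruners operate on real framework models and produce masks per layer):

```python
def magnitude_prune(weights, sparsity):
    """Zero the smallest-magnitude fraction of weights; return (pruned, mask)."""
    k = int(len(weights) * sparsity)  # number of weights to remove
    # indices of the k smallest weights by absolute value
    smallest = set(sorted(range(len(weights)), key=lambda i: abs(weights[i]))[:k])
    mask = [0.0 if i in smallest else 1.0 for i in range(len(weights))]
    return [w * m for w, m in zip(weights, mask)], mask

pruned, mask = magnitude_prune([0.5, -0.1, 0.3, 0.05], sparsity=0.5)
# pruned == [0.5, 0.0, 0.3, 0.0]  (the two smallest weights are zeroed)
```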
If you want to contribute to NNI, be sure to review the contribution guidelines, which include instructions for submitting feedback, best coding practices, and the code of conduct.
We use GitHub issues to track feature requests and bugs.
Please use NNI Discussion for general questions and new ideas.
For questions about specific use cases, please go to Stack Overflow.
You are also welcome to join discussions in the Gitter IM group.
Over the past few years, NNI has received thousands of items of feedback through GitHub issues, and pull requests from hundreds of contributors.
We appreciate all contributions from the community that make NNI thrive.
| Type | Status |
|---|---|
| Fast test | |
| Full test - HPO | |
| Full test - NAS | |
| Full test - compression | |
| Type | Status |
|---|---|
| Local - Linux | |
| Local - Windows | |
| Remote - Linux to Linux | |
| Remote - Windows to Windows | |
| OpenPAI | |
| FrameworkController | |
| Kubeflow | |
| Hybrid | |
| AzureML | |
Aiming for openness and advancing state-of-the-art technology, Microsoft Research (MSR) has also released a few other open source projects.
We encourage researchers and students to leverage these projects to accelerate AI development and research.
The entire codebase is under the MIT license.