
update doc: TextNAS and PBT tuner (#2279)
QuanluZhang authored Apr 9, 2020
1 parent 41faab3 commit 2e88fe7
Showing 4 changed files with 7 additions and 6 deletions.
3 changes: 2 additions & 1 deletion README.md
@@ -132,7 +132,8 @@ Within the following table, we summarized the current NNI capabilities, we are g
<li><a href="docs/en_US/NAS/CDARTS.md">CDARTS</a></li>
<li><a href="docs/en_US/NAS/SPOS.md">SPOS</a></li>
<li><a href="docs/en_US/NAS/Proxylessnas.md">ProxylessNAS</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#NetworkMorphism">Network Morphism</a> </li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#NetworkMorphism">Network Morphism</a></li>
<li><a href="docs/en_US/NAS/TextNAS.md">TextNAS</a></li>
</ul>
</ul>
<a href="docs/en_US/Compressor/Overview.md">Model Compression</a>
3 changes: 2 additions & 1 deletion docs/en_US/NAS/Overview.md
@@ -19,7 +19,8 @@ NNI currently supports the NAS algorithms listed below and is adding more. Users
| [P-DARTS](PDARTS.md) | [Progressive Differentiable Architecture Search: Bridging the Depth Gap between Search and Evaluation](https://arxiv.org/abs/1904.12760) is based on DARTS. It introduces an efficient algorithm which allows the depth of searched architectures to grow gradually during the training procedure. |
| [SPOS](SPOS.md) | [Single Path One-Shot Neural Architecture Search with Uniform Sampling](https://arxiv.org/abs/1904.00420) constructs a simplified supernet trained with a uniform path sampling method and applies an evolutionary algorithm to efficiently search for the best-performing architectures. |
| [CDARTS](CDARTS.md) | [Cyclic Differentiable Architecture Search](https://arxiv.org/abs/****) builds a cyclic feedback mechanism between the search and evaluation networks. It introduces a cyclic differentiable architecture search framework which integrates the two networks into a unified architecture. |
- | [ProxylessNAS](Proxylessnas.md) | [ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware](https://arxiv.org/abs/1812.00332). |
+ | [ProxylessNAS](Proxylessnas.md) | [ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware](https://arxiv.org/abs/1812.00332). It removes the proxy and directly learns architectures for large-scale target tasks and target hardware platforms. |
+ | [TextNAS](TextNAS.md) | [TextNAS: A Neural Architecture Search Space tailored for Text Representation](https://arxiv.org/pdf/1912.10729.pdf). It is a neural architecture search algorithm tailored for text representation. |

One-shot algorithms run **standalone without nnictl**. Only the PyTorch version has been implemented. TensorFlow 2.x will be supported in a future release.

Expand Down
4 changes: 1 addition & 3 deletions docs/en_US/Tuner/PBTTuner.md
@@ -47,8 +47,6 @@ tuner:
population_size: 10
```
- ### Limitations
- The current implementation only supports search space types in `float`, including `uniform`, `normal`. The support of other search space types is ongoing.
+ ### Limitation
Importing data is not supported yet.
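
Since the hunk above only shows the tail of the tuner block, a fuller `config.yml` entry for PBT might look like the sketch below. The `builtinTunerName` and `optimize_mode` fields are assumptions based on NNI's usual builtin-tuner layout and are not part of this diff; check the PBTTuner reference page for the authoritative argument list.

```yaml
# Hedged sketch of a complete tuner section for PBT; values are illustrative.
tuner:
  builtinTunerName: PBTTuner      # assumed builtin tuner name
  classArgs:
    optimize_mode: maximize       # or minimize, depending on the reported metric
    population_size: 10           # matches the value shown in the hunk above
```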
3 changes: 2 additions & 1 deletion docs/en_US/Tuner/PPOTuner.md
@@ -5,7 +5,8 @@ PPO Tuner on NNI

This is a tuner geared for NNI's Neural Architecture Search (NAS) interface. It uses the [PPO algorithm](https://arxiv.org/abs/1707.06347). The implementation inherits the main logic of the OpenAI ppo2 implementation [here](https://github.com/openai/baselines/tree/master/baselines/ppo2) and is adapted for the NAS scenario.

- It can successfully tune the [mnist-nas example](https://github.com/microsoft/nni/tree/master/examples/trials/mnist-nas), and has the following result:
+ We have successfully tuned the mnist-nas example and obtained the following result:
+ **NOTE: we are refactoring this example to the latest NAS interface and will publish the example code after the refactoring.**

![](../../img/ppo_mnist.png)
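
For orientation, selecting this tuner in an experiment would use the same `config.yml` mechanism; the block below is a sketch under that assumption, not text taken from this diff, and the `optimize_mode` argument should be verified against the PPOTuner reference. Since the tuner is geared to the NAS interface, it only makes sense with a NAS-style search space.

```yaml
# Hedged sketch of a config.yml tuner section for the PPO tuner; illustrative only.
tuner:
  builtinTunerName: PPOTuner      # assumed builtin tuner name
  classArgs:
    optimize_mode: maximize       # maximize the metric reported by trials
```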

