This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

NNI 2021 June~July Iteration Planning #3724

Closed
kvartet opened this issue Jun 4, 2021 · 3 comments

Comments


kvartet commented Jun 4, 2021

This is the plan for the June~July iteration, a 6-week iteration.

Release Plan for v2.4

  • Release manager: @QuanluZhang
  • Feature freeze date: July 14
  • Code freeze and demo date: July 26 (revised from July 21, then July 23)
  • Branch cut and next release planning date: August 2 (revised from July 28)
  • Release date: August 11 (revised from August 2, then August 4)

Retiarii & NAS

Model Compression

Training service & NNI manager & nnictl

WebUI

Hyper-parameter tuning

Pipeline

Stretch goals

NAS

  • P1 - Evaluate one-shot strategies on NAS benchmarks @ultmaster
  • P1 - Further improve the logic of graph generation and code generation (Further integrated with TorchScript) @QuanluZhang
  • P1 - Test classic nas tuners on NNI NAS benchmark and report evaluation results @ultmaster
  • P2 - Migrate Cream from NAS v1.0 to the Retiarii framework @yuhangchen
  • P2 - Refactor of Retiarii execution engine @ultmaster
    • support local debug mode for pure-python execution engine
    • support weight transfer from supernet to submodel
    • support export of top model for pure-python execution engine
  • P2 - Refactor of "self._cur_samples" in mutators
  • P2 - Support validation for other strategies
  • P1 - Review strategy/experiment/rest stop condition @QuanluZhang
  • P1 - support graph-based NAS algorithms @ultmaster @skeletondyh
  • P1 - Deal with strategy failure
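One item in the list above, weight transfer from supernet to submodel, can be sketched as copying state-dict entries whose names and shapes agree. The helper below is a hypothetical illustration in plain Python (real Retiarii code would operate on torch state_dicts); the function name and data layout are assumptions, not NNI's actual API.

```python
# Hypothetical sketch: transfer weights from a supernet "state dict" to a
# submodel by copying entries whose names and shapes match. Plain dicts of
# (shape, values) stand in for torch tensors here.

def transfer_weights(supernet_state, submodel_state):
    """Copy matching entries; return the new submodel state and skipped keys."""
    transferred = dict(submodel_state)
    skipped = []
    for name, (shape, _values) in submodel_state.items():
        src = supernet_state.get(name)
        if src is not None and src[0] == shape:
            transferred[name] = src          # shapes agree: reuse supernet weights
        else:
            skipped.append(name)             # missing or mismatched: keep submodel init
    return transferred, skipped

supernet = {"conv1.weight": ((3, 3), [1.0] * 9), "fc.weight": ((10, 4), [0.5] * 40)}
submodel = {"conv1.weight": ((3, 3), [0.0] * 9), "head.weight": ((2, 4), [0.0] * 8)}
new_state, skipped = transfer_weights(supernet, submodel)
```

Keys present only in the submodel (here the hypothetical `head.weight`) are left at their initial values, which matches the usual fine-tune-after-transfer workflow.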

Model compression

  • P1 - MixedMaskerPruner @J-shang
  • P2 - Refactor of model graph generation (as an independent component) @zheng-ningxin
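For context on masker-style pruners such as the MixedMaskerPruner item above: a "masker" typically turns weights into a binary mask at a target sparsity, and the pruner applies that mask. The sketch below shows the simplest magnitude-based variant in plain Python; it is loosely inspired by NNI's masker/pruner split, and all names and shapes are illustrative assumptions.

```python
# Hypothetical sketch of a masker: derive a 0/1 mask that zeroes the
# smallest-magnitude fraction of weights at a given sparsity level.

def magnitude_mask(weights, sparsity):
    """Return a 0/1 mask keeping the largest-|w| entries at the given sparsity."""
    n_prune = int(len(weights) * sparsity)
    ranked = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = set(ranked[:n_prune])           # indices of the smallest weights
    return [0 if i in pruned else 1 for i in range(len(weights))]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
mask = magnitude_mask(w, 0.5)                # prunes the 3 smallest-magnitude weights
```

A "mixed" masker would presumably select among several such criteria (magnitude, Taylor, etc.) per layer; the issue does not spell out the design, so only the basic mechanism is shown.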

NNI manager

  • P2 - Mask confidential config fields (depends on prev items) @liuzhe-lz
  • P2 - nnictl experiment export improvements @J-shang
  • P2 - internal tuner APIs for improving tuner debuggability, allow debugger tool @liuzhe-lz
  • P2 - nnictl auto-completion @liuzhe-lz
  • P2 - Dependency-aware supported in iterative pruner (discussion needed)
  • P2 - Add full tests (L2Filter, Taylor, NetAdapt, SimulatedAnnealing, AutoCompress, AMC, Sensitivity, ADMM, Lottery Ticket)
  • P1 - reuse mode: ADL kubeflow support @SparkSnail
  • P2 - add/del tag of experiments (pending support from rest server)
  • UI: support experiment resume/view/stop on webUI
  • P0 - (discussion needed) how to support specifying initial configs (Can we set default values of hyper parameters for HPO? #3801) @ultmaster
  • P2 - (discussion needed) support constraints on hyper-parameters @ultmaster
  • P2 - handle all reject errors @J-shang
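The P0 item above (initial configs for HPO, issue #3801) can be sketched as a tuner that yields user-supplied default configurations before falling back to its usual sampling. The class below is a hypothetical illustration loosely modeled on the shape of a tuner's `generate_parameters` call; it is not NNI's actual Tuner API, and the random fallback stands in for a real search algorithm.

```python
# Hypothetical sketch: an HPO tuner that tries user-provided initial
# configurations first, then samples the search space at random.
import random

class InitialConfigTuner:
    def __init__(self, search_space, initial_configs=None, seed=0):
        self.search_space = search_space          # {param: list of choices}
        self.queue = list(initial_configs or [])  # user defaults, tried in order
        self.rng = random.Random(seed)

    def generate_parameters(self):
        if self.queue:
            return self.queue.pop(0)              # serve a default config first
        return {p: self.rng.choice(c) for p, c in self.search_space.items()}

space = {"lr": [0.1, 0.01, 0.001], "batch_size": [32, 64, 128]}
tuner = InitialConfigTuner(space, initial_configs=[{"lr": 0.01, "batch_size": 64}])
first = tuner.generate_parameters()   # the user's default config
second = tuner.generate_parameters()  # sampled from the space afterwards
```

Seeding the tuner keeps the fallback sampling reproducible, which matters when comparing runs with and without initial configs.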
@QuanluZhang

From Ali PAI team

  • global sort, importance sort on all layers
  • QAT improvement, PTQ initialization (set scale, zero point), post-training quantization
  • new model compression algorithms (e.g., for BERT), model-specific compression (not fully generic), sparsifying training and sparsifying operators
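The "PTQ initialization (set scale, zero point)" item above refers to computing affine quantization parameters from observed activation or weight ranges. The sketch below shows the standard uint8 formula (scale = range / (qmax − qmin), zero point chosen so that real 0 is exactly representable); it is illustrative only, and the actual NNI/PAI calibration code is not reproduced here.

```python
# Sketch of post-training quantization initialization: compute an affine
# scale and zero point from an observed [x_min, x_max] range (uint8 case).

def ptq_params(x_min, x_max, qmin=0, qmax=255):
    """Affine params mapping [x_min, x_max] onto the integer range [qmin, qmax]."""
    x_min, x_max = min(x_min, 0.0), max(x_max, 0.0)  # ensure real 0 is representable
    scale = (x_max - x_min) / (qmax - qmin)
    zero_point = round(qmin - x_min / scale)
    return scale, max(qmin, min(qmax, zero_point))   # clamp zero point into range

def quantize(x, scale, zero_point, qmin=0, qmax=255):
    """Quantize one real value to an integer in [qmin, qmax]."""
    q = round(x / scale) + zero_point
    return max(qmin, min(qmax, q))

scale, zp = ptq_params(-1.0, 1.0)   # symmetric observed range around 0
```

QAT then refines these initial parameters during training; starting from calibrated values rather than arbitrary ones is what "PTQ initialization" buys.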


QuanluZhang commented Jul 26, 2021

Bug bash

Retiarii & NAS

Model Compression

Training service & NNI manager & nnictl

WebUI

Hyper-parameter tuning


QuanluZhang commented Aug 3, 2021

fix pipeline:

@QuanluZhang QuanluZhang unpinned this issue Aug 16, 2021