Will online KD be added? #431
simoneangarano started this conversation in Ideas
Replies: 2 comments 3 replies
-
What online KD method do you have in mind? And for what task?
2 replies
-
Exactly: the original DML is meant to be performed with separate optimizers for the student and the teacher (as in adversarial training). Will this feature be added in the next release?
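For context, here is a minimal sketch (plain PyTorch, not torchdistill code; model names and hyperparameters are illustrative) of what DML with two separate optimizers looks like: each model is updated in turn by its own optimizer, with a cross-entropy loss on the labels plus a KL term toward the peer's detached softened prediction.

```python
# Hypothetical sketch of DML with separate optimizers; not torchdistill's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

def dml_step(model_a, model_b, opt_a, opt_b, x, y, temperature=1.0):
    # Update model A, treating model B's softened prediction as a fixed target.
    with torch.no_grad():
        target_b = F.softmax(model_b(x) / temperature, dim=1)
    logits_a = model_a(x)
    loss_a = F.cross_entropy(logits_a, y) + F.kl_div(
        F.log_softmax(logits_a / temperature, dim=1), target_b,
        reduction="batchmean")
    opt_a.zero_grad()
    loss_a.backward()
    opt_a.step()

    # Update model B symmetrically, against model A's refreshed prediction.
    with torch.no_grad():
        target_a = F.softmax(model_a(x) / temperature, dim=1)
    logits_b = model_b(x)
    loss_b = F.cross_entropy(logits_b, y) + F.kl_div(
        F.log_softmax(logits_b / temperature, dim=1), target_a,
        reduction="batchmean")
    opt_b.zero_grad()
    loss_b.backward()
    opt_b.step()

# Toy usage with illustrative models and random data.
model_a, model_b = nn.Linear(32, 10), nn.Linear(32, 10)
opt_a = torch.optim.SGD(model_a.parameters(), lr=0.1)
opt_b = torch.optim.SGD(model_b.parameters(), lr=0.1)
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
dml_step(model_a, model_b, opt_a, opt_b, x, y)
```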
Quoting Yoshitomo Matsubara's reply of 22 December 2023:
Training a teacher model from scratch and/or its auxiliary modules is possible. For the former, you can train a teacher model from scratch using one configuration file (https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/cifar10/ce/densenet_bc_k12_depth100-final_run.yaml) and then train a student model with the pretrained teacher using another config (https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/cifar10/kd/resnet20_from_densenet_bc_k12_depth100-final_run.yaml).
For DML, if the optimization can be done with one optimizer (say, a single SGD optimizer that holds both models' parameters), you can do that with the current torchdistill (v1.0.0) and an example script.
If it requires two separate optimization steps (one for the first model, another for the second model), then the current torchdistill doesn't support that.
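To make the single-optimizer workaround concrete, here is a hedged sketch in plain PyTorch (illustrative model names, not torchdistill's API): one SGD optimizer holds both models' parameters, and a single combined DML-style loss updates both models in one step.

```python
# Hypothetical sketch of the single-optimizer setup described above.
import itertools
import torch
import torch.nn as nn
import torch.nn.functional as F

model_a, model_b = nn.Linear(32, 10), nn.Linear(32, 10)  # illustrative models
# One SGD optimizer containing both models' parameters.
optimizer = torch.optim.SGD(
    itertools.chain(model_a.parameters(), model_b.parameters()), lr=0.1)

x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))  # toy batch
logits_a, logits_b = model_a(x), model_b(x)
# Combined DML-style objective: CE for each model plus mutual KL terms.
# Each peer target is detached so the gradients mimic DML's fixed-target
# updates, but both models are updated by the same optimizer step.
loss = (F.cross_entropy(logits_a, y)
        + F.cross_entropy(logits_b, y)
        + F.kl_div(F.log_softmax(logits_a, dim=1),
                   F.softmax(logits_b.detach(), dim=1), reduction="batchmean")
        + F.kl_div(F.log_softmax(logits_b, dim=1),
                   F.softmax(logits_a.detach(), dim=1), reduction="batchmean"))
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The detach calls are the crux: they keep each KL term from backpropagating into the peer, so the joint update approximates DML's alternating scheme; truly separate optimization steps per model would still need new support in torchdistill.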
1 reply
-
Or maybe there's a workaround to do it in the present version...
Thanks!