How to utilize timm's scheduler? #5555
Comments
One solution is to wrap it into a callback. I ran into a similar problem and wrote an MMCVLrCallback for it. Hope it helps.
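The MMCVLrCallback itself isn't shown in this thread, so as a rough illustration of the callback idea only, here is a minimal sketch (hypothetical class name) that steps a timm scheduler from a Lightning `Callback` once per epoch, bypassing Lightning's own scheduler handling. Hook signatures vary slightly across Lightning versions:

```python
import pytorch_lightning as pl


class TimmSchedulerCallback(pl.Callback):
    """Step a timm scheduler manually instead of registering it with Lightning."""

    def __init__(self, scheduler):
        # any timm scheduler instance, e.g. timm.scheduler.CosineLRScheduler
        self.scheduler = scheduler

    def on_train_epoch_end(self, trainer, pl_module):
        # timm schedulers take the epoch index explicitly
        self.scheduler.step(trainer.current_epoch + 1)
```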
Dear @soomiles, it shouldn't be hard to extend the conditions to support it. Best,
Hello, has there been any update on this? @soomiles, have you perhaps worked it out?
I plan to make a PR in the coming days.
This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions, Pytorch Lightning Team!
Is the timm scheduler still not supported?
This won't work with multiple parameter groups, but here is a quick fix if anyone else wants to use it:

import torch
from timm.scheduler import StepLRScheduler


class TimmStepLRScheduler(torch.optim.lr_scheduler.LambdaLR):
    """Wrap timm's StepLRScheduler so Lightning sees a standard LambdaLR."""

    def __init__(self, optim, **kwargs):
        self.init_lr = optim.param_groups[0]["lr"]
        self.timmsteplr = StepLRScheduler(optim, **kwargs)
        # pass this object itself as the lr_lambda callable
        super().__init__(optim, self)

    def __call__(self, epoch):
        # convert the absolute LR that timm wants into a multiplier on the initial LR
        desired_lr = self.timmsteplr.get_epoch_values(epoch)[0]
        mult = desired_lr / self.init_lr
        return mult
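For reference, a hypothetical usage of this wrapper inside a LightningModule's `configure_optimizers` could look like the sketch below; the `decay_t`/`decay_rate` values are illustrative and are simply forwarded to timm's `StepLRScheduler`:

```python
def configure_optimizers(self):
    optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
    # kwargs are passed through to timm.scheduler.StepLRScheduler
    scheduler = TimmStepLRScheduler(optimizer, decay_t=30, decay_rate=0.1)
    return [optimizer], [scheduler]
```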
The issue still exists for me (updated to the latest Lightning version) upon running.
The fix according to the PR that I followed:
The scheduler object was:
Upon further debugging, I realised that the latest Lightning release (1.5.10) does not yet include this merge. Silly me, I assumed the latest tag would follow the master branch, since the practice is to only merge when stable and necessary. I saw that it was merged to master and immediately assumed it would ship with the next update, but reading the details more closely (looking at the dates after this merge), I see that it was not included in any release. I later found out that it will only be tagged in the upcoming 1.6 version. Sharing this for learning purposes in case someone is as inexperienced as I am. Remember to check the PR milestone!
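For anyone landing here later: once that change ships (Lightning 1.6+), custom schedulers such as timm's can be handled by overriding `lr_scheduler_step` in the LightningModule. A minimal sketch under that assumption follows; the hook's exact signature differs slightly between Lightning versions, and the `t_initial` value is illustrative only:

```python
import pytorch_lightning as pl
import timm.scheduler
import torch


class LitModel(pl.LightningModule):
    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
        scheduler = timm.scheduler.CosineLRScheduler(optimizer, t_initial=300)
        return [optimizer], [{"scheduler": scheduler, "interval": "epoch"}]

    def lr_scheduler_step(self, scheduler, metric):
        # timm schedulers expect the epoch index to be passed in explicitly
        scheduler.step(epoch=self.current_epoch)
```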
🚀 Feature
Hi, I want to reproduce an image classification result using the timm library.
But I couldn't use timm.scheduler.create_scheduler because pytorch_lightning doesn't accept a custom class for a scheduler
(timm.scheduler is not a torch.optim.lr_scheduler class).
This then results in the following error:
Is there a plan for supporting timm's scheduler?
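For illustration only (this is not code from the issue): a minimal sketch of the kind of `configure_optimizers` that triggers the rejection described above, assuming a timm scheduler such as `CosineLRScheduler` is returned directly:

```python
import timm.scheduler
import torch


def configure_optimizers(self):
    optimizer = torch.optim.AdamW(self.parameters(), lr=1e-3)
    # timm schedulers derive from timm's own Scheduler base class, not from
    # torch.optim.lr_scheduler, which is why older Lightning versions reject them
    scheduler = timm.scheduler.CosineLRScheduler(optimizer, t_initial=100)
    return [optimizer], [scheduler]
```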
Motivation
Pitch
Alternatives
Additional context