
Deprecate trainer.slurm_job_id #10615

Closed
ananthsub opened this issue Nov 18, 2021 · 3 comments · Fixed by #10622
ananthsub commented Nov 18, 2021

Proposed refactor

Deprecate the trainer property here: https://github.com/PyTorchLightning/pytorch-lightning/blob/6b728713bb3b35ad58cd0085acaa443b33ab03ac/pytorch_lightning/trainer/trainer.py#L1731-L1744

Motivation

This property doesn't depend on the Trainer at all, so it can be converted into a utility function/staticmethod instead.
Proposal: migrate this logic into the SLURMEnvironment class as staticmethods. Doing this will consolidate the SLURM environment variable code in Lightning and simplify the Trainer. https://github.com/PyTorchLightning/pytorch-lightning/search?q=slurm_job_id

Pitch

  1. Move this logic to the SLURMEnvironment class as staticmethods:

     @staticmethod
     def job_id() -> Optional[int]:

     @staticmethod
     def job_name() -> Optional[str]:

  2. Then replace usages of trainer.slurm_job_id with SLURMEnvironment.job_id() / SLURMEnvironment.job_name()
  3. Then deprecate trainer.slurm_job_id
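A rough sketch of how the three steps could fit together is below. This is only an illustration: the exact parsing logic lives in the linked Trainer property, the standard SLURM_JOB_ID / SLURM_JOB_NAME environment variables are assumed as the source of truth, and the deprecation message wording is hypothetical.

```python
import os
import warnings
from typing import Optional


class SLURMEnvironment:
    @staticmethod
    def job_id() -> Optional[int]:
        # SLURM exposes the numeric job id via the SLURM_JOB_ID variable.
        job_id = os.environ.get("SLURM_JOB_ID")
        if job_id is None:
            return None
        try:
            return int(job_id)
        except ValueError:
            return None

    @staticmethod
    def job_name() -> Optional[str]:
        # SLURM exposes the job name via the SLURM_JOB_NAME variable.
        return os.environ.get("SLURM_JOB_NAME")


class Trainer:
    @property
    def slurm_job_id(self) -> Optional[int]:
        # Step 3: deprecation shim that warns, then delegates to the new utility.
        warnings.warn(
            "`Trainer.slurm_job_id` is deprecated; use `SLURMEnvironment.job_id()` instead.",
            DeprecationWarning,
        )
        return SLURMEnvironment.job_id()
```

Because the staticmethods only read the process environment, callers no longer need a Trainer instance at all; `SLURMEnvironment.job_id()` works anywhere in the codebase.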

Additional context

Part of #7740


If you enjoy Lightning, check out our other projects! ⚡

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.

  • Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.

  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

  • Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.

  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.

cc @justusschock @awaelchli @akihironitta @tchaton @kaushikb11 @Borda

@awaelchli
Contributor

Yes, agree! We can probably drop the slurm prefix:

@staticmethod
def job_id() -> Optional[int]:

@ananthsub ananthsub added the good first issue Good for newcomers label Nov 18, 2021
@ananthsub ananthsub added this to the 1.6 milestone Nov 18, 2021
@AndresAlgaba
Contributor

As it is marked as a "good first issue", do you mind if I create a PR for this issue? I will have time next Monday and Tuesday to work on this :).

@ananthsub
Contributor Author

@AndresAlgaba that'd be great!
