Commits on Feb 17, 2021
- ad36c7b
- 68fd308  Prevent flickering progress bar (#6009)
- 15d6788  Fix wrapping optimizers upon assignment (#6006)
- a121fd3  [Bugfix] Apply untoggle_optimizer when result is None (#5983)
- 6a409c7
- 7189d67  Add initial DeepSpeed changes (squashed history: plugin wrapper, optimizer-step fix, docs, special tests, CI moved to an Azure pipeline, note on unsupported functionality)
- b7c2e0a  Trainer only references accelerator (#6039); teardown moves to the trainer, as it is responsible for the accelerator
- 8d7ac8f
- c9622ba  [feat] Add Trainer(stochastic_weight_avg=True/False) (#6038)
- 8440595  [CI] Move DeepSpeed into CUDA image, remove DeepSpeed install from Azure (#6043); ninja is now in the container, so path setting is dropped
- bac617f
Commits on Feb 18, 2021
- d2cd7cb  Add option for weight tying on TPUs (#5441): adds the on_post_move_to_device hook, with tests and docs
- bfcfac4  Delete tests.helpers.TrialMNISTDataModule (#5999); allow using TrialMNIST in the MNISTDataModule
- 77f6aa4  Fix: allow hashing of metrics with lists in their state (#5939); Metric.__hash__ made compatible with structural equality checks
- 6de8dca
- 38ad9e0  [ModelPruning] Add missing attribute with use_global_unstructured=False and verbose (#6045)
- 049006a
- b019c25  Add descriptions to accelerator broadcast function, clean up all_gather (#6044)
- bcc0004  Add before_batch_transfer and after_batch_transfer hooks (#3671)
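The batch-transfer hooks added in #3671 wrap the device move on both sides. A minimal sketch of that ordering, using recording stand-ins rather than Lightning's actual internals (hook names follow the related PR titles; the `(batch, dataloader_idx)` signature and the wiring are assumptions of this sketch):

```python
calls = []

class BoringModule:
    # Hook names follow the PR titles; signatures are an assumption here.
    def on_before_batch_transfer(self, batch, dataloader_idx):
        calls.append("before")          # runs while the batch is still on CPU
        return batch

    def on_after_batch_transfer(self, batch, dataloader_idx):
        calls.append("after")           # runs once the batch is on the device
        return batch

def transfer_batch(batch, module, device):
    batch = module.on_before_batch_transfer(batch, 0)
    calls.append(f"to:{device}")        # stands in for moving tensors to `device`
    return module.on_after_batch_transfer(batch, 0)

transfer_batch([1, 2, 3], BoringModule(), "cuda:0")
```

Each hook returns the (possibly modified) batch, so augmentations can be applied before or after the transfer without touching the training step.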
- ffdcb62  Make parallel devices optional across all plugins (#6051), so that they can be instantiated without them
- 115e58a
- f48a933
- 3449e2d  Docs for Pruning, Quantization, and SWA (#6041)
- 02ac4b0  Replace .get_model() with explicit .lightning_module (#6035), with a proper deprecation path
- 6cc1a06  Rename accelerator_backend -> accelerator (#6034), with a proper deprecation path
- fc9bb53  Fix flake8 for new plugins (#5951)
- d3a31bc
- 2cf39dc  Add warnings to on_before/after_batch_transfer hooks (#6059); a default dataloader index prevents a signature change in the future
- c46c23a  v1.2.0rc2 release prep: changelogs and formatting
- b0074a4  Docs fixes, including docs/source/common/lightning_module.rst
- 8f82823  Raise AttributeError instead of ValueError in lightning_getattr and lightning_setattr when the attribute is not found (#6024)
- 5d6a091
- 4574023
- e12c8a7
- 3645eb1  PyPI and Azure badges - tags (#6068)
Commits on Feb 19, 2021
- 0b27147
- 4b7c0fa  Precision fixes: add an AMP test model, move the assert into the training step
- f2660ac
Commits on Feb 20, 2021
- 3bdc067  Consistent behavior for the reduce method across all plugins (#6011): mean becomes the default reduction
- 97a81c3  [Hot Fix] Give priority to plugins to set distributed mode, and then accelerator (#6089); ensures the cluster environment is set after SLURM is configured if necessary
Commits on Feb 21, 2021
- 3b0e4e0  Enable ZeRO tests for CI, fix to/half function calls (#6070): make sure the LightningModule hook is called when moving to half precision
- 432e563  Expose DeepSpeed FP16 parameters due to loss instability (#6115): the config parameters can now be passed to the plugin's init
- 97b4b3e
Commits on Feb 22, 2021
- ae6ce17  Fix amp/apex misconfiguration error for CPU (#6107)
- 9b99328  Update Contributing Guide (#6118)
- 1d28d11  Minor fixes/improvements in Metric docs (#6114): fix a wrong render, improve classification and other domain metrics docs
- 57215b7
- 423ecf9  Feature/5275 clean progress bar print (#5470): adds ProgressBar.print() so LightningModule.print() renders alongside the progress bar; also folds in "Trainer.test should return only test metrics" (#5214) and "Fix metric state reset" (#5273)
- 0456b45  Mini refactor for _running_stage access (#5724)
- 863a70c  Add specifics around DeepSpeed docs (#6142): be more specific about DeepSpeed compatibility
Commits on Feb 23, 2021
- ebabe56  Ensure accelerator is valid if running interactively (#5970)
- 1c851b8  Fix misleading tested accuracy values (#5876)
- 45158aa
Commits on Feb 24, 2021
- 09baf29
- 1d9c553  Prune deprecated Trainer arg enable_pl_optimizer (#6163)
- a731269  Prune deprecated metrics for 1.3 (#6161)
- 1b498d1  [Bugfix] Fixed epoch-level schedulers not being called when val_check_interval < 1.0 (#6075), with a test for ReduceLROnPlateau
- 46617d9  Prune deprecated checkpoint arguments (#6162): prefix and mode='auto'
- 8b47527  Prune deprecated EarlyStopping(mode='auto') (#6167)
- 5cf892b
- c33fd52  Update issue template to use Discussions for questions (#6155)
- c7130b7
- b0d1996
Commits on Feb 25, 2021
- 3ed8ef8
- dd2f5a0  Fix for multiple callbacks (#6197): rename a variable clashing with should_stop, skip tests on Windows under DDP
- 3df02b8  Add checkpoint parameter to on_save_checkpoint (#6072)
- 4d96f19  Document exceptions in loggers (#6171)
- ddf55a2
Commits on Feb 26, 2021
- e7298b5
- 0647340  Add mypy typing to precision plugins (#6149)
- ee5032a  apply_func.py: from torchtext.legacy.data import Batch (#6211). In newer torchtext the name Batch is no longer located under torchtext.data, so the old import raises ImportError ("cannot import name 'Batch' from 'torchtext.data'"); the fix imports it from torchtext.legacy.data instead
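The ee5032a message describes a version-compatibility problem: the same name must be imported from whichever location the installed torchtext provides. A generic sketch of that fallback-import pattern (the helper name is hypothetical; the torchtext usage is shown in a comment, and the executed demonstration uses stdlib modules so the sketch stays self-contained):

```python
import importlib

def import_first_available(*candidates):
    """Return the first attribute importable from a list of dotted paths,
    trying them in order."""
    for path in candidates:
        module_name, _, attr = path.rpartition(".")
        try:
            module = importlib.import_module(module_name)
            return getattr(module, attr)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"none of {candidates} could be imported")

# For the torchtext case this would read (not executed here):
#   Batch = import_first_available("torchtext.data.Batch",
#                                  "torchtext.legacy.data.Batch")

# Self-contained demonstration with stdlib modules:
OrderedDict = import_first_available("no.such.Thing", "collections.OrderedDict")
```

Trying the old location first keeps the shim a no-op on versions that never moved the name.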
Commits on Feb 27, 2021
- 40d5a9d  fix(wandb): prevent WandbLogger from dropping values (#5931)
- 111d9c7
Commits on Feb 28, 2021
- 15c477e  Document exceptions for metrics/regression (#6202)
Commits on Mar 1, 2021
- 58a6d59  Simplify skip-if tests 0/n (#5920)
- ce05687
- 8aba885  Document exceptions in profilers (#6229)
- 925f082  Call optimizer.zero_grad() before backward inside the closure in automatic optimization (#6147)
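Commit 925f082 changes where optimizer.zero_grad() runs in automatic optimization: inside the closure, before the backward pass. A toy sketch of that call order with recording stand-ins (these classes are illustrations, not Lightning's actual internals):

```python
calls = []

class RecordingOptimizer:
    """Stand-in optimizer that records the order of calls."""
    def zero_grad(self):
        calls.append("zero_grad")

    def step(self, closure):
        loss = closure()          # optimizers like LBFGS may re-run the closure;
        calls.append("step")      # here it runs once for simplicity
        return loss

opt = RecordingOptimizer()

def closure():
    opt.zero_grad()               # the fix: clear grads *inside* the closure,
    loss = 0.5                    # before the (stand-in) backward pass
    calls.append("backward")      # stands in for loss.backward()
    return loss

opt.step(closure)
# resulting order: zero_grad, backward, step
```

Running zero_grad inside the closure matters for optimizers that evaluate the closure multiple times per step: every evaluation starts from clean gradients instead of accumulating across re-runs.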
- 651c25f  Fix for incorrect usage of detach(), cpu(), to() (#6216, fixes #6214)
- 352e8f0
- ed67490
- 412a7d8
- 6788dba
Commits on Mar 2, 2021
- 3371d32  Docstring changes in tuner (#6264)
- efda48f  Disable CPU offload as the default for DeepSpeed (#6262), for the best throughput/memory efficiency
- dc8647e
- eb81500  Refactor: skipif for multi-GPUs 1/n (#6266)
- 22985d2  Improved EarlyStopping.patience documentation (#6278)
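The patience docs that 22985d2 improves hinge on one subtlety: patience counts validation checks without improvement, not epochs, so with val_check_interval < 1.0 several checks can occur per epoch. A simplified re-implementation of that counting logic (an illustration, not Lightning's actual EarlyStopping callback):

```python
class PatienceCounter:
    """Tracks validation checks without improvement on a monitored loss
    (lower is better), mirroring EarlyStopping-style patience semantics."""

    def __init__(self, patience: int, min_delta: float = 0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.wait = 0

    def update(self, metric: float) -> bool:
        """Call once per validation check; returns True when patience runs out."""
        if metric < self.best - self.min_delta:
            self.best = metric
            self.wait = 0          # any improvement resets the counter
        else:
            self.wait += 1
        return self.wait >= self.patience

counter = PatienceCounter(patience=2)
decisions = [counter.update(v) for v in (1.00, 0.90, 0.95, 0.91)]
# stops on the 4th check: two consecutive checks without beating 0.90
```

Note that 0.91 still counts against patience even though it beats 0.95, because improvement is measured against the best value seen so far, not the previous check.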
Configuration menu - View commit details
-
Copy full SHA for 0f9134e - Browse repository at this point
Copy the full SHA 0f9134eView commit details -
fix duplicate console logging bug v2 (#6275)
Co-authored-by: chaton <thomas@grid.ai> Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Configuration menu - View commit details
-
Copy full SHA for bc577ca - Browse repository at this point
Copy the full SHA bc577caView commit details -
Configuration menu - View commit details
-
Copy full SHA for b46d221 - Browse repository at this point
Copy the full SHA b46d221View commit details -
[fix] Ensure we check deepspeed/sharded in multinode DDP (#6297)
* Ensure we check deepspeed/sharded in multinode * Add CHANGELOG.md * Add CHANGELOG.md * Drop mock, use actual multi-gpu node
Commit 8001987
Commit 38274b9
Commit 24c3a3f
try to fix imports for parsing (#6256)
* try to fix imports * legacy 1.2.1
Commit 7e8f4b9
Refactor: Runif for TPU and Horovod 5/n (#6301)
* TPU * horovod * extra * fix * Apply suggestions from code review Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> * doc Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
Commit ac58378
Commit d1a0315
Add fairscale & deepspeed to skipif 4/n (#6281)
* add fairscale & windows to skipif * add deepspeed to runif * fairscale * deepspeed * flake8 Co-authored-by: Jirka Borovec <jirka.borovec@seznam.cz>
Commit 4157b35
[bugfix] TPU test hangs to barrier on 1 process (#6272)
* update * resolve flake8 * update * update * update changelog * update * resolve flake8 Co-authored-by: Your Name <you@example.com>
Commit 1aac481
Commits on Mar 3, 2021
Commit bf6ba83
Commit dcec4ef
Fix ModelPruning(make_pruning_permanent=True) buffers getting removed when saved during training (#6073)
Co-authored-by: chaton <thomas@grid.ai>
Commit 4a8422c
[bugfix] TPU + all_gather + SingleTPU shouldn't call xm.all_gather (#6296)
* resolve an issue with TPU * update * add changelog
Commit 484dce1
Commits on Mar 4, 2021
drop unused variable in API (#6308)
* drop unused pl model in ckpt * irelevant * on_evaluation_batch_start * evaluation_epoch_end * attach_datamodule
Commit 6166f46
hotfix for PT1.6 and torchtext (#6323)
* ci: azure reinstall torchtext * move * todos * 0.6.0 * skip examples * formatter * skip * todo * Apply suggestions from code review
Commit e038e74
[fix] Use training type plugin hook when saving (FSDP 1/n) (#6321)
* Rely on training type plugin when saving * Add better typing to training type plugin
Commit d01e8fd
Commit 577323c
Add tests/utilities/test_parsing.py (#4460)
* Create branch tests/4400_parsing * Rename test file for parsing.py * Fix lightning_hasattr * Fix lightning_hasattr * Fix lightning_setattr * Add empty lines and remove rubbish spaces * Raise AttributeError not ValueError * Use getattr in hasattr * Remove rubbish spaces * Fix getattr * Fix by flake8 * Add tests for str_to_bool_or_str * Fix by flake8 * Add tests for str_to_bool * Add tests for is_picklable * Add tests for clean_namespace * Fix typo * Fix lightning_getattr * Add tests for AttributeDict * Add tests for flatten_dict * Fix by flake8 * Apply suggestions from code review Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> * Apply isort * Revert "Apply suggestions from code review" * Define unpicklable_function outside * Add comment to test_clean_namespace * Add tests for parse_class_init_keys * Add tests for get_init_args and collect_init_args * Share objects across the tests Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> Co-authored-by: Ethan Harris <ewah1g13@soton.ac.uk>
Commit 48a10f1
Add ignore param to save_hyperparameters (#6056)
* add ignore param to save_hyperparameters * add docstring for ignore * add type for frame object * Update pytorch_lightning/core/lightning.py Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> * Update pytorch_lightning/core/lightning.py Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> * fix whitespace * Update pytorch_lightning/core/lightning.py Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> * Parametrize tests * Update pytorch_lightning/core/lightning.py Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> * Update pytorch_lightning/core/lightning.py Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> * seq * fix docs * Update lightning.py * Update lightning.py * fix docs errors * add example keyword * update docstring Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com> Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Commit 59acf57
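The `ignore` parameter added in #6056 excludes selected init arguments from the saved hyperparameters. The filtering itself can be sketched in plain Python (function and argument names here are illustrative, not Lightning's internals):

```python
def collect_hparams(init_args, ignore=None):
    """Drop the keys named in `ignore` before persisting hyperparameters.

    Accepts a single name or a sequence of names, mirroring the
    string-or-sequence convention described in the PR.
    """
    if ignore is None:
        ignore = []
    elif isinstance(ignore, str):
        ignore = [ignore]
    return {k: v for k, v in init_args.items() if k not in ignore}
```

This is useful for arguments like a loss module or a whole dataset that should not end up in the checkpoint's hparams.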
Fix when _stable_1d_sort to work when n >= N (#6177)
* Fix when _stable_1d_sort to work when n >= N * Apply suggestions Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com>
Commit 5d7388d
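The `_stable_1d_sort` fix (#6177) guards a helper that pads its input up to a fixed length before sorting, so that inputs of length n >= N no longer break. A rough list-based illustration of the pad-sort-trim idea with that guard (the name and padding size are assumptions, not the torch implementation):

```python
def stable_1d_sort(values, n_max=2049):
    """Pad to n_max with +inf for a stable sort, but only when the
    input is shorter than n_max; otherwise sort as-is."""
    n = len(values)
    pad = [float("inf")] * (n_max - n) if n < n_max else []  # guard: no negative padding
    ordered = sorted(values + pad)
    return ordered[:n]  # drop the padding again
```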
Update docs on arg train_dataloader in fit (#6076)
* add to docs * update docs * Apply suggestions from code review * Update pytorch_lightning/core/hooks.py Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> * nested loaders * Apply suggestions from code review Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * shorten text length * Update pytorch_lightning/core/hooks.py Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit 4f90455
missing tests default_root_dir=tmpdir (#6314)
* default_root_dir=tmpdir * miss
Commit b9cf122
Document exception for metrics/classification (#6190)
* document exception for metrics/classification * minor formatting fixes * fix trailing whitespaces * document exception for metrics * Apply suggestions from code review Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> * Apply suggestions from code review Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> * Apply suggestions from code review Co-authored-by: Akihiro Nitta <nitta@akihironitta.com> Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> Co-authored-by: Akihiro Nitta <nitta@akihironitta.com>
Commit 8e3524d
[Fix] Call clip gradients if clip val greater than 0 (#6330)
* Call clip gradients if clip val greater than 0 * format * Format * Move to top of file
Commit 39231ae
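#6330 makes gradient clipping a no-op unless the clip value is positive. The guard, sketched with plain numbers as value clipping rather than torch tensors or norm-based clipping:

```python
def clip_gradients(grads, clip_val):
    """Clamp each gradient into [-clip_val, clip_val]; skip entirely
    when clip_val <= 0, matching the 'only clip if > 0' guard."""
    if clip_val <= 0:
        return grads  # clipping disabled
    return [max(-clip_val, min(clip_val, g)) for g in grads]
```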
[bugfix] Check LightningOptimizer doesn't delete optimizer hooks (#6305)
* update * resolve bug
Commit 7acbd65
docstring changes in accelerators (#6327)
* docstring changes in accelerators * docstrings moved * whitespaces removed * PEP8 correction[1]
Commit 49c579f
[bugfix] Perform reduction for dict in training_step and DP (#6324)
* fix * update * update * add changelog * Update CHANGELOG.md Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> * Update tests/accelerators/test_dp.py Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> * update changelog Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit 248a8e8
Commits on Mar 5, 2021
introduce default cluster environment for lightning-specific ddp (#5915)
* handle distributed_sampler_kwargs * move emptying cache to accelertor * fix a few tests * restoring the result from subprocess * fix queue.get() order for results * add missing "block_backward_sync" context manager * add missing "block_backward_sync" context manager * fix sync_batchnorm * fix supported gpu-ids for tuple * fix clip gradients and inf recursion * accelerator selection: added cluster_environment plugin * fix torchelastic test * fix reduce early stopping decision for DDP * fix tests: callbacks, conversion to lightning optimizer * fix lightning optimizer does not pickle * fix setting benchmark and deterministic option * fix slurm amp test * fix prepare_data test and determine node_rank * fix retrieving last path when testing * remove obsolete plugin argument * fix test: test_trainer_config * fix torchscript tests * fix trainer.model access * move properties * fix test_transfer_batch_hook * fix auto_select_gpus * fix omegaconf test * fix test that needs to simulate slurm ddp * add horovod plugin * fix test with named arguments * clean up whitespace * fix datamodules test * remove old accelerators * fix naming * move old plugins * move to plugins * create precision subpackage * create training_type subpackage * fix all new import errors * fix wrong arguments order passed to test * fix LR finder * Added sharded training type and amp plugin * Move clip grad to precision plugin * Added sharded spawn, select accelerators based on distributed_backend + enable custom fp16 plugin automatically * Fix import issue, attempting to fix tests * Fix initial test * Reflect hook logic from master, should wrap model after move to device * Optional state consolidation, since master has optimizers not wrapped * change attribute for instance test * reset optimizers optimizers are not used in main process, so state would be wrong. 
* legacy * imports in accel * legacy2 * trainer imports * fix import errors after rebase * move hook to new setup location * provide unwrapping logic * fix trainer callback system * added ddp2 implementation * fix imports .legacy * move plugins * restore legacy * drop test.py from root * add tpu accelerator and plugins * fixes * fix lightning optimizer merge * reset bugreportmodel * unwrapping * step routing forward * model access * unwrap * opt * integrate distrib_type * sync changes * sync * fixes * add forgotten generators * add missing logic * update * import * missed imports * import fixes * isort * mv f * changelog * format * move helper to parallel plugin * d * add world size * clean up * duplicate * activate ddp_sharded and tpu * set nvidia flags * remove unused colab var * use_tpu <-> on_tpu attrs * make some ddp_cpu and clusterplugin tests pass * Ref/accelerator connector (#5742) * final cleanup Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * connector cleanup Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * trainer cleanup Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * accelerator cleanup + missing logic in accelerator connector Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * add missing changes to callbacks Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * reflect accelerator changes to lightning module Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * clean cluster envs Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * cleanup plugins Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * add broadcasting Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * yapf * remove plugin connector Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * plugins * manual optimization * update optimizer routing * add rank to torchelastic * fix memory mixed precision * setstate on trainer for pickling in ddp spawn * add predict method * add back commented accelerator code * adapt 
test for sync_batch_norm to new plugin * fix deprecated tests * fix ddp cpu choice when no num_processes are given * yapf format * skip a memory test that cannot pass anymore * fix pickle error in spawn plugin * x * avoid * x * fix cyclic import in docs build * add support for sharded * update typing * add sharded and sharded_spawn to distributed types * make unwrap model default * refactor LightningShardedDataParallel similar to LightningDistributedDataParallel * update sharded spawn to reflect changes * update sharded to reflect changes * Merge 1.1.5 changes * fix merge * fix merge * yapf isort * fix merge * yapf isort * fix indentation in test * copy over reinit scheduler implementation from dev1.2 * fix apex tracking calls with dev_debugger * reduce diff to dev1.2, clean up * fix trainer config test when gpus>0 and num_processes >0 and ddp_cpu * sort plugin tests legacy/new * fix error handling for amp on cpu * fix merge fix merge fix merge * [Feat] Resolve manual_backward (#5837) * resolve manual_backward * resolve flake8 * update * resolve for ddp_spawn * resolve flake8 * resolve flake8 * resolve flake8 Co-authored-by: Ubuntu <ubuntu@ip-172-31-88-60.ec2.internal> * fix tests/accelerator tests on cpu * [BugFix] Resolve manual optimization (#5852) * resolve manual_optimization * update * update Co-authored-by: Ubuntu <ubuntu@ip-172-31-88-60.ec2.internal> * Remove copy trainer parameters to happen earlier within the loop and add safe guard to get ref model (#5856) * resovle a bug * Accelerator refactor sharded rpc (#5854) * rpc branch * merge * update handling of rpc * make devices etc. Optional in RPC * set devices etc. 
later if necessary * remove devices from sequential * make devices optional in rpc * fix import * uncomment everything * fix cluster selection Co-authored-by: Ubuntu <ubuntu@ip-172-31-88-60.ec2.internal> * resolve bug * fix assert in rpc test * resolve a test * fix docs compilation * accelerator refactor - fix for sharded parity test (#5866) * fix memory issue with ddp_spawn * x x x x x x x x x * x * Remove DDP2 as this does not apply * Add missing pre optimizer hook to ensure lambda closure is called * fix apex docstring * [accelerator][BugFix] Resolve some test for 1 gpu (#5863) * update * revert init * resolve a bug * update * resolve flake8 * update * update * update * revert init * resolve a bug * update * resolve flake8 * update * update * update * update * update * revert init * resolve a bug * update * resolve flake8 * update * update * update * revert init * update * resolve flake8 * update * update * update * update * update * all_gather * update * make plugins work, add misconfig for RPC * update * update * remove breaking test * resolve some tests * resolve flake8 * revert to ddp_spawn Co-authored-by: root <root@ip-172-31-88-60.ec2.internal> Co-authored-by: Ubuntu <ubuntu@ip-172-31-88-60.ec2.internal> Co-authored-by: Justus Schock <justus.schock@rwth-aachen.de> * yapf isort * resolve flake8 * fix apex doctests * fix apex doctests 2 * resolve docs * update drone * clean env * update * update * update * update * merge * Fix RPC related tests, clean out old API, update for new accelerator API [skip ci] (#5881) * Fix RPC related tests, clean out old API, update for new accelerator API * Move tests out of legacy folder, update paths and names * Update test_remove_1-4.py * Expose properties for tpu cores/gpus/num_gpus * Add root GPU property * Move properties to properties.py * move tests that were previously in drone * Fix root GPU property (#5908) * Move root GPU to property, remove horovod set as this is handled in horovod plugin, ensure we mock correctly 
to set GPU accelerator * Add missing tests back * fix best model path transfer when no checkpoint callback available * Fix setup hook order [wip] (#5858) * Call trainer setup hook before accelerator setup * Add test case * add new test * typo * fix callback order in test Co-authored-by: tchaton <thomas@grid.ai> Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * rename ddp sequential -> rpc sequential for special test * revert * fix stupid merge problem * abstract the cluster plugins * default plugin * integrate default environment * fix property * adapt tests * adjust test * fix world size access * base cluster env * revert rebase errors * revert rebase errors * missing import * revert unrelated change * remove unused cluster local rank * remove unrelated changes * fix unrelated changes * fix pep8 * remove unused var * reset permissions * ypaf * test default environment * test torchelastic environment * world size as int * tests for slurm environment * changelog * test comments * remove unintended change * keep master port fixed after it is generated * test random master port * yapf * add missing default environment * move helper function * rename default environment * rename * rename * yapf * Update pytorch_lightning/plugins/environments/lightning_environment.py Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> * Update CHANGELOG.md Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com> * spawn -> create Co-authored-by: justusschock <justus.schock@posteo.de> Co-authored-by: SeanNaren <sean@grid.ai> Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com> Co-authored-by: Jirka Borovec <jirka.borovec@seznam.cz> Co-authored-by: Justus Schock <justus.schock@rwth-aachen.de> Co-authored-by: chaton <thomas@grid.ai> Co-authored-by: Ubuntu <ubuntu@ip-172-31-88-60.ec2.internal> Co-authored-by: Sean Naren <sean.narenthiran@gmail.com> Co-authored-by: root <root@ip-172-31-88-60.ec2.internal> Co-authored-by: 
Carlos Mocholí <carlossmocholi@gmail.com>
Commit ec8d46e
[bugfix] Resolve memory leak for evaluation (#6326)
* resolve bug * resolve flake8 * revert name
Commit 46540ee
Update changelog for v1.2.2 (#6325)
* update changelog for v1.2.2 * ckpr 1.2.2 Co-authored-by: Jirka Borovec <jirka.borovec@seznam.cz>
Commit b6aa350
CI: fix examples - patch download MNIST (#6357)
* patch download * CI * isort * extra
Commit e848542
[bug] Fix Pytorch profiler with emit_nvtx (#6260)
* resolve bug * update changelog * Update tests/trainer/test_trainer.py * Update pytorch_lightning/profiler/profilers.py Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> * resolve comments * resolve flake8 Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Commit 2ec67a4
fix importing torchtext batch (#6365)
* copy torchtext batch * update * rev * rev
Commit 2a3ab67
Commit 4f391bc
Commits on Mar 6, 2021
Refactor RunningStage usage in advance of implementing Trainer.validate() (#4945)
* Update code Co-authored-by: EliaCereda * More property updates * Move properties. Introduce trainer._fitting * Use trainer.fitting * Fix reset dataloaders * Unused code * RunningStage.SANITY_CHECKING * Use setters * Fix bugs * Fix bugs * TrainerState.{FITTING,VALIDATING,TESTING,PREDICTING,TUNING} * Fix bugs * Fix bugs * Fix tests * Update CHANGELOG. Add deprecation warning. Fix tests * Unused imports * Optional trainer * More deprecation. More refactoring * Correct version * Use properties * Address comments * flake8 * Missed renamings * Typo * is -> == It is recommended to use for Enums since they are singletons, however, since the LightningEnum subclasses str, it's not a good idea in case a user sets the state/stage with a str * Also for tests * Typo * Address @tchaton's comments * PEP8 * Correct property * Update CHANGELOG * Apply suggestions from code review Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * Update pytorch_lightning/trainer/trainer.py Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * Remove called sanity check Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com> Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit d0596fa
require: adjust versions (#6363)
* adjust versions * release * manifest * pep8 * CI * fix * build
Commit 85c8074
Use f-"""-string in a Trainer comment (#6377)
* Use f-"""-string * Add r * Use Trainer. * r -> noqa: W605
Commit 217470b
Remove no return warning from val/test step (#6139)
* remove warning * auto_opt * chlog * auto_opt * no_warning_call * rm old code * add warning for predict * Apply suggestions from code review Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit facfda8
Fix manual optimization in pl_example (#6373)
* Fix automatic_optimization * Fix automatic_optimization * Uncomment fairscale
Commit 34b733b
Commit 966184a
Commits on Mar 7, 2021
Remove optimizer_idx arg in manual optimization (#6093)
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: chaton <thomas@grid.ai>
Commit 38a5fe7
[doc] Improve Multiple Val/Test Dataloaders with simultaneous batches option (#6320)
* improve doc to describe how to combine batches of multiple test and val dataloaders simultaneously * fix typo Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * use paramref Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit 2708c39
[doc] Fix closure in manual optimization (#6374)
* Fix manual optimization docs * Fix typo. Thanks @import-antigravity
Commit c7f30a2
Fix ModelCheckpoint(monitor=None, save_last=True) not saving checkpoints (#6136)
Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
Commit 826375e
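The fix in #6136 means a `last.ckpt` is written whenever `save_last=True`, even with no monitored metric. A toy sketch of the corrected decision (hypothetical helper, not the callback's real code):

```python
def checkpoint_files(monitor, save_last):
    """Which checkpoint files to write: a 'best' file needs a monitored
    metric, but the 'last' file depends only on save_last."""
    files = []
    if monitor is not None:
        files.append(f"best-{monitor}.ckpt")
    if save_last:
        files.append("last.ckpt")  # previously skipped when monitor was None
    return files
```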
Commits on Mar 8, 2021
* Update tensorboard.py * Update logging.rst * pep8 * Update logging.rst * Update logging.rst * Apply suggestions from code review * add code sample * Update logging.rst Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Commit ff16104
Fix trainer not resetting lightning_optimizers (#6372)
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit 718074b
Commit 0ec7a23
Fix AttributeError: 'NoneType' object has no attribute 'finalize' on TPU (#6221)
* Fix bug Fix AttributeError: 'NoneType' object has no attribute 'finalize' * Update CHANGELOG.md * deleted a period * Update CHANGELOG.md Co-authored-by: Akihiro Nitta <nitta@akihironitta.com> * Update CHANGELOG.md * Update pytorch_lightning/plugins/training_type/tpu_spawn.py Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> Co-authored-by: Akihiro Nitta <nitta@akihironitta.com> Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Commit a6c98c4
Commit 8dabc30
Commit efd272a
* fix * update * fix * move the class outside
Commit e1f5eac
Add check for verbose attribute of ModelCheckpoint (#6419)
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit 9eded7f
Commits on Mar 9, 2021
fixed bug where tuner would not tune lr if also tuning batch_size (#4688)
* fixed bug where tuner would not tune lr if also tuning batch_size * added a '+1' to computing the smoothed loss. This maintains the behavior for the smoothed loss as before the bug fix * pep8 fix * add changelog Co-authored-by: chaton <thomas@grid.ai> Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com> Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit 523c59b
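The "'+1' to computing the smoothed loss" mentioned in #4688 is the bias correction of an exponential moving average: dividing by 1 - beta**(i + 1) rather than 1 - beta**i, which would be zero on the first step. A self-contained sketch of that correction (not the tuner's actual code):

```python
def smoothed_losses(losses, beta=0.98):
    """Bias-corrected exponential moving average of the loss.
    The '+1' in the exponent keeps the first denominator nonzero."""
    avg, out = 0.0, []
    for i, loss in enumerate(losses):
        avg = beta * avg + (1 - beta) * loss
        out.append(avg / (1 - beta ** (i + 1)))  # bias correction
    return out
```

On a constant loss the corrected average reproduces that constant exactly from the first step, which is what makes the correction worth having.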
Commit 75c6486
fix logger creating directory structure too early in DDP (#6380)
* fix * add simple test * fix imports * add changelog * tighter test with on_fit_start hook closer to the dispatch call * move class inside test f unction * add a comment
Commit fc6d402
Commit 55dd3a4
[changelog] Update Changelog on release v1.2.3 (#6444)
* update changelog * legacy 1.2.3 Co-authored-by: Jirka Borovec <jirka.borovec@seznam.cz>
Commit 30d649b
* fix dummy logger * docs * update docs * add changelog * add none return annotation * return empty string for name, version
Commit 615b2f7
Commits on Mar 10, 2021
Raise an exception if check_val_every_n_epoch is not an integer (#6411)
* raise an exception if check_val_every_n_epoch is not an integer * remove unused object * add type hints * add return type * update exception message * update exception message
Commit 74d79e7
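#6411 turns a silent misconfiguration into an explicit error. A sketch of the up-front type check (Lightning raises its own MisconfigurationException; a plain TypeError stands in here):

```python
def validate_check_val_every_n_epoch(value):
    """Reject non-integer values before training starts (sketch)."""
    if not isinstance(value, int):
        raise TypeError(
            f"check_val_every_n_epoch should be an integer, found {value!r}"
        )
    return value
```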
Set find unused parameters to True by default to fix breaking compatibility (#6438)
* Set find unused parameters to True by default to fix breaking models, add suggestion to re-enable * Add changelog
Commit c81b2a8
[bug] All_gather support tensor on cpu (#6416)
* add test * update changelog * update * rename function
Commit 7d4e74c
[Fix] Ensure we set the default device before initializing deepspeed (#6460)
* Ensure we set the default device before initializing deepspeed * Add CHANGELOG.md * Update pytorch_lightning/plugins/training_type/deepspeed.py Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com> Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com>
Commit 1c013b4
Commit d1db604
Commits on Mar 11, 2021
Add Trainer.validate(…) method to run one validation epoch (#4948)
Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com> Co-authored-by: chaton <thomas@grid.ai> Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit f4cc745
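Trainer.validate(...) (#4948) exposes a single validation pass as a public entry point. Stripped of all Trainer machinery, the behavior reduces to one loop over the validation data (a toy sketch, not the real API surface):

```python
def run_validation_epoch(model_step, val_batches):
    """One pass over the validation data, averaging per-batch losses --
    roughly what a single validation epoch computes."""
    outputs = [model_step(batch) for batch in val_batches]
    return sum(outputs) / len(outputs)
```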
Allow user to disable the automatic formatting of checkpoint file names. (#6277)
* cleaning SWA (#6259) * rename * if * test * chlog * Remove opt from manual_backward in docs (#6267) * switch agents pool (#6270) * Allow user to disable the automatic formatting of checkpoint file names. * Added changelog entry. * Made flake8 happy. * Applied review suggestion: quotes for special characters in docstring Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> * Fixed example in docstring. * Fixed syntax error in docstring. Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> Co-authored-by: Akihiro Nitta <nitta@akihironitta.com> Co-authored-by: thomas chaton <thomas@grid.ai> Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit: 2ecda5d
Commit: 079fe9b
Commit: afe0ede
argparse: Add use_argument_group=True (#6088)
* argparse: Add inplace option Replicate in GAN model * datamodule: Deduplicate logic w/ argparser utilities * Update pl_examples/domain_templates/generative_adversarial_net.py Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> * Apply suggestions from code review Co-authored-by: Akihiro Nitta <nitta@akihironitta.com> * Keep docstrings * Correct name * Whitespace * Consistency * fix weird type stuff * try alt - use_argument_group * fix syntax + lint * fix ci errs * fix ci * change examples... still failing w/ "unrecognized arguments: --batch_size" * address review * mnist_datamodule: add some docstrings * argparse: check cls or cls.__init__ for param didn't capture issue, but meh * fix lint * fix no-doc edge case * address review Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> Co-authored-by: Akihiro Nitta <nitta@akihironitta.com> Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com>
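The use_argument_group flag above changes how model arguments are attached to a parser. A stdlib-only sketch of the difference (the helper name mirrors Lightning's add_model_specific_args convention; the body is illustrative):

```python
from argparse import ArgumentParser

def add_model_specific_args(parser, use_argument_group=True):
    if use_argument_group:
        # Attach arguments to a titled group on the *existing* parser:
        # --help stays organized and no new parser object is created.
        target = parser.add_argument_group("MyModel")
    else:
        # Old behavior: wrap the parser in a child via `parents`.
        parser = ArgumentParser(parents=[parser], add_help=False)
        target = parser
    target.add_argument("--batch_size", type=int, default=32)
    target.add_argument("--learning_rate", type=float, default=1e-3)
    return parser

parser = add_model_specific_args(ArgumentParser())
args = parser.parse_args(["--batch_size", "64"])
```

The in-place group avoids the "unrecognized arguments" failures mentioned in the message, since all arguments live on one parser.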
Commit: e886d55
Disable batch transfer in DP mode (#6098)
* add exceptions and test * hook * fix * clean up * clean up * regex * regex * docs * rev * comment and docs * chlog * Apply suggestions from code review Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> * Apply suggestions from code review Co-authored-by: chaton <thomas@grid.ai> * Monkey-patch device count * docs * pep * api_change Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: chaton <thomas@grid.ai>
Commit: c53edce
Commit: 62d4304
[feat] Support iteration-based checkpointing in model checkpoint callback (#6146)
* Update model_checkpoint.py * add tests * Update model_checkpoint.py * Update test_model_checkpoint.py * fix tests * every_n_batches * Update test_model_checkpoint.py * defaults * rm tests * Update model_checkpoint.py * Update test_model_checkpoint.py * Prune deprecated metrics for 1.3 (#6161) * prune deprecated metrics for 1.3 * isort / yapf * Update model_checkpoint.py * add tests * defaults * Update CHANGELOG.md * pre-commit * Update model_checkpoint.py * update defaults * Update test_remove_1-5.py * Update model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * fix tests * Update test_model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * Update test_model_checkpoint.py * ckpt-callback * Update test_model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * validation-end * Update model_checkpoint.py * Update test_model_checkpoint.py * Update test_model_checkpoint.py * Update test_model_checkpoint.py * Update test_model_checkpoint.py * clarify-names - Make names explicit as to which hooks they apply to - Use step instead of batch for consistency with global step * Update model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * Update model_checkpoint.py * mutual-exclusive Make every_n_train_steps and every_n_val_epochs mutually exclusive * fix-default-0 * Update CHANGELOG.md * formatting * make-private make attributes private to the class * rebase Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
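The commit message names two checkpoint triggers, every_n_train_steps and every_n_val_epochs, and makes them mutually exclusive. A minimal sketch of that trigger logic, assuming those parameter names; the class and method names are hypothetical, not the real callback:

```python
class IntervalCheckpoint:
    """Illustrative sketch of step-based vs. epoch-based checkpoint triggers."""

    def __init__(self, every_n_train_steps=0, every_n_val_epochs=0):
        if every_n_train_steps and every_n_val_epochs:
            raise ValueError(
                "every_n_train_steps and every_n_val_epochs are mutually exclusive"
            )
        self.every_n_train_steps = every_n_train_steps
        self.every_n_val_epochs = every_n_val_epochs

    def save_on_train_batch_end(self, global_step):
        # Iteration-based: save every n optimizer steps.
        n = self.every_n_train_steps
        return n > 0 and global_step % n == 0

    def save_on_validation_end(self, current_epoch):
        # Epoch-based: save every n validation epochs.
        n = self.every_n_val_epochs
        return n > 0 and (current_epoch + 1) % n == 0
```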
Commit: cea170e
Commits on Mar 12, 2021
Commit: 6596447
Remove unused mixin attributes (#6487)
* Remove unused mixing attributes * Missing import
Commit: 518c7e4
[doc] Update the order of zero_grad and backward (#6478)
* Fix zero_grad in docs * Fix zero_grad in docs
Commit: 680e83a
Commits on Mar 14, 2021
Commit: b2bcad1
Update docs for limit_predict_batches (#6507)
* add docs and minor updates * docs * fraction
Commit: dcd9dd8
[bug] Update broadcast + reduce decision ModelCheckpoint] (#6410)
* resolve bug * update * update changelog * update PR * Update pytorch_lightning/trainer/connectors/logger_connector/epoch_result_store.py Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> * add todo * resolve issues * resolve flake8 * update * add coverage for reduce * wip * restore back to brodbact * remove test.py * resolve flake8 * update * check world size * resolve test * update * use pytorch version when defined * update on comments * update on comments * flake8 * resolve bugs * Update CHANGELOG.md Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> * update * update * update * update * remove test * update * resolve flake8 * update * update * update * proxy * update * update * resolve typo * prune * update parallel * update Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit: 0544efd
Commits on Mar 15, 2021
Commit: 02fa32b
Commit: 156847b
document exceptions for metrics/functional (#6273)
* document exceptions for metrics/functional * Apply suggestions from code review Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> * Apply suggestions from code review Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> Co-authored-by: Akihiro Nitta <nitta@akihironitta.com>
Commit: 06756a8
Mean Average Precision metric for Information Retrieval (1/5) (#5032)
* init information retrieval metrics * changed retrieval metrics names, expanded arguments and fixed typo * added 'Retrieval' prefix to metrics and fixed conflict with already-present 'average_precision' file * improved code formatting * pep8 code compatibility * features/implemented new Mean Average Precision metrics for Information Retrieval + doc * fixed pep8 compatibility * removed threshold parameter and fixed typo on types in RetrievalMAP and improved doc * improved doc, put first class-specific args in RetrievalMetric and transformed RetrievalMetric in abstract class * implemented tests for functional and class metric. fixed typo when input tensors are empty or when all targets are False * fixed typos in doc and changed torch.true_divide to torch.div * fixed typos pep8 compatibility * fixed types in long division in ir_average_precision and example in mean_average_precision * RetrievalMetric states are not lists and _metric method accepts predictions and targets for easier extension * updated CHANGELOG file * added '# noqa: F401' flag to not used imports * added double space before '# noqa: F401' flag * Update CHANGELOG.md Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> * change get_mini_groups in get_group_indexes * added checks on target inputs * minor refactoring for code cleanness * split tests over exception raising in separate function && refactored test code into multiple functions * fixed pep8 compatibility * implemented suggestions of @SkafteNicki * fixed imports for isort and added types annontations to functions in test_map.py * isort on test_map and fixed typing * isort on retrieval and on __init__.py and utils.py in metrics package * fixed typo in pytorch_lightning/metrics/__init__.py regarding code style * fixed yapf compatibility * fixed yapf compatibility * fixed typo in doc Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> Co-authored-by: mergify[bot] 
<37929162+mergify[bot]@users.noreply.github.com>
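The RetrievalMAP metric introduced above averages, over queries, the average precision of each ranked result list. A pure-Python sketch of average precision for a single query, following the standard definition rather than the Lightning implementation:

```python
def retrieval_average_precision(preds, target):
    """Sort documents by predicted score (descending), then average
    precision@k over the ranks where a relevant document appears."""
    order = sorted(range(len(preds)), key=lambda i: preds[i], reverse=True)
    hits = 0
    precision_sum = 0.0
    for rank, i in enumerate(order, start=1):
        if target[i]:
            hits += 1
            precision_sum += hits / rank
    # As in the commit discussion: queries with no relevant documents score 0.
    return precision_sum / hits if hits else 0.0
```

Mean average precision is then the mean of this value over all queries (grouped by query index, which is what get_group_indexes in the message refers to).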
Commit: 5d73fbb
Commit: eb3ff41
* deprecate metrics * examples * req * docs * Apply suggestions from code review Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> * pep8 Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
Commit: b341b53
[test] lr_find with bs_scale (#6422)
* init test: test_lr_find_with_bs_scale * Update test_lr_finder.py * remove gpu req * try boring model * custom boring model * pep8 * fix typo * Update test_lr_finder.py * typo * typo
Commit: c48fc6a
* Clean up docs and add some explicitness around stages * Apply suggestions from code review Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Commit: 383565d
Commit: ea36ee3
* Update hook lifecycle * Update docs/source/common/lightning_module.rst
Commit: 9c59733
Prune metrics base classes 2/n (#6530)
* base class * extensions * chlog * _stable_1d_sort * _check_same_shape * _input_format_classification_one_hot * utils * to_onehot * select_topk * to_categorical * get_num_classes * reduce * class_reduce * tests
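Two of the pruned utilities named in the message, to_onehot and to_categorical, are simple shape transforms. Pure-Python sketches of what they compute (the real versions operate on tensors):

```python
def to_onehot(labels, num_classes):
    """Integer class labels -> one-hot rows."""
    return [[1 if c == label else 0 for c in range(num_classes)] for label in labels]

def to_categorical(onehot):
    """Inverse transform: one-hot (or probability) rows -> argmax labels."""
    return [row.index(max(row)) for row in onehot]
```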
Commit: 6453091
Custom Plugin is_distributed (#6537)
* return from plugin * dont return for tpu
Commit: 6a14146
Commits on Mar 16, 2021
refactor reading env defaults (#6510)
* change tests * fix * test * _defaults_from_env_vars Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
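The _defaults_from_env_vars decorator named in the message fills in Trainer defaults from the environment. A hypothetical stdlib sketch of the idea, assuming a PL_ prefix convention; the real decorator's name lookup and precedence rules may differ:

```python
import functools
import os

def defaults_from_env_vars(fn):
    """Fill missing keyword arguments from PL_* environment variables."""
    @functools.wraps(fn)
    def wrapper(**kwargs):
        for key, value in os.environ.items():
            if key.startswith("PL_"):
                # Explicit arguments win; env vars only fill the gaps.
                kwargs.setdefault(key[3:].lower(), value)
        return fn(**kwargs)
    return wrapper

@defaults_from_env_vars
def make_trainer_config(**kwargs):
    return kwargs
```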
Commit: 0f07eaf
Prune metric: helpers and inputs 3/n (#6547)
* _basic_input_validation * _check_shape_and_type_consistency * _check_num_classes_binary * _check_num_classes_mc * _check_num_classes_ml * _check_top_k * _check_classification_inputs * _input_format_classification * _reduce_stat_scores * DataType * rest * flake8 * chlog
Commit: a312219
prune warning & deprecation wrapper (#6540)
* docs * wrapper * test * count * flake8
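A deprecation wrapper of the kind pruned here typically makes the old callable warn and delegate to its replacement. A minimal illustrative sketch (names invented for the example):

```python
import functools
import warnings

def deprecated(target):
    """Decorator: the old function warns and forwards to `target`."""
    def decorator(old_fn):
        @functools.wraps(old_fn)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"`{old_fn.__name__}` is deprecated, use `{target.__name__}`",
                DeprecationWarning,
            )
            return target(*args, **kwargs)
        return wrapper
    return decorator

def new_accuracy(preds, target):
    return sum(p == t for p, t in zip(preds, target)) / len(preds)

@deprecated(target=new_accuracy)
def accuracy(preds, target):
    ...  # body unused; the wrapper delegates to new_accuracy
```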
Commit: 555a6fe
Add outputs param for on_val/test_epoch_end hooks (#6120)
* add outputs param for on_val/test_epoch_end hooks * update changelog * fix warning message * add custom call hook * cache logged metrics * add args to docstrings * use warning cache * add utility method for param in sig check * Update CHANGELOG.md Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> * update docstring * add test for eval epoch end hook * add types and replace model ref * add deprecation test * fix test fx name * add model hooks warning * add old signature model to tests * add clear warning cache * sopport args param * update tests * add tests for model hooks * code suggestions * add signature utils * fix pep8 issues * fix pep8 issues * fix outputs issue * fix tests * code fixes * fix validate test * test Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
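To stay backward compatible, the change only passes outputs to hook overrides that declare the parameter, via the "param in sig check" utility the message mentions. A sketch of that check with the stdlib (the helper name is assumed from the message; details are illustrative):

```python
import inspect

def is_param_in_hook_signature(hook, param):
    # Inspect the user's override: only pass `outputs` if it is accepted,
    # so old-style signatures keep working without change.
    return param in inspect.signature(hook).parameters

class OldHooks:
    def on_validation_epoch_end(self):
        pass

class NewHooks:
    def on_validation_epoch_end(self, outputs):
        pass
```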
Commit: b190403
[doc] Add Zero Grad set_to_none=True trick (#6548)
* add trick to doc * update * update path * Update docs/source/benchmarking/performance.rst Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
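In PyTorch the trick is optimizer.zero_grad(set_to_none=True). A torch-free stand-in showing the semantic difference the doc change describes:

```python
class FakeParam:
    # Stand-in for a torch parameter; real code calls
    # optimizer.zero_grad(set_to_none=True) instead.
    def __init__(self, grad=None):
        self.grad = grad

def zero_grad(params, set_to_none=False):
    for p in params:
        if p.grad is None:
            continue
        # set_to_none=True drops the gradient buffer entirely (freeing
        # memory and skipping a fill kernel) instead of writing zeros.
        p.grad = None if set_to_none else 0.0  # 0.0 stands in for grad.zero_()
```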
Commit: 00cd918
Commits on Mar 17, 2021
fix deprecation wrapper & tests (#6553)
* fix deprecation wrapper & tests * flake8
Commit: 297e438
prune metric: accuracy 4/n (#6515)
* prune accuracy * chlog * flake8 * Apply suggestions from code review Co-authored-by: Nicki Skafte <skaftenicki@gmail.com> * wrap * test * test * fix Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
Commit: 2f6ce1a
Commits on Mar 18, 2021
Prune metrics: AUC & AUROC (#6572)
* class: AUC AUROC * func: auc auroc * format * tests
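The functional auc pruned here is, at its core, the trapezoidal rule over sampled curve points; AUROC applies it to (fpr, tpr) points of an ROC curve. A pure-Python sketch of that numeric core (illustrative, not the Lightning code):

```python
def auc(x, y):
    """Area under a sampled curve via the trapezoidal rule."""
    pts = sorted(zip(x, y))  # integrate in order of increasing x
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area
```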
Commit: 9e35f97
[doc] Update Dict Train Loader doc. (#6579)
* update doc * update example
Commit: 8853a36
Prune metrics: precision & recall 6/n (#6573)
* avg precision * precision * recall * curve * tests * chlog * isort * fix
Commit: 38a2119
Update Changelog for v1.2.4 (#6581)
* Update changelog for v1.2.4 * lagacy v1.2.4 * prune duplicates from changelog Co-authored-by: Jirka Borovec <jirka.borovec@seznam.cz>
Commit: b606171
[Fix] Move init dist connection into the setup function (#6506)
* Move connection setup into the setup function. Call setup hook after we set up the accelerator * Added CHANGELOG.md * fix setup order in callback test * fix input arguments in test * Mock distributed function, remove protection to turn into training type hook * Remove import * Add missing mock, ensure custom plugin does not create children process * Skip test on windows * Update deepspeed to init connection in setup * Do not initialize distributed module * Move DeepSpeed tests to special tests since dist communication is being set up * Special the test to see if this fixes CI * Delete accelerator connector test to see if its causing build to fail * Delete deepspeed test * Revert "Delete accelerator connector test to see if its causing build to fail" This reverts commit edde60b * Revert "Delete deepspeed test" This reverts commit 9d317429 * Reverse hook * Reverse setup hooks to debug again * Add todo so i know where i left off * For single device move in pre_dispatch after setup function * Add additional model to device hook if any additional parameters have been set * See if we can enable deepspeed tests * Revert "See if we can enable deepspeed tests" This reverts commit b5450de * See if this hook approach works * Introduce new granular hooks * Remove import, fix tpu spawn by moving the function to setup * Added missing special test Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit: 4e9b453
Commits on Mar 19, 2021
Commit: 983a888
Commit: 87c03b1
* add NVIDIA flows * push * pull * ... * extras * ci prune * fix * tag * . * list
Commit: 5780796
Automatically set sync_batchnorm for training_type_plugin (#6536)
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: Roger Shieh <sh.rog@protonmail.ch> Co-authored-by: Kaushik Bokka <kaushikbokka@gmail.com>
Commit: 3b72bcc
Prune metrics: other classification 7/n (#6584)
* confusion_matrix * iou * f_beta * hamming_distance * stat_scores * tests * flake8 * chlog
Commit: 3a56a60
Commits on Mar 20, 2021
Commit: cb59039
Add AMP for validation, prediction and testing (#6565)
* Add Tests for val and test-steps * Add native AMP * pep8 tests * pep8 plugin * changelog
Commit: 634d831
Commits on Mar 21, 2021
Add trainer.predict config validation (#6543)
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit: 37f22c9
Commit: 42a7b70
Commit: 51c9260
Commits on Mar 22, 2021
Commit: 870247f
Commit: 853523e
Allow training type plugin to delay optimizer creation (FSDP 2/n) (#6331)
Commit: 58c9fa7
Add teardown method to BaseProfiler. (#6370)
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: ananthsub <ananth.subramaniam@gmail.com>
Commit: e2e1de0
Commit: 1fae10a
Commit: e62c7c7
[refactor] Add setup to profilers + _run_stage_setup to trainer 2/5 (#6633)
* add setup * update * updates on comment * Minor changes * Extra import * Docs Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com>
Commit: 2064ece
Commits on Mar 23, 2021
fix comparing versions (#6434)
* fix comparing versions * chlog * . * ... * datasets
Commit: 8cd75a4
Prune metrics: regression 8/n (#6636)
* explained_variance * tests * mean_absolute_error * mean_squared_error * mean_relative_error * mean_squared_log_error * chlog
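The pruned regression functionals reduce to short elementwise formulas. Pure-Python sketches of three of them, following the standard definitions rather than the tensor implementations:

```python
from math import log

def mean_squared_error(preds, target):
    return sum((p - t) ** 2 for p, t in zip(preds, target)) / len(preds)

def mean_absolute_error(preds, target):
    return sum(abs(p - t) for p, t in zip(preds, target)) / len(preds)

def mean_squared_log_error(preds, target):
    # Squared error on log(1 + x); inputs must be non-negative.
    return sum((log(1 + p) - log(1 + t)) ** 2 for p, t in zip(preds, target)) / len(preds)
```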
Commit: efce2b7
Commit: f93414d
Refactor base profilers 3/5 (#6621)
Co-authored-by: tchaton <thomas@grid.ai>
Commit: 36d180e
Commit: a74909a
* add predict_step * Update predict_loop.py * Update trainer.py * Update trainer.py * resolve bugs * update * update * update * resolve bug * resolve some failing tests * udpate tests * update * resolve tests * add a test * remove typo * add a test for attachement * update * changed to on_train_dataloader * remove __flash_special_attr__ * resolve tests * update * update * update * update on comments * Update pytorch_lightning/trainer/data_loading.py Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com> Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Commit: 0995d30
Commit: 3cf0c31
Refactor PyTorch profiler 4/5 (#6349)
Co-authored-by: thomas chaton <thomas@grid.ai>
Commit: 51b10f7
Add PyTorch 1.8 Profiler 5/5 (#6618)
* Refactor profilers * Update PassThrough * WIP - This is broken and will change * Update pytorch_lightning/profiler/pytorch.py Co-authored-by: thomas chaton <thomas@grid.ai> * resolve tests * resolve tests * find output * try something * update * add support for test and predict * update * update * use getattr * test * test * update * tests * update * update * update * update * update * remove file * update * update * update * update * update * test * update# * update * update tests * update * add suport for 1.8 * rename records * add support for 1.8 * update * resolve flake8 * resolve test * Refactor basic profilers * Fixes * Unused import * Introduce setup * Profile on all ranks. Print to stdout on 0 * Introduce dirpath + filename * CHANGELOG * Add tests. Address comments * add `on_run_stage_setup` * add on_run_stage_setup function * update * add test for RegisterRecordFunction * update lightnng flow direction * move variable to private * remove trace * Undo code that should be in 3/4 * Multi-stage multi-rank * 2/5 changes * Pass stage in __del__ * Remove TODOs * Describe on_evaluation_end. Add tests * Typo * Address comments * deepcopy tests * Advanced teardown * Fix teardown test * Fix tests * Minor change * Update CHANGELOG.md * Fix test * Quick fixes * Fix 6522 * resolve ddp tests * resolve tests * resolve some tests * update tests * resolve tests * update * resolve tests * resolve some tests * Missed fixes from 3/5 * Fixes * resolve some tests * resolve test for 1.7.1 * Broken refactor * Missed stage * Minor changes * resolve tests * Update CHANGELOG * resolve bug * remove print * Typo * Cleanup * resolve ddp test * remove barrier * update profiler * update * Smaller model * update * resolve tests * update * Minor changes. CHANGELOG * Minimize diff * update to 1.8.1 * RunIf. Extra code. Check segfault * resolve tests * Typo. 
Bad merge * Fixing a bad merge * replace for kineto * Update pytorch_lightning/profiler/pytorch.py Co-authored-by: ananthsub <ananth.subramaniam@gmail.com> * Update pytorch_lightning/profiler/pytorch.py Co-authored-by: ananthsub <ananth.subramaniam@gmail.com> * Minor changes * Bad merge * Use lists for flexibility * Use sets * predict_step * Ananth's suggestion * update * Docs * Update pl_examples/basic_examples/profiler_example.py Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com> * update example * update example Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com> Co-authored-by: ananthsub <ananth.subramaniam@gmail.com> Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Commit: fd5cb7f
update coverage config (#6524)
* update coverage config * parallel * parallel * Apply suggestions from code review * Apply suggestions from code review * paralel * paralel * paralel * combine * combine * . * .. * .. * .. * rev * cb * cb * drop * drop * . * .. * ... * ... * ... * .
Commit: 64d0fa4
Commit: 741c452
Commit: b1e3dcc
Commits on Mar 24, 2021
Prune metrics: others 11/DoNe (#6659)
* classif * grad_img * nlp * ssl * format
Commit: 70beddf
fix: update example autoencoder.py to reflect args (#6638)
* fix: update example autoencoder.py to reflect args * Update pl_examples/basic_examples/autoencoder.py Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit: cbca6cd
Commit: 5733889
Feature/double precision (#6595)
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com> Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
Commit: d02fe34
* Remove E231 from ignore list * Follow E231 * Update pytorch_lightning/trainer/data_loading.py
Commit: ac60536
Commit: ab4c838
Commit: d471fa3
MetricsHolder clean-up + typing (#6645)
* Metrics holder cleanup and better error message * Update pytorch_lightning/trainer/connectors/logger_connector/logger_connector.py * _VALUE -> _METRIC_TYPE
Commit: 2dd6f9e
Commits on Mar 25, 2021
Commit: b8ef52b
Fix checkpoint callback & Trainer.test(_) issue for TPUs (#6654)
* Fix checkpoint callback issue for TPUs * update changelog * add barrier * apply code suggestions * update trainer test * remove spaces * fix tpu tests * Apply suggestions from code review * add comment Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Commit: 2cbdc01
Commit: 92a1671
Support teardown hook on DataModule (#4673)
Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com> Co-authored-by: chaton <thomas@grid.ai>
Commit: 40976e4
Add on_epoch_start to run at the beginning of every loop irrespective of train/val/test (#6498)
* update docs * add hook and update docs * update tests * chlog * Update CHANGELOG.md Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com> * chlog Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
Commit: 9be092d
* use external deprecate * simplify * simplify * simplify * flake8 * . * others * .
Commit: 217c12a
Resolve schedule step bug for PyTorch Profiler (#6674)
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit: 0ea8f39
Add artifcact_location arg to MLFlow logger (#6677)
* Add artifcact_location arg to MLFlow logger * Add CHANGELOG URL * Update test
Commit: 6b990f3
Commits on Mar 26, 2021
Commit: bc61361
Commit: b730a5a
Commit: 21fc5eb
[warning] Add warning when values are not being reduced (#6417)
* add warning non reduced * add test * update test * update changelog * Update pytorch_lightning/trainer/connectors/logger_connector/epoch_result_store.py Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com> * update Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com> Co-authored-by: Justus Schock <12886177+justusschock@users.noreply.github.com>
Commit: 0e45220
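The warning this commit adds fires when logged values reach epoch aggregation without being reduced, so they are handed back as-is. A minimal sketch of that fallback with Python's `warnings` module (the function name and message are illustrative, not Lightning's real internals):

```python
import warnings


def reduce_logged(values, reduce_fx=None):
    """Hypothetical epoch-end aggregation: warn when values are returned
    unreduced because no reduction function was provided."""
    if reduce_fx is None:
        warnings.warn(
            "Logged values are not being reduced; returning them as-is.",
            UserWarning,
        )
        return values
    return reduce_fx(values)


# With a reduction function, the values collapse to a scalar (here: the mean).
print(reduce_logged([1.0, 2.0, 3.0], reduce_fx=lambda v: sum(v) / len(v)))
```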
Commits on Mar 28, 2021
Commit: f0c5479
Commits on Mar 29, 2021
Commit: dcf6e4e
More explicit exception message when testing with fast_dev_run=True (#6667)
Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit: cca0eca
* support python 3.9 * update CI * onnxruntime * . * . * onnxruntime * t 55 * t 75 * add script * use * onnx * onnx * onnx * whl * np * find * 21 * Apply suggestions from code review * Apply suggestions from code review * onnx * CI * req * ~ dockers * min * . * drop horovod * drop horovod * drop horovod * fix * fix * .
Commit: 5b5a5cc
[TPU] update is_tpu_exists utils internal logic to rely on xmp.spawn (#6719)
* update_logic * update * Update tests/utilities/test_xla_device_utils.py * Update pytorch_lightning/utilities/xla_device.py Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com> * Update pytorch_lightning/utilities/xla_device.py Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com> * update test * Update tests/utilities/test_xla_device_utils.py * update * Apply fix * Docstring * flake8 * update Co-authored-by: Your Name <you@example.com> Co-authored-by: Kaushik B <45285388+kaushikb11@users.noreply.github.com> Co-authored-by: Carlos Mocholi <carlossmocholi@gmail.com>
Commit: 3a4c424
[refactor] Move save_function to accelerator 1/n [DeepSpeed] (#6689)
* move save_checkpoint responsibility to accelerator * update
Commit: 646cf2f
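Moving `save_checkpoint` onto the accelerator means the trainer no longer writes checkpoints itself; it delegates down a chain so a training-type plugin (e.g. DeepSpeed) can override the final step. A sketch of that delegation with hypothetical stand-in classes:

```python
class TrainingTypePlugin:
    """Stand-in plugin; a DeepSpeed plugin would override save_checkpoint
    to write sharded engine state instead of a single file."""

    def __init__(self):
        self.saved = None

    def save_checkpoint(self, checkpoint, filepath):
        self.saved = (filepath, checkpoint)


class Accelerator:
    def __init__(self, plugin):
        self.training_type_plugin = plugin

    def save_checkpoint(self, checkpoint, filepath):
        # The accelerator owns saving now, deferring to its plugin.
        self.training_type_plugin.save_checkpoint(checkpoint, filepath)


class Trainer:
    def __init__(self, accelerator):
        self.accelerator = accelerator

    def save_checkpoint(self, filepath):
        checkpoint = {"state_dict": {}}  # the real trainer builds a full dump here
        self.accelerator.save_checkpoint(checkpoint, filepath)


plugin = TrainingTypePlugin()
Trainer(Accelerator(plugin)).save_checkpoint("model.ckpt")
```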
[Model Parallel] Add configure sharded model hook (#6679)
* Add base hook for model parallel * fix callback signature * Simplify hook * Add hook logic * add tests * add property setter * add logic for being called once * Update changelog * Fix * fix return type * fix lambda callback test * Fix tests * Apply code suggestions * add logic for setup_optimizers_predispatch * add common dummy model * Swap call order * Remove test that isn't needed anymore * Update tests * Add a bit more doc * Few code review fixes * Update pytorch_lightning/accelerators/accelerator.py Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com> * Change hook name * Fix test * Test setup hook, refactor names * Swap call order of callbacks and model initialization * Change name of context manager Co-authored-by: SeanNaren <sean@grid.ai> Co-authored-by: Sean Naren <sean.narenthiran@gmail.com> Co-authored-by: Carlos Mocholí <carlossmocholi@gmail.com>
Commit: f79a13e
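Two details from the commit message above are worth making concrete: the hook runs inside a context manager (so a plugin can intercept layer instantiation for sharding) and it carries "logic for being called once". A minimal sketch of both, with hypothetical names standing in for the real trainer plumbing:

```python
from contextlib import contextmanager


class Module:
    def __init__(self):
        self.call_count = 0
        self._configured = False

    def configure_sharded_model(self):
        # User hook: instantiate layers here so the plugin can shard them.
        self.call_count += 1


@contextmanager
def model_sharded_context():
    # A real plugin would patch module creation here (e.g. deepspeed.zero.Init).
    yield


def call_configure_sharded_model(module):
    """Hypothetical trainer logic: run the hook inside the sharding context,
    but only once per module."""
    if not module._configured:
        with model_sharded_context():
            module.configure_sharded_model()
        module._configured = True


m = Module()
call_configure_sharded_model(m)
call_configure_sharded_model(m)  # second call is a no-op
```

Guarding with a flag rather than relying on callers keeps repeated `fit()` calls from re-instantiating an already sharded model.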
Commit: 3c86193
Commits on Mar 30, 2021
Commit: 9044470
Commit: 583fcf2