Fix pretraining_transforms for ImageEmbedder #1196
Conversation
Codecov Report
@@ Coverage Diff @@
## master #1196 +/- ##
==========================================
+ Coverage 90.99% 91.11% +0.12%
==========================================
Files 284 284
Lines 12740 12755 +15
==========================================
+ Hits 11593 11622 +29
+ Misses 1147 1133 -14
Update: looks like it's still not called. WIP :(
Awesome!!! LGTM, just a few small comments
tests/image/embedding/test_model.py (outdated)
"backbone, training_strategy, head, pretraining_transform",
[
    ("vision_transformer", "simclr", "simclr_head", "simclr_transform"),
    ("vision_transformer", "dino", "dino_head", "dino_transform"),
    ("vision_transformer", "barlow_twins", "simclr_head", "barlow_twins_transform"),
    ("vision_transformer", "swav", "swav_head", "swav_transform"),
],
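For context, the tuples in the diff above are pytest parametrization arguments. A minimal sketch of how such a parametrized test is wired up (the test name and body here are illustrative assumptions, not the actual Flash test, which would build an ImageEmbedder and run a short fit loop):

```python
import pytest

# Illustrative sketch mirroring the parametrized tuples in the diff above.
# Each entry is (backbone, training_strategy, head, pretraining_transform).
CONFIGS = [
    ("vision_transformer", "simclr", "simclr_head", "simclr_transform"),
    ("vision_transformer", "dino", "dino_head", "dino_transform"),
    ("vision_transformer", "barlow_twins", "simclr_head", "barlow_twins_transform"),
    ("vision_transformer", "swav", "swav_head", "swav_transform"),
]

@pytest.mark.parametrize("backbone, training_strategy, head, pretraining_transform", CONFIGS)
def test_embedder_transform_naming(backbone, training_strategy, head, pretraining_transform):
    # The real test would instantiate ImageEmbedder with these arguments; here
    # we only check the convention that the transform name follows the strategy.
    assert pretraining_transform == f"{training_strategy}_transform"
```

Parametrizing this way runs the same test body once per (backbone, strategy, head, transform) combination, so a regression in any one training strategy shows up as an individual test failure.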
Nice!
@ethanwharris - How do we get rid of the codecov failures? Can you please help with that? Also, these tests are failing: https://dev.azure.com/PytorchLightning/lightning%20Flash/_build/results?buildId=59765&view=logs&j=fb683405-d979-52da-6de9-2541dff429a6
With the error: Resource punkt not found. Please use the NLTK Downloader to obtain the resource. The suggested fix from the error:
import nltk
nltk.download('punkt')
Should we try downloading it in CI?
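As a sketch of the fix suggested by that CI error (the helper below is an assumption for illustration, not the change actually made in this PR), the punkt download can be made idempotent so CI only fetches the data when it is missing:

```python
def ensure_punkt():
    """Download the NLTK 'punkt' tokenizer data only if it is not already present."""
    # Imported lazily so environments without nltk can still import this module.
    import nltk

    try:
        nltk.data.find("tokenizers/punkt")
    except LookupError:
        nltk.download("punkt", quiet=True)
```

Calling ensure_punkt() at the top of the test session (e.g. in a conftest fixture) avoids re-downloading on every CI run while still fixing the "Resource punkt not found" failure.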
Thank you @ethanwharris for fixing the failures! All the tests pass now 🚀. Regarding the DeepSource failures, I think we can safely ignore them for now (is there a way to auto-merge with these failures?). I have a minor suggestion though: https://deepsource.io/gh/PyTorchLightning/lightning-flash/run/cc57fc8b-c459-4137-8c24-69f7ddc9b119/python/PYL-W0221 - I think DeepSource is right here, and we can make the overriding method's signature match the parent's.
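For context, PYL-W0221 fires when an override changes the parent's parameter list. A generic sketch of the kind of fix DeepSource suggests (the class and method names here are hypothetical, not Flash's actual code):

```python
class Base:
    def fit(self, data, epochs=1):
        return f"base fit on {len(data)} items for {epochs} epochs"

# Before: triggers W0221, because the override drops a parameter
# that the parent accepts, so the two are not interchangeable.
class BadChild(Base):
    def fit(self, data):  # signature differs from Base.fit
        return super().fit(data)

# After: the override keeps the parent's signature, so callers can
# substitute either class without breaking (Liskov substitution).
class GoodChild(Base):
    def fit(self, data, epochs=1):
        return super().fit(data, epochs=epochs)
```

Keeping signatures aligned also helps static analysis and IDE completion, since the subclass advertises exactly the same call contract as the base class.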
Co-authored-by: Ethan Harris <ethanwharris@gmail.com>
What does this PR do?
Attempts to fix #1185. Before this, pretraining_transforms was never called.
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in Github issues there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃