This repository has been archived by the owner on Oct 9, 2023. It is now read-only.
Hi @betairylia, thanks for creating this issue and for the additional context. Apart from `_patch_dropout_layers`, a few other things have changed in baal. Our CI currently tests against baal 1.3.2, and I'll be working on making Flash work with baal's latest release.
Meanwhile, if it's urgent and you need it to work now, could you downgrade baal to 1.3.2? `pip install baal==1.3.2`
🐛 Bug
I could not import
flash.image.classification.integrations.baal.ActiveLearningDataModule
as described in https://devblog.pytorchlightning.ai/active-learning-made-simple-using-flash-and-baal-2216df6f872c

To Reproduce
Code sample
gives
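The original code sample and error output were not preserved in this archive; a minimal reproduction based on the import path above would look like the following (the try/except is only there to surface the error message rather than crash):

```python
# Minimal reproduction: attempt the import that fails with newer baal releases.
import_error = None
try:
    from flash.image.classification.integrations.baal import ActiveLearningDataModule
except ImportError as exc:
    # On affected setups (baal >= 1.5), the failure traces back to a
    # symbol removed from baal, such as `_patch_dropout_layers`.
    import_error = exc

if import_error is not None:
    print(f"import failed: {import_error}")
```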
Expected behavior
The modules are imported without errors.
Environment
BaaL
1.5.2

Additional context
It seems the baal team has removed
_patch_dropout_layers
: baal-org/baal@1006861 (baal/bayesian/dropout.py
). Is this relevant, or am I doing something wrong?