From what I can tell, the bug is that if you provide a relative path to the DeltaTable() constructor, then vacuum doesn't delete any files, but it works if you pass an absolute path. It seems we've probably only tested with the latter.
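Until that fix lands, one workaround consistent with the diagnosis above is to resolve the path to an absolute one before constructing the table. This is a minimal sketch; the helper name is hypothetical, and it only normalizes the path string:

```python
from pathlib import Path

def to_absolute_table_uri(table_path: str) -> str:
    """Resolve a possibly-relative table path to an absolute one.

    Hypothetical workaround: since vacuum reportedly works with
    absolute paths, normalizing the path before passing it to
    DeltaTable() should sidestep the relative-path bug.
    """
    return str(Path(table_path).resolve())

# Usage sketch (deltalake import omitted):
#   dt = DeltaTable(to_absolute_table_uri("./my_table"))
```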
@wjones127 - looks like you added a fix already, awesome ;)
I verified in the notebook that vacuum does work as anticipated with absolute paths, so this is a relative-path bug, as you expected. Thanks for the rapid response time!!
Environment
Delta-rs version: 0.5.8
Binding: Python
Environment: localhost
Bug
What happened: Ran
dt.vacuum(retention_hours=0, enforce_retention_duration=False, dry_run=False)
and the files didn't actually get deleted.

What you expected to happen: I thought the stale files would actually get deleted.
How to reproduce it: Here's the notebook to reproduce this.
More details: enforce_retention_duration was added recently, and it successfully lets me bypass the retention duration error I was facing before. Let me know if I'm missing something here and there's a way to get the files vacuumed with the existing code.
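The enforce_retention_duration flag discussed above can be pictured with a small sketch. This is illustrative only, not the delta-rs implementation: the function name, constant, and error message are assumptions; it just shows how such a flag would skip a minimum-retention check:

```python
DEFAULT_RETENTION_HOURS = 168  # Delta's conventional minimum: 7 days

def check_retention(retention_hours: float,
                    enforce_retention_duration: bool = True) -> None:
    """Illustrative retention check (hypothetical, not delta-rs source).

    With enforcement on, a retention period below the configured minimum
    is rejected; enforce_retention_duration=False skips the check, which
    is why retention_hours=0 no longer raises an error.
    """
    if enforce_retention_duration and retention_hours < DEFAULT_RETENTION_HOURS:
        raise ValueError(
            f"Invalid retention period of {retention_hours} hours; "
            f"minimum is {DEFAULT_RETENTION_HOURS} hours"
        )
```

Passing the check is of course separate from the files actually being deleted, which is where the relative-path bug comes in.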