Dir clean up on delete #371

Open
Callisto13 opened this issue Jan 26, 2022 · 9 comments
Labels
help wanted (Requires help from contributors to get done), kind/bug (Something isn't working), priority/backlog (Higher priority than priority/awaiting-more-evidence.)

Comments

@Callisto13
Member

Callisto13 commented Jan 26, 2022

I don't know whether this was intentional, but deletes used to clear the dir structure back up to the namespace; now we leave the name dir behind as well.

Close if this is meant to be (it does look weird with dead capmvm names lying around)

Before delete:

/var/lib/flintlock/vm/<namespace>/<name>/<uid>

After delete:

/var/lib/flintlock/vm/<namespace>/<name>

Ideally we would remove back to

/var/lib/flintlock/vm/<namespace>

if all identically named mvms in that namespace are removed, and

/var/lib/flintlock/vm

if all mvms in that namespace are removed
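A minimal sketch of that walk-back, assuming a hypothetical `pruneEmptyParents` helper (names are illustrative, not flintlock's actual API):

```go
package main

import (
	"os"
	"path/filepath"
	"strings"
)

// pruneEmptyParents removes dir and then each parent in turn, stopping
// at root or at the first directory that still has contents. os.Remove
// (unlike os.RemoveAll) refuses to delete a non-empty directory, so a
// namespace dir that still holds other mvms is left alone.
func pruneEmptyParents(dir, root string) {
	for dir != root && strings.HasPrefix(dir, root) {
		if err := os.Remove(dir); err != nil {
			return // non-empty (or in use): stop walking up
		}
		dir = filepath.Dir(dir)
	}
}
```

After deleting `/var/lib/flintlock/vm/<namespace>/<name>/<uid>`, the delete step would call `pruneEmptyParents(filepath.Join(root, namespace, name), root)`.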

@Callisto13 added the kind/bug and good first issue labels Jan 26, 2022
@Callisto13 removed the good first issue label Feb 3, 2022
@yitsushi
Contributor

yitsushi commented Feb 3, 2022

Closing, this will be part of #90

@yitsushi closed this as completed Feb 3, 2022
@richardcase reopened this Jul 20, 2022
@richardcase added the priority/backlog label Jul 20, 2022
@Callisto13 added the help wanted label Jul 20, 2022
@LutzLange

I have a lot of old directories lying around on my Liquid Metal host and need to clean them up manually, like this:
$ find /var/lib/flintlock/vm/default/ -type d -empty -delete

It would be nice if that were not the case.

@yitsushi
Contributor

yitsushi commented Jul 20, 2022

At large scale this can be problematic without a locking mechanism. For
example: we find /var/lib/flintlock/vm/<namespace>/ is empty and try to
delete it, but at the same time another VM is being reconciled and wants to
use that directory, and its ShouldDo function has already said we don't need
to create it.

Or we can live with that. If we care about FS garbage (and I assume we would
be happier if we cleaned up containerd as well), I think the simplest solution
is to add a check in deleteDirectory -> Do(ctx) that walks back up until it
hits flintlock's vm root directory. This approach is ugly, though: right now
steps have no concept of the application or its configuration; they do their
one job without extra context or knowledge of other steps, and that's a good
thing.

We can't just add every parent directory to the delete list either. It would
not delete a directory that's not empty, because the ShouldDo pre-check will
tell the system "hey, there are VMs in that directory", but if we mess up one
check it could delete other VMs accidentally (as it uses RemoveAll).

And that leads back to my original proposal to write a background goroutine or
cron job that does a clean-up periodically.
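A rough sketch of such a background clean-up loop (all names here are illustrative, not flintlock's actual code), which, as noted above, would still need locking against concurrent creates:

```go
package main

import (
	"context"
	"io/fs"
	"os"
	"path/filepath"
	"time"
)

// runCleanupLoop periodically sweeps empty directories under root.
// Without a lock shared with the create path, this still races with
// in-flight reconciles, as described above.
func runCleanupLoop(ctx context.Context, root string, interval time.Duration) {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		select {
		case <-ctx.Done():
			return
		case <-ticker.C:
			sweepEmptyDirs(root)
		}
	}
}

// sweepEmptyDirs removes empty directories bottom-up, roughly
// `find root -type d -empty -delete`, but never removes root itself.
func sweepEmptyDirs(root string) {
	var dirs []string
	filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
		if err == nil && d.IsDir() && path != root {
			dirs = append(dirs, path)
		}
		return nil
	})
	// WalkDir visits parents before children, so iterate in reverse to
	// delete the deepest dirs first; os.Remove fails harmlessly on
	// anything non-empty.
	for i := len(dirs) - 1; i >= 0; i-- {
		os.Remove(dirs[i])
	}
}
```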

@Callisto13
Member Author

👍 thanks for recapping @yitsushi, i knew there was "stuff" we had to consider for this 😁

@Callisto13
Member Author

I think a background routine which periodically locks the filesystem against creates, recursively removes anything empty, then unlocks would be fine for now.
The remove operations won't be massively slow, so any parallel incoming calls to create should be fine waiting for that second.
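A minimal sketch of that, assuming a single mutex shared between the create path and the sweeper, and reusing the hypothetical sweepEmptyDirs from the earlier sketch:

```go
import (
	"os"
	"sync"
)

var fsMu sync.Mutex // shared by the create path and the sweeper

// createVMDir is the create path; holding the lock means the sweeper
// can't remove a freshly-created parent out from under it.
func createVMDir(dir string) error {
	fsMu.Lock()
	defer fsMu.Unlock()
	// MkdirAll is idempotent, so no "does it exist" pre-check needed.
	return os.MkdirAll(dir, 0o755)
}

// periodicSweep holds the lock for the whole recursive removal;
// parallel creates just block for that second.
func periodicSweep(root string) {
	fsMu.Lock()
	defer fsMu.Unlock()
	sweepEmptyDirs(root) // hypothetical helper from the sketch above
}
```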

> their ShouldDo function said we don't have to create that directory.

that check is weird tbh. doesn't cost anything to create over an existing one
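For what it's worth, os.MkdirAll already returns nil when the path exists as a directory, so that pre-check could go away entirely (vmDir here is just an illustrative name):

```go
// os.MkdirAll is a no-op, returning nil, when vmDir already exists
// as a directory, so creating over an existing one costs nothing.
func ensureDir(vmDir string) error {
	return os.MkdirAll(vmDir, 0o755)
}
```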

@LutzLange

I'm creating and deleting clusters over and over again in my demo environment, and I need to clean up from time to time:

$ tree -d 1 /var/lib/flintlock/vm/default/
1 [error opening dir]
/var/lib/flintlock/vm/default/
├── dev15-control-plane-b7fzn
├── dev16-control-plane-m8zhd
├── dev20-control-plane-m76bn
├── dev21-control-plane-plsdk
├── dev25-md-0-l4k8d
├── dev25-md-0-vgzrp
├── dev26-md-0-855rp
│   └── 01G7EMFEMWWXGZTWJ6WJ7WVN46
└── dev26-md-0-k8mwl
    └── 01G7EMFEGAXAAY4E03K7TJDSH6

@github-actions
Contributor

This issue is stale because it has been open 60 days with no activity.

@github-actions bot added the lifecycle/stale label May 19, 2023
@yitsushi
Contributor

Still valid.

@github-actions
Contributor

This issue was closed because it has been stalled for 365 days with no activity.

@github-actions bot added the lifecycle/rotten label May 19, 2024
@richardcase reopened this Jul 10, 2024
@richardcase removed the lifecycle/stale and lifecycle/rotten labels Jul 10, 2024