This repository has been archived by the owner on Jul 5, 2021. It is now read-only.
I'll use Glacier to do day-by-day backups.
In this scenario it would be nice to have the ability to schedule deletion of files older than X days (where X > 90, to avoid the early-deletion fee :-).
Do you think this feature could be useful to others as well?
Is this a feature of Amazon Glacier? I know you can use S3 lifecycle rules to automatically move old files from your S3 buckets into a Glacier vault, but I'm not aware of any lifecycle features on the vault itself.
Can you give a URL to the Glacier documentation for this?
Most people store metadata for their Glacier archives externally. That makes it possible to query the metadata on a schedule and determine which archives to delete. A simple workflow would be something like:
Upload to Glacier
Store metadata (could use a small DynamoDB table)
A scheduled Lambda function queries the DynamoDB table, receives the list of archives, and determines which are older than X days. That Lambda function could then invoke another Lambda function to delete those archives on your behalf.
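The age-filtering step of that workflow could be sketched roughly as follows. This is only an illustration, not code from this project: the metadata shape (`archive_id`, `uploaded_at` per DynamoDB item) is an assumption, and the actual deletion would be a separate boto3 call such as `glacier.delete_archive(vaultName=..., archiveId=...)`.

```python
from datetime import datetime, timedelta, timezone

def archives_to_delete(items, max_age_days):
    """Return the IDs of archives uploaded more than max_age_days ago.

    `items` is assumed to be a list of dicts as they might come back
    from a DynamoDB scan, each with an 'archive_id' and an ISO-8601
    'uploaded_at' timestamp.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    return [
        item["archive_id"]
        for item in items
        if datetime.fromisoformat(item["uploaded_at"]) < cutoff
    ]

if __name__ == "__main__":
    items = [
        # One old archive and one fresh one (hypothetical sample data).
        {"archive_id": "a1", "uploaded_at": "2020-01-01T00:00:00+00:00"},
        {"archive_id": "a2",
         "uploaded_at": datetime.now(timezone.utc).isoformat()},
    ]
    # With X = 90 days, only the 2020 archive is selected for deletion.
    print(archives_to_delete(items, 90))
```

In a real deployment the 90-day threshold matters because Glacier charges a pro-rated fee for archives deleted within 90 days of upload, which is presumably why X > 90 was suggested above.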