Back up data from websites such as Facebook or Google Takeout. This can be extremely useful if you have a data cap from your ISP but still wish to back up your data.
Steps to use with Terraform:

- Wait for Google Takeout or Facebook downloads to finish processing
- ** Create your GPG public key
- Get the download link for each file you wish to download
- Create the required JSON file and save it as tf/data/data.json
- Follow the steps in the terraform doc

** Skip this step and leave the gpg_email value blank when running the script to avoid encryption before uploading
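The exact schema of the JSON file is defined by the script itself and is not shown above; a plausible sketch of tf/data/data.json, assuming it simply maps target file names to download links (all file names and URLs below are placeholders), might look like:

```json
{
  "takeout-part-1.tgz": "https://example.com/download/takeout-part-1",
  "takeout-part-2.tgz": "https://example.com/download/takeout-part-2"
}
```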
Steps to use:

- Wait for Google Takeout or Facebook downloads to finish processing
- * Create the AWS EC2 instance
- * Initialize the hard drive
- ** Create and import your GPG public key into the new EC2 instance
- Get the download link for each file you wish to download
- Using the links found in the previous step, create the required JSON file
- Run the script
- Destroy or shut down the instance

* Can be skipped if starting from a previously shut-down EC2 instance

** Skip this step and leave the gpg_email value blank when running the script to avoid encryption before uploading
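The script itself is not reproduced here, but the core idea — stream each download through optional GPG encryption straight into S3 Deep Archive, without landing on your home connection — can be sketched in Python. Everything below is illustrative, not the actual script's API: the bucket name is a placeholder, the JSON shape is assumed to map file names to URLs, and the curl, gpg, and aws CLIs are assumed to be installed on the instance.

```python
import json
import shlex
import subprocess

# Hypothetical bucket name; replace with your own Deep Archive bucket.
BUCKET = "my-deep-archive-bucket"

def build_pipeline(url, name, gpg_email=""):
    """Build the shell pipeline for one file: download, optionally
    encrypt, and upload directly to S3 with the DEEP_ARCHIVE class."""
    download = f"curl -L {shlex.quote(url)}"
    upload = (f"aws s3 cp - s3://{BUCKET}/{shlex.quote(name)} "
              "--storage-class DEEP_ARCHIVE")
    if gpg_email:  # leaving gpg_email blank skips encryption entirely
        encrypt = f"gpg --encrypt --recipient {shlex.quote(gpg_email)}"
        return f"{download} | {encrypt} | {upload}"
    return f"{download} | {upload}"

def run(data_path="tf/data/data.json", gpg_email=""):
    """Read the JSON file and process every link in it."""
    with open(data_path) as f:
        files = json.load(f)  # assumed shape: {"name.tgz": "https://..."}
    for name, url in files.items():
        subprocess.run(build_pipeline(url, name, gpg_email),
                       shell=True, check=True)
```

Because the data is piped, nothing is written to the instance's disk except what aws s3 cp buffers internally, which is why the hard drive only needs minimal initialization.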
That's it! The script will go through all the links and download each file to your Deep Archive S3 bucket.
Taken and edited from https://gunargessner.com/takeout