JacobFrericks/backup_data

Backup Data

Backup data from websites such as Facebook or Google Takeout. This can be extremely useful if you have a data cap from your ISP but still wish to back up your data.

Easy/Automatic (using Terraform)

  1. Wait for Google Takeout or Facebook Downloads to finish processing
  2. ** Create your GPG public key
  3. Get the download link for each file you wish to download
  4. Create the required JSON file and save it as tf/data/data.json
  5. Follow the steps in the Terraform doc

** To upload without encryption, skip this step and leave the gpg_email value blank when running the script
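The exact schema for tf/data/data.json is defined by the repo's Terraform doc; the field names below are purely hypothetical placeholders, sketching only the general shape of a file that pairs each download link with a target name:

```shell
# Hypothetical shape only -- the real schema is defined in the repo's
# Terraform doc. Each entry pairs a download link with a target file name.
mkdir -p tf/data
cat > tf/data/data.json <<'EOF'
{
  "files": [
    {
      "name": "takeout-001.tgz",
      "url": "https://takeout.google.com/..."
    }
  ]
}
EOF
```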

Hard/Manual

Steps to use:

  1. Wait for Google Takeout or Facebook Downloads to finish processing
  2. * Create the AWS instance
  3. * Initialize the hard drive
  4. ** Create and import your GPG public key into the new EC2 virtual machine
  5. Get the download link for each file you wish to download
  6. Using the links found in the previous step, create the required json file
  7. Run the script
  8. Destroy or shut down the instance

* Can be skipped if starting from a previously shut-down EC2 instance

** To upload without encryption, skip this step and leave the gpg_email value blank when running the script
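For step 4, one way to create the key and get its public half onto the instance is sketched below (the identity and host are placeholders, not values from this repo; the non-interactive flags require GnuPG 2.1+):

```shell
# Use an isolated keyring so this sketch doesn't touch an existing one.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a key pair non-interactively; the identity is a placeholder.
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key "Backup Key <you@example.com>" default default never

# Export the public key in ASCII-armored form. Copy it to the instance and
# import it there, e.g.:
#   scp backup_pubkey.asc ec2-user@<host>:~/
#   ssh ec2-user@<host> 'gpg --import backup_pubkey.asc'
gpg --armor --export you@example.com > backup_pubkey.asc
```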

That's it! The script will go through all links and download each file to your Deep Archive S3 bucket.
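The repo's script is the source of truth for how the transfer works; as a rough sketch of the core idea, a single link can be streamed straight into a Deep Archive bucket without touching local disk (the bucket name below is a placeholder):

```shell
# Sketch of the idea, not the repo's actual script: stream one download
# straight into S3 Deep Archive. The bucket name is a placeholder.
cat > stream_to_s3.sh <<'EOF'
#!/bin/sh
# Usage: ./stream_to_s3.sh <download-url> <object-key>
curl -fL "$1" | aws s3 cp - "s3://my-backup-bucket/$2" --storage-class DEEP_ARCHIVE
EOF
chmod +x stream_to_s3.sh
```

`aws s3 cp` accepts `-` to read the object body from stdin, which is what makes the pipe work without a local copy.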

Taken and edited from https://gunargessner.com/takeout
