Jamon Holmgren

Backing up Google Photos to Amazon Glacier

January 3, 2024

I have a LOT of photos in Google Photos.

My wife and I took a few digital photos (mixed with regular film photos) when we started dating in 2002ish, but we didn't really start taking photos in earnest until 2005, when our son Cedric was born.

[Photo: one of the earliest digital photos of me, standing on a ridge in Lava Canyon in southwest Washington state in 2002.]

Since that time, we've taken thousands and thousands of photos and videos, amounting to just under a terabyte of data. Initially they were all uploaded to Google Picasa Web, but then that was migrated to Google Photos.

After deliberating about this for quite some time, I finally decided to back up our entire archive. I chose Amazon Glacier (specifically, the S3 Glacier Deep Archive storage class) because it's very cheap long-term storage.
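To put a rough number on "very cheap" (double-check current AWS pricing rather than trusting my math): Deep Archive storage has been around $0.00099 per GB per month in the main US regions, so 1 TB ≈ 1,000 GB × $0.00099 ≈ $1 per month to store our whole library. Retrieval is where the extra costs show up, which is the tradeoff you accept for archival storage.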

Downloading the archive

I started by buying a 2 TB Crucial external drive that I could connect to my Mac's Thunderbolt/USB-C port. Having an external drive served two purposes: one, I wouldn't blow up my Mac's hard drive downloading all these photos and videos, and two, I'd have another backup -- this one local.

I then went to Google Takeout. (Make sure you're in the right Google account if you're signed into multiple!) In the "Select data to include" section, I clicked the "Deselect all" button first, then scrolled down to Google Photos and checked the box next to it. Then I scrolled ALL the way to the bottom and clicked "Next step".

In the "Choose file type, frequency & destination" section, I chose the "Send download link to email" option. It would be amazing if they had a way to choose an Amazon S3 bucket (or better yet, Glacier itself), but they only support Drive, Dropbox, OneDrive, and Box as of this date. I chose the "Export once" option, .zip, and for file size I chose 10 GB. (I experimented with 50 GB but that was tough to download and upload effectively.)

After that, I waited a few days for Google Takeout to send me a link.

The link brought me to a page where I could download the ZIP exports one by one ... about 85 of them. I downloaded two or three at a time, putting them on the new external drive I bought. It made me log in nearly every time, which was annoying. Also, you only have about a week to download them, and with how many I needed to download, I cut it kinda close.
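One optional thing I'd suggest while the downloads pile up (this is my own sanity check, not anything Google tells you to do): test each zip before you treat it as a backup. Assuming your exports follow the takeout-*.zip naming you'll see in step 8 below, something like this should work on macOS:

    cd "/Volumes/Crucial X8/Backups/JamonAndChyra-GooglePhotos"
    # unzip -t tests each archive without extracting it; anything that prints
    # PROBLEM should be re-downloaded from Takeout before the link expires
    for f in takeout-*.zip; do
      unzip -tqq "$f" > /dev/null 2>&1 && echo "OK: $f" || echo "PROBLEM: $f"
    done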

While you are downloading, you can prepare for uploading with the following instructions.

Uploading to AWS Glacier

I already have an AWS account, but if you don't, sign up for one. I won't walk you through that; if you're not able to sign up, then this is probably too technical for you.

Here are the steps I took to create the credentials and the bucket:

  1. Log into the AWS Console as a "root user"
  2. Go to the IAM security credentials section (you can choose a region in the top right, but I just left it as "Global" for this section)
  3. Create an access key and secret there, and copy them somewhere safe.
  4. Install AWS's CLI (these instructions are for macOS): brew install awscli
  5. Log in using the access key and secret: aws configure
  6. Change directories into wherever you downloaded your backups. For me, it was in an external volume: cd "/Volumes/Crucial X8/Backups/JamonAndChyra-GooglePhotos"
  7. Create an S3 bucket in the region of your choice (the "Glacier" part comes from the storage class we'll use when uploading, not from the bucket itself): aws s3 mb s3://bucketnamehere --region us-west-2
  8. When your zip files are done downloading, you can upload them either all at once like this:

     aws s3 cp . s3://bucketnamehere/ --recursive --exclude "*" --include "takeout-*.zip" --storage-class DEEP_ARCHIVE

     ...or one at a time like this:

     aws s3 cp . s3://bucketnamehere/ --recursive --exclude "*" --include "takeout-*-001.zip" --storage-class DEEP_ARCHIVE

     ...or in blocks of 10 like this:

     aws s3 cp . s3://bucketnamehere/ --recursive --exclude "*" --include "takeout-*-00?.zip" --storage-class DEEP_ARCHIVE
     aws s3 cp . s3://bucketnamehere/ --recursive --exclude "*" --include "takeout-*-01?.zip" --storage-class DEEP_ARCHIVE
     aws s3 cp . s3://bucketnamehere/ --recursive --exclude "*" --include "takeout-*-02?.zip" --storage-class DEEP_ARCHIVE

     (Or see the script sketch just after this list for a version you can re-run as downloads finish.)

This part is the most painstaking.
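If you'd rather not babysit the commands in step 8, here's a minimal bash sketch of what I mean by re-runnable (my own wrapper, not anything official): it assumes the same bucketnamehere bucket and takeout-*.zip naming as above, skips anything that's already in the bucket, and can be re-run whenever another batch of downloads finishes.

    cd "/Volumes/Crucial X8/Backups/JamonAndChyra-GooglePhotos"
    for f in takeout-*.zip; do
      # Skip files that are already in the bucket so re-running is safe
      if aws s3 ls "s3://bucketnamehere/$f" > /dev/null 2>&1; then
        echo "Already uploaded: $f"
      else
        echo "Uploading: $f"
        aws s3 cp "$f" "s3://bucketnamehere/$f" --storage-class DEEP_ARCHIVE
      fi
    done

    # Sanity check when you think you're done: this count should match your local zips
    aws s3 ls s3://bucketnamehere/ | wc -l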

Restoring the backup

I haven't had to restore from a backup yet. One thing to know ahead of time: you can't just aws s3 cp an object straight back out of Deep Archive. You first have to ask S3 to restore it to a readable state, wait for that restore to finish (roughly 12 hours for the Standard retrieval tier, up to about 48 hours for Bulk), and then download it. A rough sketch of what that might look like is below.
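This part is untested on my end since I haven't needed it, but based on the AWS CLI docs it would go something like this, with your-backup-file.zip standing in for whichever Takeout zip you want back:

    # 1. Ask S3 to restore the object from Deep Archive and keep the restored copy around for 7 days
    #    (Bulk is the cheapest retrieval tier; Standard is faster but costs more)
    aws s3api restore-object --bucket bucketnamehere --key your-backup-file.zip --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Bulk"}}'

    # 2. Check on the restore; the "Restore" field stops saying ongoing-request="true" when it's ready
    aws s3api head-object --bucket bucketnamehere --key your-backup-file.zip

    # 3. Once it's ready, download it like a normal S3 object
    aws s3 cp s3://bucketnamehere/your-backup-file.zip .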

Good luck!