2014.... so, I can't really remember much of what happened that year beyond breaking up with my girlfriend. I'm not going to be able to shed all that much light on what's going on there, but to be honest I'd not be all that surprised if the Dropbox daemon just died/borked/stopped working or something. I honestly wouldn't try to repair it; it might not work for large amounts of data.

Oddly enough I was kinda doing something similar today with a cronjob, actually the first database backup script I've written here (things move slower). What I'd suggest is that rather than caring about the piece-of-shit Dropbox software, we just dump it into Amazon S3, because it's pretty cheap.

All that's needed is to do a mysqldump into a file, compress it with gzip or something, and then upload it quite simply with the AWS CLI tools. The script to run in cron might look something like the one further down, but first there's a bit of setup.

First, install python pip:

sudo apt-get update && sudo apt-get install python-pip -y

then install awscli to be able to upload shit:

sudo pip install awscli
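
Quick sanity check that the CLI actually installed (this is just the standard awscli version flag, nothing specific to this setup):

aws --version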

And then go and create an S3 bucket for uploading stuff via the Amazon console or Terraform or CloudFormation or whatever.
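
If you'd rather do it from the CLI, awscli has a make-bucket command; the bucket name here is just a placeholder (it has to be globally unique, so pick your own):

# create the bucket the backups will land in
aws s3 mb s3://my-backup-s3-bucket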

Once you've got a bucket, you'll need to give the ec2 instance permission to access it. This is possibly the first of the two tricky bits:

  • The better, more secure and saner way is to re-launch the instance with the IAM policy assigned to the ec2-instance role, like this.
  • The probably-more-feasible-without-doing-heaps-of-work way is to fairly insecurely add the credentials to the machine and just allow access to S3, something like this (there's a sketch after this list). Both approaches will work, but if you choose this approach you must make sure you don't give access to everything; constrain it to only S3 access, because there's a high risk that these credentials leak and you end up with a huge (multi-thousand-dollar-per-hour) bill.
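
For the second option, a minimal sketch of dropping credentials onto the box with aws configure set (the key values and region here are placeholders; the key should belong to an IAM user whose policy only allows S3 access):

# Don't use root/admin keys here, only a key for an S3-restricted IAM user
aws configure set aws_access_key_id AKIAXXXXXXXXXXXXXXXX
aws configure set aws_secret_access_key <your-secret-key>
aws configure set region ap-southeast-2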

Either way, just give me a call anytime (lunchtime is easiest) and I'll happily walk you through any bits which are tricky (though I'll be out of the country for the next two weeks after Friday).

So after all that you should be able to upload stuff like this:

# to s3 -> 
aws s3 cp some-local-file s3://my-s3-bucket/bucket-folder/file

# <- from s3
aws s3 cp s3://my-s3-bucket/bucket-folder/file some-local-file
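
To double-check that a file actually landed in the bucket, you can list the prefix (same placeholder bucket/folder names as above):

# list what's under the folder
aws s3 ls s3://my-s3-bucket/bucket-folder/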

Once that's all done, uploads should be ready. Anyway, permissions aside, the second tricky bit is writing the backup script and getting it to work in cron. Cron's annoying, but there are heaps of syntax references online.

Stick something like this in a bash script maybe?

#!/bin/bash -e
cd "$(dirname "$0")" # Angry directory hax to make cwd sane

mysqldump --some-flags-I-can-remember > shiz.sql
gzip shiz.sql # compress the sql (produces shiz.sql.gz)

# Copy to s3 and clean up:
todaysDate=$(date +%Y%m%dT%H%M)
aws s3 cp shiz.sql.gz s3://my-backup-s3-bucket/backups/$todaysDate/shiz.sql.gz
rm -f *.sql *.sql.gz # clean up shit

And once you've got that working you'll want to add your bash script to cron if it isn't already. To edit your crontab:

crontab -e

And then you should be able to add a line which runs your backup script whenever you like.
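
For example, a line like this would run it at 2am every day; the paths here are made up, so point them at wherever your script and log actually live:

# m h dom mon dow  command
0 2 * * * /home/ubuntu/backup-to-s3.sh >> /home/ubuntu/backup.log 2>&1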
