Backup Neo4j Database to AWS S3


Update (2014-07-28)

I ran into issues using tar with large folders. I fixed it by using 7zip with the LZMA2 compression algorithm instead of LZO.

A new Gist is available here. Please feel free to improve it!
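
As a rough sketch of the change (paths are taken from the cron example below, the graph.db folder name is just an example, and p7zip is assumed to be installed), the archiving and compression steps become something like:

# Archive and compress in one pass with 7zip using LZMA2, replacing tar + lzop.
# -m0=lzma2 selects the LZMA2 method, -mx=3 favours speed over compression ratio.
7z a -m0=lzma2 -mx=3 /mnt/datadisk/backup/neo4j-backup.7z /mnt/datadisk/backup/graph.db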

Purpose

I needed to back up a Neo4j database to AWS S3 daily via a cron task, so I developed a shell script to do the job.

Script

Steps

  1. neo4j-backup makes a backup of the database to the given local target folder. Make sure enough local disk space is available. Binary: neo4j-backup

  2. tar archives all the files into a single one. No gzip or bzip2 compression here, since it was too slow for my file (> 100 GB). Binary: tar

  3. lzop is a very fast compression tool that compresses the file in a few minutes, reducing the size of the upload to AWS S3. Binary: lzop

  4. aws s3 cp uploads the file to S3, using Amazon S3 multipart upload when the file is large, which speeds up the transfer. Binary: aws. A sketch combining these four steps follows this list.
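
Putting the four steps together, here is a minimal sketch of what backup_neo4j_to_s3.sh can look like. The Neo4j path and arguments match the cron line below; the bucket name my-backup-bucket and the file naming are placeholders, and the aws CLI is assumed to be already configured with credentials for the user running the job.

#!/bin/sh
# Sketch of backup_neo4j_to_s3.sh -- assumes the Neo4j 1.9 backup syntax and
# an already-configured aws CLI; "my-backup-bucket" is a placeholder bucket.
set -e

HOST="$1"        # e.g. 127.0.0.1
PORT="$2"        # e.g. 6362, the default backup service port
TARGET="$3"      # e.g. /mnt/datadisk/backup
TODAY=$(date +%Y-%m-%d)

# 1. Full backup of the running database into a local folder.
/opt/neo4j-enterprise-1.9.8/bin/neo4j-backup -full \
  -from "single://$HOST:$PORT" -to "$TARGET/graph.db"

# 2. Archive the backup folder into a single file, without compression.
tar -cf "$TARGET/neo4j-$TODAY.tar" -C "$TARGET" graph.db

# 3. Compress it quickly with lzop (-U removes the tar once compressed).
lzop -U "$TARGET/neo4j-$TODAY.tar"

# 4. Upload to S3; the CLI switches to multipart upload for large files.
aws s3 cp "$TARGET/neo4j-$TODAY.tar.lzo" "s3://my-backup-bucket/neo4j/"

# Remove the local compressed archive once the upload has succeeded.
rm -f "$TARGET/neo4j-$TODAY.tar.lzo"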

Cron task

Add a file into /etc/cron.d/neo4j-backup with:

# Run a daily backup at 4:00 AM.
0 4 * * * root /bin/sh /opt/neo4j-enterprise-1.9.8/backup_neo4j_to_s3.sh 127.0.0.1 6362 /mnt/datadisk/backup

Gist

Available here. Updated gist (2014-07-28) available here. Please feel free to improve it!
