Backup Neo4j Database to AWS S3
Update (2014-07-28)
I ran into issues using tar with large folders. I fixed it by using 7zip with the LZMA2 compression algorithm instead of LZO.
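As an illustration, a 7zip invocation with LZMA2 might look like the following; the archive name and source folder are placeholders, not taken from the gist:

```bash
# Hypothetical example: compress the backup folder with LZMA2
# -m0=lzma2 selects the LZMA2 method, -mx=3 favors speed, -mmt=on enables multithreading
7z a -t7z -m0=lzma2 -mx=3 -mmt=on neo4j-backup.7z /var/backups/neo4j/
```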
New gist available here. Please feel free to improve it!
Purpose
I needed to back up a Neo4j database to AWS S3 daily via a cron task, so I wrote a shell script to do the job.
Script
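The full script is in the gist linked below; here is a minimal sketch of the pipeline it follows (detailed under Steps), where the paths, bucket name, and Neo4j install directory are assumptions:

```bash
#!/bin/bash
# Minimal sketch of the backup pipeline: backup -> tar -> lzop -> S3.
# Paths, bucket name, and Neo4j install directory are placeholders.
set -e

NEO4J_HOME="/opt/neo4j"            # hypothetical Neo4j install dir
BACKUP_DIR="/var/backups/neo4j"    # local target folder (needs enough free space)
S3_BUCKET="s3://my-backup-bucket"  # hypothetical bucket
STAMP=$(date +%Y-%m-%d)

# 1. Back up the running database to a local target folder
"$NEO4J_HOME/bin/neo4j-backup" -from single://localhost -to "$BACKUP_DIR/$STAMP"

# 2. Archive all the files into one (no gzip/bzip2: too slow for very large backups)
tar -cf "$BACKUP_DIR/$STAMP.tar" -C "$BACKUP_DIR" "$STAMP"

# 3. Compress quickly with lzop (produces $STAMP.tar.lzo alongside the tar)
lzop "$BACKUP_DIR/$STAMP.tar"

# 4. Upload to S3 (aws s3 cp switches to multipart upload for big files)
aws s3 cp "$BACKUP_DIR/$STAMP.tar.lzo" "$S3_BUCKET/neo4j/$STAMP.tar.lzo"
```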
Steps
1. neo4j-backup: backs up the database to a given local target folder. Make sure you have enough free space locally. Binary: neo4j-backup
2. tar: archives all the files into one. No gzip or bzip2 compression here, since it was too slow for my files (> 100 GB). Binary: tar
3. lzop: a very fast compression algorithm that compresses the file in a few minutes and reduces the size to upload to AWS S3. Binary: lzop
4. aws s3 cp: uploads the file to S3, using Amazon S3 multipart upload when the file is big, which makes the upload faster. Binary: aws
Cron task
Create a file at /etc/cron.d/neo4j-backup with:
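For example, assuming the script is installed at /opt/scripts/neo4j-backup.sh (a hypothetical path) and should run every night at 03:00; note that files in /etc/cron.d take a user field:

```bash
# Run the backup script nightly at 03:00 as root (script path is a placeholder)
0 3 * * * root /opt/scripts/neo4j-backup.sh
```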
Gist
Available here. Updated gist (2014-07-28) available here. Please feel free to improve it!