I ran into some issues using tar with large folders. I fixed it by switching to 7zip with the LZMA2 compression algorithm instead of LZO. New gist available here. Please feel free to improve it!
I needed to back up a Neo4j database to AWS S3 daily via a cron task, so I wrote a shell script to do the job.
neo4j-backup backs up the database to a given local target folder. Make sure you have enough free space locally.
Binary: neo4j-backup
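The invocation can be sketched like this (the host, port, and target path are assumptions, and the flags follow the Neo4j 1.9 online-backup tool; this sketch only prints the command rather than running it):

```shell
# Sketch only: build the neo4j-backup command for a full backup of a running
# (non-HA) instance. Host, port, and target folder are illustrative values.
HOST=127.0.0.1
PORT=6362                                   # default Neo4j online-backup port
TARGET=/mnt/datadisk/backup/graph.db

CMD="neo4j-backup -from single://$HOST:$PORT -to $TARGET"
echo "$CMD"                                 # the real script would execute this
```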
tar archives all the files into one. No gzip or bzip2 compression here, since they were too slow for my data (> 100 GB).
Binary: tar
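The archiving step looks like this (paths are illustrative; a temporary folder stands in for the backup folder):

```shell
# Archive a folder into a single uncompressed tar file; compression is left
# to the separate, much faster lzop step.
SRC=$(mktemp -d)                            # stand-in for the local backup folder
echo "data" > "$SRC/neostore"               # dummy store file

tar -cf /tmp/neo4j-backup.tar -C "$SRC" .   # -c create, -f archive name, -C enter dir first
tar -tf /tmp/neo4j-backup.tar               # list contents to sanity-check the archive
```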
lzop is a very fast compression tool that compresses the file in a few minutes and shrinks the upload to AWS S3.
Binary: lzop
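A minimal sketch of the compression step (not executed here; the file path is an assumption, and using `-U` to delete the input after compression is my choice, not necessarily the gist's):

```shell
# Sketch: compress the tar archive with lzop. -U removes the input file once
# compression succeeds, which saves disk space on the backup volume.
ARCHIVE=/mnt/datadisk/backup/neo4j-backup.tar
CMD="lzop -U $ARCHIVE"                      # would produce neo4j-backup.tar.lzo
echo "$CMD"
```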
aws s3 cp uploads our file to S3, using Amazon S3 Multipart Upload when the file is large, which makes the upload faster.
Binary: aws
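The upload step can be sketched as follows (not executed here; the bucket name and key are hypothetical placeholders — the AWS CLI switches to multipart upload automatically for large files):

```shell
# Sketch: upload the compressed archive to S3. Bucket and key are placeholders.
FILE=/mnt/datadisk/backup/neo4j-backup.tar.lzo
BUCKET=my-backup-bucket                     # assumption: use your own bucket
CMD="aws s3 cp $FILE s3://$BUCKET/neo4j-backup.tar.lzo"
echo "$CMD"
```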
Add a cron file containing:
# Run a daily backup at 4:00 AM.
0 4 * * * root /bin/sh /opt/neo4j-enterprise-1.9.8/backup_neo4j_to_s3.sh 127.0.0.1 6362 /mnt/datadisk/backup
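The script that cron invokes can be sketched as a dry run chaining the four steps. Everything below only prints the commands; the argument order mirrors the cron line (host, backup port, local folder), while the bucket and file names are assumptions, not the gist's actual values:

```shell
#!/bin/sh
# Dry-run sketch of backup_neo4j_to_s3.sh: print the four backup commands
# instead of running them. $1 = host, $2 = backup port, $3 = local folder.
HOST=${1:-127.0.0.1}
PORT=${2:-6362}
DIR=${3:-/mnt/datadisk/backup}
STAMP=$(date +%Y-%m-%d)                     # date-stamp the archive name

echo "neo4j-backup -from single://$HOST:$PORT -to $DIR/graph.db"
echo "tar -cf $DIR/neo4j-$STAMP.tar -C $DIR graph.db"
echo "lzop -U $DIR/neo4j-$STAMP.tar"
echo "aws s3 cp $DIR/neo4j-$STAMP.tar.lzo s3://my-backup-bucket/neo4j-$STAMP.tar.lzo"
```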
Available here. Please feel free to improve it!
Updated gist (2014-07-28) available here. Please feel free to improve it!