So without getting into too much detail this is basically what I did today:
1) I stopped all weekly backups from the WHM "Backup Configuration" page.
2) I deleted all weekly and monthly backup files from the server by SSHing in, going to the /backup/cpbackup/weekly directory, and deleting the big tar file (a scripted version of this cleanup is sketched right after the hook script below).
3) At the bottom of that same WHM "Backup Configuration" page I enabled the "/scripts/postcpbackup" option and wrote the contents of that script, which you'll find below:
#!/usr/bin/perl
use strict;
use warnings;

# Paths to the daily cPanel backup and the S3 upload script
my $directory     = '/backup/cpbackup/daily/';
my $filename      = 'mybackup.tar';
my $awsDir        = '/home4/myuser/phpbackupdir/';
my $awsScriptPath = '/home4/myuser/phpbackupdir/s3_backup.php';

# Compress the freshly created daily backup in place
chdir $directory;
`gzip -f $directory$filename`;

# Hand the compressed file off to the PHP script that uploads it to Amazon S3
chdir $awsDir;
`php5 $awsScriptPath`;
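If you'd rather script step 2 than delete the tarballs by hand over SSH, here is a minimal Perl sketch. The weekly path is the one mentioned above; the monthly directory name is my assumption, so adjust it to whatever your server actually uses.

#!/usr/bin/perl
use strict;
use warnings;

# One-off cleanup: remove the old weekly/monthly backup tarballs.
# /backup/cpbackup/weekly comes from step 2; the monthly path is assumed.
unlink glob('/backup/cpbackup/weekly/*.tar');
unlink glob('/backup/cpbackup/monthly/*.tar');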
The hook script basically compresses the backup tar right after cPanel creates it. This alone will save you a huge amount of space.
Then it calls a PHP script that I downloaded from this site, which uploads the resulting compressed file to Amazon S3. Admittedly, I hacked that script a bit to comment out all the archiving, since my Bluehost account already does that for me.
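If you'd rather skip the PHP dependency and keep everything in Perl, a minimal upload sketch using the Net::Amazon::S3 module from CPAN might look something like the following. The bucket name, key, and credentials are placeholders, and this is a rough outline rather than the script I actually use, so check the module's docs before relying on it.

#!/usr/bin/perl
use strict;
use warnings;
use Net::Amazon::S3;   # CPAN module; install it first

# Placeholder credentials and bucket -- replace with your own
my $s3 = Net::Amazon::S3->new({
    aws_access_key_id     => 'YOUR_ACCESS_KEY',
    aws_secret_access_key => 'YOUR_SECRET_KEY',
    retry                 => 1,
});

my $bucket = $s3->bucket('my-backup-bucket');

# Upload the compressed daily backup under a date-stamped key
my @t   = localtime;
my $key = sprintf('cpbackup/%04d-%02d-%02d-mybackup.tar.gz', $t[5] + 1900, $t[4] + 1, $t[3]);

$bucket->add_key_filename($key, '/backup/cpbackup/daily/mybackup.tar.gz',
    { content_type => 'application/gzip' })
    or die 'S3 upload failed: ' . $s3->err . ': ' . $s3->errstr;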
What I love about this is that the postcpbackup script runs automatically after the backup is complete, so I don't have to call it from a cron job.
I also added a Mandrill call to all this to send me an email when it's done.
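A minimal sketch of that Mandrill piece, using its messages/send.json API, could be appended to the hook script. The API key and addresses below are placeholders, so treat this as a rough outline rather than my exact code.

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;   # needs LWP::Protocol::https for the HTTPS call
use JSON;

my $ua = LWP::UserAgent->new;

# Placeholder key and addresses -- replace with your own
my $payload = {
    key     => 'YOUR_MANDRILL_API_KEY',
    message => {
        from_email => 'backups@example.com',
        to         => [ { email => 'me@example.com' } ],
        subject    => 'Daily backup uploaded to S3',
        text       => 'The compressed daily backup was uploaded successfully.',
    },
};

# Mandrill expects a JSON body POSTed to its messages/send.json endpoint
my $res = $ua->post(
    'https://mandrillapp.com/api/1.0/messages/send.json',
    'Content-Type' => 'application/json',
    Content        => encode_json($payload),
);
warn 'Mandrill notification failed: ' . $res->status_line unless $res->is_success;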
So what do I get with all this?
Well, not only do I get to remove my monthly and weekly backups from the server, saving me about 10GB, I also get to compress the daily file from about 5GB to 1.8GB, which frees up another 3GB.
Not only that, I get to keep 30 days of backups on AWS (so around 60GB), and I can access them anytime, regardless of whether Bluehost is up or not.
What really surprised me was how fast the 1.8GB compressed file was transferred to AWS. It took about 2-3 minutes, which is amazingly fast. That must mean that Bluehost is basically hosted on AWS, or they have a very big pipe for that transfer.
Either way, this 2-hour project is now going to save me $15/mo and provide me with 30 days of continuous full backups.
You can figure out the hacks for yourself in the s3_backup.php script. If you need help, leave a comment below and I can help you figure it out.