Previously I posted a PowerShell script that uses 7-Zip to compress and encrypt a folder and then send it to Amazon S3. The script could have many uses, but in my case it was designed to fulfill the need for off-site storage of compressed database backups. While it did all the heavy lifting of getting the data compressed, encrypted, and uploaded to S3, I left the creation of the database backup folder as an exercise for the reader.
I now have a working system that I would like to share in case anyone was unable to build the backup folder with PowerShell on their own. There were a few hiccups along the way, such as not being able to use the Start-Transcript cmdlet in a SQL Agent Job, and having my msdb data file triple in size after three months of backups (SQL Server logs every backup in msdb's backup history tables), but after working out those issues the script provides a very cost-effective way to create both local and off-site database backups with one-week or one-month retention.

Currently I have it running on a server with about 100 databases ranging in size from 7MB to 700MB, with daily full backups and hourly differential backups sent to Amazon S3 and kept with one-month retention, for about $5 a month. The total raw size is 2.2GB per full backup set and 100-400MB per differential backup set, but the script gets about 10:1 compression, so the daily storage on S3 is about 500MB total for all of the full and differential backups.
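In case anyone runs into the same msdb growth, one common fix is to periodically prune the backup history with the built-in msdb.dbo.sp_delete_backuphistory procedure. Here is a minimal sketch, assuming the Invoke-Sqlcmd cmdlet from the SQL Server tools is available; the server name is a placeholder:

    # Minimal sketch: purge backup/restore history older than 31 days so the
    # msdb data file stops growing. 'YourServer' is a placeholder, and this
    # assumes Invoke-Sqlcmd (SQLPS / SqlServer module) is installed.
    $oldest = (Get-Date).AddDays(-31).ToString('yyyy-MM-dd')
    Invoke-Sqlcmd -ServerInstance 'YourServer' -Database 'msdb' `
        -Query "EXEC dbo.sp_delete_backuphistory @oldest_date = '$oldest'"

Scheduling something like that as its own weekly job keeps msdb at a steady size.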
You will need to modify the settings in both the psBackupAllDBtoAmazonS3.ps1 and ps7ZiptoAmazonS3.ps1 files to get this working on your server, but once you do, you can sit back and let the scripts do their magic. You can set them up as a scheduled task, but we chose to run them as a SQL Agent Job so that our server monitoring software would notify us if the script ran into any issues.
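If you go the SQL Agent route, the job step just needs to launch the script. Something along these lines works as the command for a CmdExec job step; the script path here is a placeholder, so point it at wherever you saved the files:

    # Example command for a SQL Agent CmdExec job step. The path is a
    # placeholder for your own copy of the script; -NoProfile keeps the
    # agent from loading user profiles on each run.
    powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\psBackupAllDBtoAmazonS3.ps1"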
3 comments:
Thank you for mentioning CloudBerry Explorer on your blog again!
Hey Greg, these scripts are brilliant. Have you ever thought about using AWS as a net-accessible file share for a log-shipping type of setup? I've looked into putting your scripts in place for a production database backup, but I'm also looking at log shipping techniques across different networks (I work in NZ, but we use servers hosted in US cloud infrastructure).
I haven't worked with log shipping much, but you could change the TSQL string on line 063 to perform a backup of the active log file instead of a differential database backup, and the resulting files should be usable for log shipping. I'm not sure how well it would work, as the compression time might limit how quickly you could ship the logs. My guess is this would work for 30-45 minute intervals, but for anything more frequent you would be better off using a plain SQL Agent job to create and ship the logs.
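To make that concrete, here's a rough sketch of the kind of change I mean; the variable names ($dbName, $backupPath, $serverName) are placeholders, not the actual ones in my script:

    # Rough sketch only; $dbName, $backupPath, and $serverName are
    # placeholders, not the actual variables in my script.
    # What the script does today (differential database backup):
    #   BACKUP DATABASE [...] TO DISK = N'...' WITH DIFFERENTIAL, INIT
    # Log backup for log shipping (database must use the FULL recovery model):
    $tsql = "BACKUP LOG [$dbName] TO DISK = N'$backupPath\$dbName.trn' WITH INIT"
    Invoke-Sqlcmd -ServerInstance $serverName -Query $tsql -QueryTimeout 0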
You would also need another script to download the files, extract them, and apply them to the other server with NORECOVERY so it stays ready for the next log. It should be pretty straightforward to do, and I may in the future create a similar script for testing the full nightly backups created by this one (e.g. deploying them weekly or nightly to a reporting server), but for now I'm busy working on other projects.
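For illustration only, the restore side might boil down to something like this (again, placeholder names, not code from the scripts):

    # Sketch of the restore side: apply a shipped log backup and leave the
    # database in a restoring state so further logs can still be applied.
    # $dbName, $restorePath, and $serverName are placeholders.
    $tsql = "RESTORE LOG [$dbName] FROM DISK = N'$restorePath\$dbName.trn' WITH NORECOVERY"
    Invoke-Sqlcmd -ServerInstance $serverName -Query $tsql -QueryTimeout 0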
Best of luck!