It’s T-SQL Tuesday, the blog party that SQL Server expert Adam Machanic (blog|twitter) started. This month’s episode is hosted by Kenneth Fisher (blog|twitter). The topic: Backup & Recovery
Your backups are my bread & butter
Backup & recovery are a DBA’s bread & butter. I always tell people that if you can’t restore from backup in an emergency, you should at least make sure your resume is up to date; you’ll need it. Part of my approach to making sure I have foolproof backups is making sure my setup is scripted and foolproof to configure. Traditionally, this has been as simple as some T-SQL scripts that create the backup stored procedure and create a few SQL Agent jobs.
The SQL Agent jobs would write to the NAS, use the company standard retention, and everything was set. No muss, no fuss.
In the cloud, there is no NAS, there is just storage
SQL Server 2014 introduced the ability to back up to URL, and put your backups directly in the cloud. Whether your SQL Server instance is on premises or in the cloud, you can back up your database to Azure Blob Storage. Additionally, you can use geographically-redundant storage (GRS) to have your backups available in multiple Azure regions in case of a disaster. In one step, you get your backups off-server, off-site, and multi-site. Magic!
Geographically-redundant storage on the cool access tier currently costs about $0.02/GB per month. That’s a fraction of what it would cost you to have it on your NAS, let alone having multiple geographically redundant copies, and the effort to set up and maintain an off-site copy.
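To put that rate in perspective, here’s a quick back-of-the-envelope estimate. This is just an illustration: the $0.02/GB-month rate is the figure quoted above, and the backup footprint is a made-up example, not a measurement.

```python
# Rough monthly cost estimate for retained backups in cool-tier GRS blob storage.
# Rate comes from the pricing mentioned above; the footprint is hypothetical.
COOL_GRS_RATE_PER_GB = 0.02   # USD per GB per month (assumed from the text)

backup_footprint_gb = 500     # hypothetical: total size of all retained backups

monthly_cost = backup_footprint_gb * COOL_GRS_RATE_PER_GB
print(f"${monthly_cost:.2f}/month for {backup_footprint_gb} GB")  # → $10.00/month for 500 GB
```

Even a half-terabyte of retained backups comes out to pocket change compared to provisioning and maintaining redundant on-premises storage.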
Now that you’re excited to back up to URL, let’s look at configuring the storage, and configuring the backups.
Configuring your Azure storage…using PowerShell
I’m going to assume you have an Azure subscription, and there’s a Resource Group in which you’re going to create your storage. You have rights on that Resource Group so that you can manage the storage setup. You have a SQL Server instance that you are going to back up to this storage. You could do all this work from the Azure Portal, but I prefer to figure it out with PowerShell so I can script it.
First we’ll do this step-by-step, then we’ll stitch the steps together into a script we can reuse.
If you don’t have the AzureRM PowerShell module, you’ll need to install it on your computer (you just need to do this once per machine). If you already have the module installed, you’ll want to import the module (you do this once per PowerShell window).
```powershell
Install-Module AzureRM
Import-Module AzureRM
```
Let’s also declare some variables for some of the names we’re going to give to the objects we create. We’ll use these more later, but let’s declare them as variables up front so that we can be consistent with the names.
```powershell
$resourceGroupName = "AM2_rg"
$serverName = "SqlServer002"
$storageAccountName = "sqlbackups"
$containerName = "sqlserver002bak"
$policyName = "sqlserver002bak_policy"
$storageLocation = "Central US"
```
Now it’s time to log in to Azure. If you have only one Azure subscription, you can skip the -SubscriptionName parameter. This will pop open a window where you will enter your credentials to log in to your Azure account.
```powershell
Add-AzureRmAccount -SubscriptionName "Visual Studio Professional with MSDN"
```
The next step will create your storage account using the New-AzureRmStorageAccount cmdlet. The storage account defines the storage location, redundancy level, access tier (hot/cool), whether encryption is enabled, etc. You can use the Get-AzureRmStorageAccount cmdlet to get the info for an already existing storage account.
```powershell
New-AzureRmStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName -Location $storageLocation `
    -SkuName Standard_GRS -Kind BlobStorage -AccessTier Cool -EnableEncryptionService Blob

Get-AzureRmStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName
```
Now that we have a storage account, we can create a storage container using the New-AzureStorageContainer cmdlet. The storage container is essentially the endpoint for the URL that you will send your backups to. We’ll get the storage account key for the storage account we just created, then use the key to generate a storage context, then finally create the storage container using that storage context.
```powershell
$accountKeys = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value
$container = New-AzureStorageContainer -Context $storageContext -Name $containerName
$blobContainer = $container.CloudBlobContainer

# Output the URL to use for your SQL Server backups
Write-Host $blobContainer.Uri
```
The last step is to set up permissions, both in Azure and SQL Server. On the Azure side, you’ll need to create a shared access policy and a shared access signature for your container. You will then create a credential in SQL Server using the shared access signature and the blob storage URL. This PowerShell will output the CREATE CREDENTIAL statement so that it can be run in SSMS. Using the identity name “Shared Access Signature” on the credential is the magic incantation that tells SQL Server to use this credential when connecting to the specified URL.
```powershell
# Create a Stored Access Policy
$permissions = $blobContainer.GetPermissions()
$policy = New-Object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$policy.SharedAccessStartTime = $(Get-Date).ToUniversalTime().AddMinutes(-5)
$policy.SharedAccessExpiryTime = $(Get-Date).ToUniversalTime().AddYears(10)
$policy.Permissions = "Read,Write,List,Delete"
$permissions.SharedAccessPolicies.Add($policyName, $policy)
$blobContainer.SetPermissions($permissions)

# Get the Shared Access Signature for the policy
$policy = New-Object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$sas = $blobContainer.GetSharedAccessSignature($policy, $policyName)
Write-Host "Shared Access Signature: $($sas.Substring(1))"

# Build the CREATE CREDENTIAL T-SQL statement
$sql = "CREATE CREDENTIAL [$($blobContainer.Uri)] WITH IDENTITY='Shared Access Signature', SECRET='$($sas.Substring(1))'"
Write-Host $sql
```
Use the output from the last step to create this credential on your SQL Server instance. Now you can start backing up to and restoring from URL:
```sql
CREATE CREDENTIAL [https://sqlbackups.blob.core.windows.net/sqlserver002bak]
WITH IDENTITY='Shared Access Signature', SECRET='<SAS value>';

BACKUP DATABASE MyDatabase
TO URL = 'https://sqlbackups.blob.core.windows.net/sqlserver002bak/MyDatabase.bak';

RESTORE DATABASE MyDatabase
FROM URL = 'https://sqlbackups.blob.core.windows.net/sqlserver002bak/MyDatabase.bak';
```
Stitching it together
Now, just take the above PowerShell statements and stitch them into a single script. Save the PowerShell below to a file named Set-AzureSqlBackups.ps1. Using this script, you can configure your Azure storage container & SQL Server credential with just one line of PowerShell:
```powershell
PS C:\scripts\> .\Set-AzureSqlBackups.ps1 -ResourceGroupName "AM2_rg" -ServerName "SqlServer002" -StorageAccountName "sqlbackups"
```
Set-AzureSqlBackups.ps1 script:
```powershell
Param(
    [Parameter(Mandatory=$true)] [string] $resourceGroupName,
    [Parameter(Mandatory=$true)] [string] $serverName,
    [Parameter(Mandatory=$true)] [string] $storageAccountName,
    [string] $subscriptionName = "Visual Studio Professional with MSDN",
    [string] $storageLocation = "Central US"
)

Import-Module AzureRM
Import-Module SqlServer

# The storage container name that we'll create for backups
$containerName = $serverName.ToLower() + 'bu'
# The name of the storage access policy we'll create
$policyName = $containerName.ToLower() + '_policy'

# Log in to your Azure account, if not already logged in
Try {
    Select-AzureRmSubscription -SubscriptionName $subscriptionName -ErrorAction Stop
}
Catch {
    Add-AzureRmAccount -SubscriptionName $subscriptionName
}

# Check if the storage account exists, and create it if necessary
Try {
    Get-AzureRmStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName -ErrorAction Stop
}
Catch {
    New-AzureRmStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName -Location $storageLocation `
        -SkuName Standard_GRS -Kind BlobStorage -AccessTier Cool -EnableEncryptionService Blob
}

# Grab the storage account keys
$accountKeys = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName

# Create a new storage account context using account key 0
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value

# Check if the container exists, and create it if it doesn't
Try {
    $container = Get-AzureStorageContainer -Context $storageContext -Name $containerName -ErrorAction Stop
}
Catch {
    $container = New-AzureStorageContainer -Context $storageContext -Name $containerName
}
$blobContainer = $container.CloudBlobContainer

# Set up a Stored Access Policy and a Shared Access Signature for the new container
$permissions = $blobContainer.GetPermissions()
$policy = New-Object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$policy.SharedAccessStartTime = $(Get-Date).ToUniversalTime().AddMinutes(-5)
$policy.SharedAccessExpiryTime = $(Get-Date).ToUniversalTime().AddYears(10)
$policy.Permissions = "Read,Write,List,Delete"
$permissions.SharedAccessPolicies.Add($policyName, $policy)
$blobContainer.SetPermissions($permissions)

# Get the Shared Access Signature for the policy
$policy = New-Object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$sas = $blobContainer.GetSharedAccessSignature($policy, $policyName)
Write-Host "Shared Access Signature: $($sas.Substring(1))"

# Build the T-SQL to create the credential using the Shared Access Signature
$sql = "CREATE CREDENTIAL [{0}] WITH IDENTITY='Shared Access Signature', SECRET='{1}'" -f $blobContainer.Uri, $sas.Substring(1)

# Create the credential on the server
Invoke-Sqlcmd -ServerInstance $serverName -Query $sql
```