
SQL Server 2016 – TSQL2sDay – Backup to Azure Cloud


T-SQL Tuesday #82 – To the cloud… And beyond!!!

T-SQL Tuesday is a recurring blog party started by Adam Machanic (b | t). Each month a different blog hosts the party, and everyone who wants to can write a post about a specific subject.

This month Jeffrey Verheul is our TSQL2sDay host and the subject is “To the cloud… And beyond!!!”.

My post on "to the cloud… and beyond!!!" will show how to use the Azure cloud as backup storage and how to implement this with SQL Server 2016.

However, I won't write about the general need for SQL Server backups, why backups are so important, or how to create them. Instead, I would like to explain the various ways SQL Server can back up to the Azure cloud and how PowerShell can help us.

An important point I would like to address briefly, because it is sometimes forgotten when designing a solution: working backups are absolutely necessary and very important, and there are several ways to create secure backups and have them available when you need them. Normally, the fastest and easiest solution is a local "backup-to-disk", but what happens to those backups if that disk fails? More failsafe is a "backup-to-share", better known as backup to a network device (where the network share should, of course, be configured as a RAID). But we all know small companies that are not able to host large file servers on which the last ~5 days of SQL Server backups could be stored in addition to everything else. What could be easier than storing these backups on another ("cheaper") storage location such as an Azure Storage Account?

Options for combining SQL Server with Azure

Since SQL Server 2012, Microsoft has provided "SQL Server Backup to URL", which creates the various kinds of SQL Server backups as usual. With this feature, however, the backup is not written to a local disk or file share; it is streamed to a URL in the Azure cloud. The feature was enhanced in SQL Server 2016 with additional options (block blobs, Shared Access Signatures, and striping) to create faster and more reliable backups in the cloud.
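As a rough sketch of what that looks like in T-SQL (storage account, container, and file names are placeholders, and the SAS credential is the one we will create further down in this post), a striped SQL Server 2016 backup to URL might be written like this:

-- Sketch only: assumes a SAS credential named after the container URL already exists
BACKUP DATABASE AdventureWorks2014
TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2014_stripe1.bak',
   URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2014_stripe2.bak'
WITH COMPRESSION,
     MAXTRANSFERSIZE = 4194304, -- 4 MB per transfer to the block blob
     BLOCKSIZE = 65536;         -- these options are part of the 2016 improvements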

With SQL Server 2014, Microsoft introduced "SQL Server Managed Backup to Microsoft Azure". This feature offers the DBA simplified and faster operations because SQL Server decides itself when to create a backup; the DBA no longer has to worry about the backup strategy and backup scripts. Of course, the options still allow the DBA to intervene, but it hardly gets simpler than with these on-board tools. For SQL Server instances running in Azure, this feature is the recommended approach.
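A minimal sketch of how this is switched on for a single database (the container URL and retention period are placeholders; in SQL Server 2016 the procedure lives in msdb's managed_backup schema):

-- Sketch only: enables Managed Backup for one database;
-- SQL Server then schedules full and log backups on its own
USE msdb;
EXEC managed_backup.sp_backup_config_basic
     @enable_backup = 1,
     @database_name = 'AdventureWorks2014',
     @container_url = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>',
     @retention_days = 14;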

New in SQL Server 2016 is the feature "File-Snapshot Backups for Database Files in Azure". It allows a SQL Server 2016 instance that runs in Azure to create a very, very fast backup of its data files, and even to perform a quick restore.
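A hedged sketch of such a backup (it assumes the database's data and log files themselves are already stored as blobs in Azure, which is the precondition for this feature; names are placeholders):

-- Sketch only: FILE_SNAPSHOT requires the database files to live in Azure blob storage
BACKUP DATABASE AdventureWorks2014
TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2014_snapshot.bak'
WITH FILE_SNAPSHOT;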

SQL Server Backup and Restore with Azure Blob Storage

Backup to Azure Blob Storage

What do we need to deploy and connect our SQL Server to an Azure blob storage?

Since PowerShell has become more and more important with SQL Server 2012 and 2014, I would like to contribute a little guide here. First, we have to start with the installation of the Azure module for PowerShell. Instructions and a download link can be found here => https://azure.microsoft.com/de-de/documentation/articles/powershell-install-configure/
Of course, you'll also need an account for the Azure cloud (e.g. the 170-euro trial access), for example as part of a Visual Studio license, the MPN, or as a BizTalk customer.

Create an Access Policy and a Shared Access Signature

#
# This script uses the Azure Resource model and creates a new ARM storage account.
# Modify this script to use an existing ARM or classic storage account
# using the instructions in comments within this script
#
# Define global variables for the script
$prefixName = '' # used as the prefix for the name for various objects
$subscriptionName='' # the name of subscription name you will use
$locationName = '' # the data center region you will use
$storageAccountName= $prefixName + 'storage' # the storage account name you will create or use
$containerName= $prefixName + 'container' # the storage container name to which you will attach the SAS policy with its SAS token
$policyName = $prefixName + 'policy' # the name of the SAS policy

#
# Using Azure Resource Manager deployment model
# Comment out this entire section and use the classic storage account name to use an existing classic storage account
#

# Set a variable for the name of the resource group you will create or use
$resourceGroupName=$prefixName + 'rg'

# adds an authenticated Azure account for use in the session
Login-AzureRmAccount

# set the tenant, subscription and environment for use in the rest of the script
Set-AzureRmContext -SubscriptionName $subscriptionName

# create a new resource group - comment out this line to use an existing resource group
New-AzureRmResourceGroup -Name $resourceGroupName -Location $locationName

# Create a new ARM storage account - comment out this line to use an existing ARM storage account
New-AzureRmStorageAccount -Name $storageAccountName -ResourceGroupName $resourceGroupName -Type Standard_RAGRS -Location $locationName

# Get the access keys for the ARM storage account
$accountKeys = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName

# Create a new storage account context using an ARM storage account
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value

#
# Using the Classic deployment model
# Use the following four lines to use an existing classic storage account
#
#Classic storage account name
#Add-AzureAccount
#Select-AzureSubscription -SubscriptionName $subscriptionName #provide an existing classic storage account
#$accountKeys = Get-AzureStorageKey -StorageAccountName $storageAccountName
#$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys.Primary

# The remainder of this script works with either the ARM or classic sections of code above

# Creates a new container in blob storage
$container = New-AzureStorageContainer -Context $storageContext -Name $containerName
$cbc = $container.CloudBlobContainer

# Sets up a Stored Access Policy and a Shared Access Signature for the new container
$permissions = $cbc.GetPermissions();
$policy = new-object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$policy.SharedAccessStartTime = $(Get-Date).ToUniversalTime().AddMinutes(-5)
$policy.SharedAccessExpiryTime = $(Get-Date).ToUniversalTime().AddYears(10)
$policy.Permissions = "Read,Write,List,Delete"
$permissions.SharedAccessPolicies.Add($policyName, $policy)
$cbc.SetPermissions($permissions);

# Gets the Shared Access Signature for the policy
$policy = new-object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$sas = $cbc.GetSharedAccessSignature($policy, $policyName)
Write-Host 'Shared Access Signature= '$($sas.Substring(1))''

# Outputs the Transact SQL to the clipboard and to the screen to create the credential using the Shared Access Signature
Write-Host 'Credential T-SQL'
$tSql = "CREATE CREDENTIAL [{0}] WITH IDENTITY='Shared Access Signature', SECRET='{1}'" -f $cbc.Uri,$sas.Substring(1)
$tSql | clip
Write-Host $tSql

Source: https://msdn.microsoft.com/en-us/library/dn466438.aspx

To use this script, we first need to fill in the variables at the top. The name of the subscription to use, for example, can be listed with:

Get-AzureSubscription | Format-Table

"North Europe" is the best location for Germany; you'll have to choose the country/datacenter that fits you best.

But to be able to select "North Europe" in the first place, we have to log in to our Azure account, where we can display a list of available locations:

Login-AzureRmAccount
Get-AzureRmLocation | Format-Table Location, DisplayName

Location       DisplayName     
--------       -----------     
[...]
southcentralus South Central US
northeurope    North Europe    
westeurope     West Europe     
japanwest      Japan West      
[...]

With these values inserted, executing the script should result in the following message:

Shared Access Signature=  sv=2015-04-05&sr=c&si=backupdemopolicy&sig=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx 
Credential T-SQL
CREATE CREDENTIAL [https://backupdemostorage.blob.core.windows.net/backupdemocontainer] WITH IDENTITY='Shared Access Signature', SECRET='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

This T-SQL statement can now be executed on the SQL Server that is to be backed up. With

SELECT * from sys.credentials

you can check whether your new credential was created successfully, in case you don't trust the message in SQL Server Management Studio. 😉
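Should you ever need to replace the SAS secret, the credential can simply be dropped and recreated – a small sketch, keeping in mind that the credential is named after the container URL it secures:

-- Sketch only: placeholder URL, same naming as the CREATE CREDENTIAL statement above
DROP CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>];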

Now we are able to create our first backup directly to Azure Storage. For this purpose, just run a backup-to-URL statement (replace all values appropriately):

BACKUP DATABASE AdventureWorks2014 TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2014_onprem.bak'
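The same credential also covers the way back. A hedged sketch of restoring that backup (placeholders as above; WITH REPLACE is only needed if the database already exists on the target instance):

-- Sketch only: restores the backup taken above straight from blob storage
RESTORE DATABASE AdventureWorks2014
FROM URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2014_onprem.bak'
WITH REPLACE;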

I was very impressed with the speed of taking a SQL Server backup. Of course, the backup speed depends mostly on the network infrastructure and the connection to the internet – e.g. whether it is a DSL line or a leased line (in my case a DSL line with 100 Mbit down / 50 Mbit up). A ~43 MB transaction log backup took 12.7 seconds in my configuration.

BACKUP LOG successfully processed 13569 pages in 12.696 seconds (8.349 MB/sec).

To validate and compare some data, I also created a second transaction log backup directly on my local disk.

BACKUP LOG successfully processed 13569 pages in 1.746 seconds (60.652 MB/sec).

That is much faster, of course, but also involves a degree of uncertainty. In this post, I just wanted to give a short impression of the new backup features of SQL Server 2016. Each of you has to decide for yourself whether you want (and are able) to use this cool feature, or whether your customers won't let you implement it.
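And whichever target you choose, a backup is only as good as its restorability. As a last small sketch (reusing the placeholder URL from above), the cloud backup can at least be checked without running a full restore:

-- Sketch only: verifies that the backup on blob storage is readable and complete
RESTORE VERIFYONLY
FROM URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2014_onprem.bak';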

Edit:
There is a little mistake in Microsoft's script for creating the blob storage…
Line 38 says:
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys.[0].Value

which should be

$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value

My first attempt at this PowerShell-based implementation of a new SQL Server 2016 feature surprised me a little: it took me only 20 minutes to implement everything and successfully run a backup.



2 Comments

  1. Script is outdated and doesn't work anymore.
    Do you have any news on how to back up from SQL 2016 to Azure Blob (classic) with SAS?

    1. No, actually not. I have always tried this with the Azure Resource Manager model. Where exactly do you run into problems? With executing the PowerShell commands, or with creating the backup? The backup relies on those SAS credentials. I just executed the PowerShell part with ARM to create a new storage account with credentials (but right now I'm not able to test the backup part).
