Pause / Resume / Backup an Azure SQL Database with Powershell – Part 3

Many services in Azure allow certain cost savings through automation, and this can also be achieved with the Platform-as-a-Service offering "Azure SQL Database" – that is what this blog post is all about. It is not quite as simple as with, for example, Azure Analysis Services, because there is actually no pause/resume functionality – this is where the backup comes into play, so that is what I will talk about first.

Azure SQL Database and the Backup topic

Before we dive deeper into Azure SQL Database and its pause/resume functionality, we first have to take a brief look at the topic of backup, which is quite important in this context. A big advantage of Azure SQL Database over a SQL Server (on-premises or IaaS) is that you do not have to explicitly worry about backups, since they are created automatically. Depending on the database size and the number of data changes, Microsoft automatically backs up the databases at regular intervals! That means the costs for extra backup software or even the development of your own services are already gone. Depending on the performance level, there are different backup retention times:

  • At the Basic level the retention time is 7 days.
  • At the Standard level the retention time is 35 days.
  • At the Premium level the retention time is 35 days.

When will my database be backed up? There is an answer to that as well …
The full backup takes place every week (unfortunately) at non-fixed times, differential backups are generally created every few hours and, depending on the data volume, the transaction logs are backed up every 5-10 minutes. A first full backup is created within the first 30 minutes after the database is created. These backups are kept according to the service tier (see above), which also means cost savings: you do not need an extra storage account, because the backup is already included in the price. If you want to keep your backups for more than 35 days, you have the option of activating a "Long-Term Retention Backup". This requires an additional storage account, where the backups are stored in parallel and permanently.

Backup Azure SQL Database
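By the way, you can check in PowerShell how far back a restore is currently possible: the database object returned by Get-AzureRmSqlDatabase carries an earliest restore point. A small sketch, assuming the AzureRM module is loaded and the usual variables ($resourcegroupname, $servername, $databasename) are already set:

# Small sketch: check how far back a point-in-time restore could currently go
$db = Get-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname `
                             -ServerName $servername `
                             -DatabaseName $databasename
Write-Host "Earliest restore point for" $db.DatabaseName "is" $db.EarliestRestoreDate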

Pause and Resume for cost savings

Unfortunately, this functionality is not available with Azure SQL Database … So how can I still achieve cost savings if I want to use this Platform-as-a-Service? Of course, as already explained in another blog post, by changing the database performance level to deal with occurring load peaks. But we would really like to achieve certain cost savings with the migration to the cloud … If a department only works during the day (8am-8pm), then I do not need this database (or these databases) at night … So why can't you simply stop the service, since you only pay while the database is online?
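Changing the performance level is quickly scripted – a minimal sketch with the AzureRM cmdlets (the editions and service objectives below are just example values, adjust them to your environment):

# Scale the database down in the evening to save costs (example values)
Set-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname `
    -ServerName $servername `
    -DatabaseName $databasename `
    -Edition "Basic" `
    -RequestedServiceObjectiveName "Basic"

# Scale it back up in the morning before the department starts working
Set-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname `
    -ServerName $servername `
    -DatabaseName $databasename `
    -Edition "Standard" `
    -RequestedServiceObjectiveName "S2"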

For this scenario – the department needs the database only during the day – there is actually no built-in solution, but there is a workaround: dropping the database in the evening and restoring it the next morning via "Create Database from Backup". Microsoft has implemented this procedure extremely pleasantly, and it does not involve a lot of effort.

# Dropping DB to stop costs
Get-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname -ServerName $servername -DatabaseName $databasename -ev notPresent -ea 0
if ($notPresent) {
    Write-Host $databasename "already deleted" 
} else {
    Remove-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname -ServerName $servername -DatabaseName $databasename
}

Please note that you only delete/remove the database and not the logical server, because the backup history is stored on that logical SQL Server. In order to restore this database, you'll need the deletion date of the dropped database. When you restore the database the following morning, you use exactly this deleted-database backup to perform the restore. In PowerShell, you can combine these activities very easily.

$deleteddatabase = Get-AzureRmSqlDeletedDatabaseBackup -ResourceGroupName $resourcegroupname -ServerName $servername #-DatabaseName $databasename
$deleteddatabase
# Do not continue until the cmdlet returns information about the deleted database.
 
Restore-AzureRmSqlDatabase -FromDeletedDatabaseBackup `
    -ResourceGroupName $resourcegroupname `
    -ServerName $servername `
    -TargetDatabaseName $databasename `
    -ResourceId $deleteddatabase.ResourceID `
    -DeletionDate $deleteddatabase.DeletionDate `
    -Edition "Standard" `
    -ServiceObjectiveName "S0"
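The restore above uses a fixed edition and service objective ("Standard"/"S0"); if you want the database to come back exactly as it was, you could remember its performance level before dropping it and pass those values to the restore. A small, untested sketch of that idea:

# Sketch: remember the performance level before dropping the database ...
$db = Get-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname -ServerName $servername -DatabaseName $databasename
$savedEdition = $db.Edition
$savedSlo     = $db.CurrentServiceObjectiveName

# ... and reuse it the next morning instead of the fixed values:
# -Edition $savedEdition -ServiceObjectiveName $savedSlo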

You can find more sample scripts on backup / restore here => https://docs.microsoft.com/de-de/azure/sql-database/scripts/sql-database-restore-database-powershell

Björn works as a database administrator and Head of Competence for MS SQL and MySQL in Hamburg (Germany). He regularly participates in the PASS regional group meetings and PASS events such as SQLSaturday and SQLGrillen, and he organizes the Azure Meetup group in Hamburg. He is interested in topics like SQL Server, PowerShell and Azure as well as science fiction, snowboarding, baking and cycling.

SQL Server 2016 – TSQL2sDay – Backup to Azure Cloud

T-SQL Tuesday #82 – To the cloud… And beyond!!!

T-SQL Tuesday is a recurring blog party that was started by Adam Machanic (b | t). Each month a blog hosts the party, and everyone who wants to can write a post about a specific subject.

This month Jeffrey Verheul is our TSQL2sDay host and the subject is “To the cloud… And beyond!!!”.

My post about “to the cloud… and beyond!!!” will be about how to use the Azure Cloud as a Backup Storage and how to implement this with SQL Server 2016.

However, I won’t write about the general need for SQL Server backups, why backups are so important and how to create them. I would rather like to explain the various ways a SQL Server can back up into the Azure cloud and how PowerShell might help us.

There is an important objection I briefly want to address, because sometimes you might not think about it when you design a solution … Working backups are absolutely necessary and very important, and there are several ways to create secure backups and have them available when you need them. Normally, the fastest and easiest solution is a local "backup to disk", but what about those backups if that disk fails? More failsafe is a "backup to share", better known as backup to a network device (where the network share should, of course, be configured as a RAID). But we all know a small company which is not able to host large file servers where those last ~5 days of SQL Server backups can be stored (in addition to everything else?!). What could be easier than storing these backups on another ("cheaper") storage box like an Azure Storage Account?

Options of SQL Server in combination with Azure

Since SQL Server 2012 Microsoft provides the possibility of "SQL Server Backup to URL", which creates the various kinds of SQL Server backups as usual. But this way the backup isn't located on a local disk or local file share, it is pushed to a URL within the Azure cloud. This feature has been optimized in SQL Server 2016 with additional options (block blobs, Shared Access Signatures and striping) to create faster and more reliable backups in the cloud.
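Just to illustrate those 2016 options (block blobs via a Shared Access Signature credential, striping over several URLs), here is a rough sketch – the storage account, container and instance name are placeholders, and the WITH options are only examples:

# Sketch: backup to URL striped over two blobs with block-blob friendly options (SQL Server 2016)
# Requires a credential named after the container URL (created further below via SAS)
$backupTsql = @"
BACKUP DATABASE [AdventureWorks2014]
TO  URL = 'https://<storageaccount>.blob.core.windows.net/<container>/AdventureWorks2014_stripe1.bak',
    URL = 'https://<storageaccount>.blob.core.windows.net/<container>/AdventureWorks2014_stripe2.bak'
WITH COMPRESSION, MAXTRANSFERSIZE = 4194304, BLOCKSIZE = 65536;
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Query $backupTsql -QueryTimeout 0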

With SQL Server 2014, Microsoft introduced "SQL Server Managed Backup to Microsoft Azure". This feature gives the DBA a simplified and faster operation, because SQL Server decides itself when it creates a backup. The DBA does not have to worry about the backup strategy and backup scripts. Of course, the options still allow intervention by the DBA, but it is probably even simpler to just rely on the onboard tools of SQL Server. For SQL Server instances running in Azure, using this feature is recommended.
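Enabling managed backup for a single database in SQL Server 2016 looks roughly like the following sketch – I am quoting the procedure and its parameters from memory, so please verify them against the documentation; the container URL and retention are placeholders:

# Rough sketch: enable Managed Backup to Microsoft Azure for one database (SQL Server 2016 syntax)
$managedBackupTsql = @"
EXEC msdb.managed_backup.sp_backup_config_basic
     @enable_backup  = 1,
     @database_name  = 'AdventureWorks2014',
     @container_url  = 'https://<storageaccount>.blob.core.windows.net/<container>',
     @retention_days = 14;
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Query $managedBackupTsql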

New in SQL Server 2016 is the feature "File-Snapshot Backups for Database Files in Azure". It allows SQL Server 2016, with its database files stored in Azure blob storage, to create a very, very fast backup of the data files and even a quick restore.
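For completeness: such a file-snapshot backup is just a normal backup-to-URL with one extra option – roughly like this (all names are placeholders, and the data and log files have to live in Azure blob storage for this to work):

# Sketch: file-snapshot backup (only possible if the database files themselves are stored in Azure blob storage)
$snapshotTsql = @"
BACKUP DATABASE [AdventureWorks2014]
TO URL = 'https://<storageaccount>.blob.core.windows.net/<container>/AdventureWorks2014_snapshot.bak'
WITH FILE_SNAPSHOT;
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Query $snapshotTsql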

SQL Server Backup and Restore with Azure Blob Storage

Backup to Azure Blob Storage

What do we need for deployment and connectivity of our SQL Servers to an Azure Blob storage?

Since PowerShell has become more and more important for SQL Server with the 2012 and 2014 releases, I would like to contribute a little guide here. First, we have to start with the installation of the Azure plugin for PowerShell. Instructions and a download link can be found here => https://azure.microsoft.com/de-de/documentation/articles/powershell-install-configure/
Of course, you'll need an account for the Azure cloud (170 Euro trial access), for example as part of your Visual Studio licensing, an MPN membership, or as a BizTalk customer.

Create an Access Policy and a Shared Access Signature

#
# This script uses the Azure Resource model and creates a new ARM storage account.
# Modify this script to use an existing ARM or classic storage account
# using the instructions in comments within this script
#
# Define global variables for the script
$prefixName = '' # used as the prefix for the name for various objects
$subscriptionName='' # the name of subscription name you will use
$locationName = '' # the data center region you will use
$storageAccountName= $prefixName + 'storage' # the storage account name you will create or use
$containerName= $prefixName + 'container' # the storage container name to which you will attach the SAS policy with its SAS token
$policyName = $prefixName + 'policy' # the name of the SAS policy
 
#
# Using Azure Resource Manager deployment model
# Comment out this entire section and use the classic storage account name to use an existing classic storage account
#
 
# Set a variable for the name of the resource group you will create or use
$resourceGroupName=$prefixName + 'rg'
 
# adds an authenticated Azure account for use in the session
Login-AzureRmAccount
 
# set the tenant, subscription and environment for use in the rest of
Set-AzureRmContext -SubscriptionName $subscriptionName
 
# create a new resource group - comment out this line to use an existing resource group
New-AzureRmResourceGroup -Name $resourceGroupName -Location $locationName
 
# Create a new ARM storage account - comment out this line to use an existing ARM storage account
New-AzureRmStorageAccount -Name $storageAccountName -ResourceGroupName $resourceGroupName -Type Standard_RAGRS -Location $locationName
 
# Get the access keys for the ARM storage account
$accountKeys = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName
 
# Create a new storage account context using an ARM storage account
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value
 
#
# Using the Classic deployment model
# Use the following four lines to use an existing classic storage account
#
#Classic storage account name
#Add-AzureAccount
#Select-AzureSubscription -SubscriptionName $subscriptionName #provide an existing classic storage account
#$accountKeys = Get-AzureStorageKey -StorageAccountName $storageAccountName
#$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys.Primary
 
# The remainder of this script works with either the ARM or classic sections of code above
 
# Creates a new container in blob storage
$container = New-AzureStorageContainer -Context $storageContext -Name $containerName
$cbc = $container.CloudBlobContainer
 
# Sets up a Stored Access Policy and a Shared Access Signature for the new container
$permissions = $cbc.GetPermissions();
$policyName = $policyName
$policy = new-object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$policy.SharedAccessStartTime = $(Get-Date).ToUniversalTime().AddMinutes(-5)
$policy.SharedAccessExpiryTime = $(Get-Date).ToUniversalTime().AddYears(10)
$policy.Permissions = "Read,Write,List,Delete"
$permissions.SharedAccessPolicies.Add($policyName, $policy)
$cbc.SetPermissions($permissions);
 
# Gets the Shared Access Signature for the policy
$policy = new-object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$sas = $cbc.GetSharedAccessSignature($policy, $policyName)
Write-Host 'Shared Access Signature= '$($sas.Substring(1))''
 
# Outputs the Transact SQL to the clipboard and to the screen to create the credential using the Shared Access Signature
Write-Host 'Credential T-SQL'
$tSql = "CREATE CREDENTIAL [{0}] WITH IDENTITY='Shared Access Signature', SECRET='{1}'" -f $cbc.Uri,$sas.Substring(1)
$tSql | clip
Write-Host $tSql

Source: https://msdn.microsoft.com/en-us/library/dn466438.aspx

To use this script, we first have to fill in the information at the top, e.g. the name of our subscription, which we can look up with:

Get-AzureSubscription | Format-Table

“North Europe” is the best location for Germany; you’ll have to choose the country/datacenter that fits you best.

But to be able to select “North Europe” (or any other region), we first have to log into our Azure account, where we can then display a list of available locations.

Login-AzureRmAccount
Get-AzureRmLocation | Format-Table Location, DisplayName
 
Location       DisplayName     
--------       -----------     
[...]
southcentralus South Central US
northeurope    North Europe    
westeurope     West Europe     
japanwest      Japan West      
[...]

With these values inserted into the script, executing it should result in the following message:

Shared Access Signature=  sv=2015-04-05&sr=c&si=backupdemopolicy&sig=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx 
Credential T-SQL
CREATE CREDENTIAL [https://backupdemostorage.blob.core.windows.net/backupdemocontainer] WITH IDENTITY='Shared Access Signature', SECRET='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

This T-SQL statement can now easily be executed on the SQL Server that should be backed up. With

SELECT * FROM sys.credentials

you can check if your new credentials were created successfully if you don’t trust the message in SQL Server Management Studio. 😉

Now we are able to create our first backup directly into Azure Storage. For this purpose, just run a backup-to-URL (replace all values appropriately):

BACKUP DATABASE AdventureWorks2014 TO URL = 'https://backupdemostorage.blob.core.windows.net/backupdemocontainer/AdventureWorks2014_onprem.bak'
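If you don't fully trust the result, you can verify the backup straight from the URL as well – a small sketch using the same demo storage account and a hypothetical local instance name:

# Sketch: verify the backup that was just written to blob storage
$verifyTsql = @"
RESTORE VERIFYONLY
FROM URL = 'https://backupdemostorage.blob.core.windows.net/backupdemocontainer/AdventureWorks2014_onprem.bak';
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Query $verifyTsql -QueryTimeout 0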

I was very impressed with the speed of taking a SQL Server backup this way. Of course, the backup speed depends mostly on the network infrastructure and the connection to the internet – e.g. whether it is a DSL line or a T1 leased line (in my case it is a DSL line with 100 MB down / 50 MB up).
A ~43 MB transaction log backup took 12.7 seconds with my configuration.

BACKUP LOG successfully processed 13569 pages in 12.696 seconds (8.349 MB/sec).

To validate and compare some data, I also created a second transaction log backup directly to my local disk.

BACKUP LOG successfully processed 13569 pages in 1.746 seconds (60.652 MB/sec).

Which is much faster, of course, but also involves a degree of uncertainty. In this post I just wanted to give a short impression of the new backup feature of SQL Server 2016. Each of you has to decide for yourself whether you want (and will be able) to use this cool feature, or whether your customers won't let you implement this stuff.

Edit:
There is a little mistake in the Microsoft script above for creating the blob storage…
Line 38 says:
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys.[0].Value

which should be

$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value

My first attempt at a PowerShell-based implementation of this new SQL Server 2016 feature surprised me a little: it took me only 20 minutes to implement it and successfully run a backup.
