Many services in Azure allow certain cost savings through automation, and this can also be achieved with the Platform-as-a-Service "Azure SQL Database"; that is what this blog post is all about. It is not quite as simple as with, for example, Azure Analysis Services, because there is actually no pause-resume functionality. This is where the backup comes into play, which is what I'll talk about first.
Azure SQL Database and the Backup topic
Before we dive deeper into Azure SQL Database and its pause-resume functionality, we first have to take a brief look at the topic of backup, which is not unimportant in this context. A big advantage of Azure SQL Database compared to a SQL Server (on-premises or IaaS) is in any case: you do not have to explicitly worry about the backup, since it is created automatically. Depending on the database size and the number of data changes, Microsoft automatically backs up the databases at regular intervals! That means the costs for extra backup software, or even for developing your own services, are already gone. Depending on the performance level, there are different backup retention times:
- at the Basic level, the retention time is 7 days.
- at the Standard level, the retention time is 35 days.
- at the Premium level, the retention time is 35 days.
When will my database be backed up? There is an answer to that, too... The full backup takes place every week, unfortunately at non-fixed times; differential backups are generally created every few hours; and depending on the data volume, the transaction logs are saved every 5-10 minutes. A first full backup is created within the first 30 minutes after the database is created. These backups are kept according to the service tier (see above), which is another cost saving: you do not need an extra storage account, because the backup is already included in the price. If you want to keep your backups for more than 35 days, you have the option of activating a long-term retention backup. This requires an additional storage account, where the backups are stored in parallel and permanently.
Pause and Resume for cost savings
This functionality is unfortunately not available with Azure SQL Database... So how can I still benefit from cost savings if I would like to use this Platform-as-a-Service? Of course, as already explained in another blog post, by scaling the database performance to handle occurring load peaks. But we would really like to achieve certain cost savings with the migration to the cloud... If a department only works during the day (8am-8pm), then I do not need the database(s) during the night... So why can't you stop those services, given that you only pay while the database is online?
For this scenario, where the department needs the database only during the day, there is actually no built-in solution, but there is a workaround: dropping the database in the evening and restoring it the next morning via "Create Database from Backup". Microsoft has implemented this procedure in an extremely pleasant way, and it does not mean a lot of effort.
Please note that you only delete/remove the database and not the logical server, because the backup history is stored on that logical SQL server. In order to restore the database, you'll need the deletion date of the dropped database. When you restore the database on the following morning, you'll use exactly this information to perform the restore. In PowerShell, you can combine these activities very easily.
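The evening half of the workaround, dropping the database, is a single cmdlet call. A minimal sketch, assuming the AzureRM module is installed and the variables $resourcegroupname, $servername and $databasename are already set:

```powershell
# Remove only the database; the logical server stays and keeps the backup history needed for the restore
Remove-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname -ServerName $servername -DatabaseName $databasename
```

The next morning, the deleted database can then be restored from exactly this state.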
$deleteddatabase = Get-AzureRmSqlDeletedDatabaseBackup -ResourceGroupName $resourcegroupname -ServerName $servername #-DatabaseName $databasename
$deleteddatabase
# Do not continue until the cmdlet returns information about the deleted database.
Restore-AzureRmSqlDatabase -FromDeletedDatabaseBackup `
-ResourceGroupName $resourcegroupname `
-ServerName $servername `
-TargetDatabaseName $databasename `
-ResourceId $deleteddatabase.ResourceID `
-DeletionDate $deleteddatabase.DeletionDate `
-Edition "Standard" `
-ServiceObjectiveName "S0"
Björn works in Hamburg as Senior Consultant – Microsoft Data Platform and Cloud at Kramer & Crew. He regularly participates in the PASS regional group meetings in Hamburg and in PASS events such as SQLSaturday and DataGrillen, and he organises the Azure Meetup in Hamburg. Besides topics such as SQL Server, PowerShell and Azure, he is interested in science fiction, snowboarding, baking and cycling.
T-SQL Tuesday is a recurring blog party started by Adam Machanic (b | t). Each month a blog hosts the party, and everyone who wants to can write a blog post about a specific subject.
My post about “to the cloud… and beyond!!!” will be about how to use the Azure Cloud as a Backup Storage and how to implement this with SQL Server 2016.
However, I won't write about the general need for SQL Server backups, why backups are so important, or how to create them. Instead, I would like to explain the various ways a SQL Server can back up into the Azure cloud and how PowerShell might help us.
One important point I would like to address briefly, because you might not always think about it when creating a solution: working backups are absolutely necessary and very important, and there are several ways to create secure backups and have them available when you need them. Normally, the fastest and easiest solution is a local "backup-to-disc", but what about those backups if that disk fails? More failsafe is a "backup-to-share", better known as backup to a network device (where the network share should, of course, be configured as a RAID). But we all know small companies which are not able to host large file servers where the last ~5 days of SQL Server backups can be stored in addition. What could be easier than storing these backups on another ("cheaper") storage box like an Azure Storage Account?
Options of SQL Server in combination with Azure
Since SQL Server 2012, Microsoft has provided the possibility of "SQL Server Backup to URL", which creates the various kinds of SQL Server backups as usual. But this way the backup isn't located on a local disc or a local file share; it is pushed to a URL within the Azure cloud. This feature was optimized in SQL Server 2016 with additional options (block blobs, Shared Access Signatures and striping) to create faster and more reliable backups in the cloud.
With SQL Server 2014, Microsoft introduced "SQL Server Managed Backup to Microsoft Azure". This feature enables a simplified and faster operation for the DBA, because SQL Server decides itself when to create a backup. The DBA does not have to worry about the backup strategy and backup scripts. Of course, the options allow the DBA to intervene, but it is probably even simpler with the on-board tools of SQL Server. For SQL Server instances running in Azure, using this feature is recommended.
New in SQL Server 2016 is the feature "File-Snapshot Backups for Database Files in Azure". It allows a SQL Server 2016 instance that runs in Azure to create a very, very fast backup of its data files and even perform a quick restore.
SQL Server Backup and Restore with Azure Blob Storage
What do we need for deployment and connectivity of our SQL Servers to an Azure Blob storage?
Since PowerShell has become more and more important to SQL Server with the 2012 and 2014 releases, I would like to contribute a little guide here. First, we have to start with the installation of the Azure plugin for PowerShell. Instructions and a download link can be found here => https://azure.microsoft.com/de-de/documentation/articles/powershell-install-configure/
Of course, you'll need an account for the Azure cloud (there is a trial account with 170 euro credit), for example as part of a Visual Studio licence, an MPN membership or as a BizTalk customer.
Create an Access Policy and a Shared Access Signature
To use this script, we need to edit the following information in the script first.
Get-AzureSubscription | Format-Table
"North Europe" is the best location for Germany; you'll have to choose the country/datacenter that fits you best. But to be able to select "North Europe", we first have to log into our Azure account, where we can display a list of available locations.
Login-AzureRmAccount
Get-AzureRmLocation | Format-Table Location, DisplayName
Location DisplayName
-------- -----------
[...]
southcentralus South Central US
northeurope North Europe
westeurope West Europe
japanwest Japan West
[...]
With these values inserted, executing the script should complete with a success message and produce a T-SQL statement. That T-SQL statement can then easily be executed on the SQL Server which should be backed up. With
SELECT * from sys.credentials
you can check if your new credentials were created successfully if you don’t trust the message in SQL Server Management Studio. 😉
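For reference, such a credential for backup-to-URL with a Shared Access Signature looks roughly like this; storage account name, container and the SAS token are placeholders, not the actual output of the script:

```sql
-- Placeholder values: use your own storage account, container and SAS token
-- (note: the SAS secret must be specified without the leading question mark)
CREATE CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<your SAS token>';
```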
Now we are able to create our first backup directly into Azure Storage. For this purpose, just run a backup-to-URL (replace all values appropriately):
BACKUP DATABASE AdventureWorks2014 TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2014_onprem.bak'
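Since SQL Server 2016 also supports striping a backup over multiple URLs (one of the optimizations mentioned above), a larger backup can be spread across several blobs. A hedged sketch with placeholder names, assuming the SAS credential for the container already exists:

```sql
-- Striped backup to two blobs in the same container (placeholder names)
BACKUP DATABASE AdventureWorks2014
TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2014_stripe1.bak',
   URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/AdventureWorks2014_stripe2.bak'
WITH COMPRESSION;
```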
I was very impressed with the speed of taking a SQL Server backup this way. Of course, the backup speed mostly depends on the network infrastructure and the connection to the internet, e.g. whether it is a DSL line or a leased line (in my case it is a DSL line with 100 Mbit down / 50 Mbit up).
A transaction log backup of ~43 MB took 12.7 seconds with my configuration.
That is quite fast, of course, but it also involves a degree of uncertainty. In this post, I just wanted to give a short impression of the new backup features of SQL Server 2016. Each one of you has to decide for yourself whether you want (and will be able) to use this cool feature, or whether your customers won't let you implement it.
Edit:
There is a little mistake in Microsoft's scripts for creating the blob storage...
Line 38 says:
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys.[0].Value
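The stray dot in front of the indexer breaks the expression; a corrected version of that line (same variable names as in Microsoft's script) would be:

```powershell
# Index into the key array first, then read its Value property
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value
```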
My first attempt at this PowerShell-based implementation of a new SQL Server 2016 feature surprised me a little: it took me only 20 minutes to implement it and successfully run a backup.