Azure SQL VM – no way to configure the disk usage

Over the last few days, a customer asked me to help him optimize a SQL Server on an Azure VM. The customer had always had slight performance problems with this SQL Server, so he first asked the application vendor and its support, but they either could not or did not want to help. The only answer they received was that a disk latency of 5 ms is recommended for the DATA disks of the SQL Server… Officially, in Azure, this can only be achieved with Ultra disks. In this blog post, I don't want to detail how that requirement might be met; I just want to show you an error that I noticed during the analysis.

As a small intro: I talked to the customer and showed him my calculation of what it would cost to equip the server with 2 TB Ultra disks, and we discussed whether it might not be more reasonable to first put the SQL Server through its paces and optimize it (including the application and its T-SQL), and only move to Ultra disks as a last step if necessary.

The customer agreed, and we took the SQL Server under review. Even though they are using a marketplace image from Microsoft (thanks to the product team!), which is already configured very well, there are usually a few more screws to adjust.

Azure SQL VM and the data disks

If you follow the guidance for Azure SQL in a virtual machine, you will find the following recommendation:

Stripe multiple Azure data disks using Storage Spaces to increase I/O bandwidth up to the target virtual machine’s IOPS and throughput limits.

https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices-storage

In a first step, we replaced the 1 TB disk with several smaller striped disks to increase the maximum throughput from 200 MB/s to 625 MB/s. This was accomplished without any downtime, and the customer was delighted. During this activity, however, I also came across a message in the Azure portal on this virtual machine, in the SQL Server configuration overview, that confused me a little; Google wasn't much help either.
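For anyone who wants to reproduce the striping part: with Storage Spaces, this boils down to a handful of PowerShell commands. The following is only a minimal sketch, assuming the new data disks are already attached and empty; the pool and volume names, the drive letter, and the 64 KB interleave/allocation unit size are placeholders of mine, not the customer's exact values.

# Pool all attached data disks that are not yet in use (assumption: only the new SQL data disks are poolable)
$disks = Get-PhysicalDisk -CanPool $true

# Create a storage pool and a simple (striped) virtual disk across all of them
New-StoragePool -FriendlyName "SQLDataPool" -StorageSubSystemFriendlyName "Windows Storage*" -PhysicalDisks $disks
New-VirtualDisk -StoragePoolFriendlyName "SQLDataPool" -FriendlyName "SQLData" `
    -ResiliencySettingName Simple -NumberOfColumns $disks.Count -Interleave 64KB -UseMaximumSize

# Initialize, partition, and format with a 64 KB allocation unit size (a common recommendation for SQL data files)
Get-VirtualDisk -FriendlyName "SQLData" | Get-Disk |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -DriveLetter F -UseMaximumSize |
    Format-Volume -FileSystem NTFS -AllocationUnitSize 64KB -NewFileSystemLabel "SQLData"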

In the Azure portal, there is the possibility to roll out several extensions for SQL Servers running in Azure VMs. These extensions are intended to support the admin team in managing the VM or the SQL Server via PowerShell, the Azure CLI, or the portal, and to provide an overview of data from the logs and metrics. Microsoft uses the "SQL Server IaaS Extension Query Service" or "SQL Server IaaS Agent extension" for this purpose. Both collect data from the virtual machine and the SQL Server and display it in a user-friendly way in the portal. Unfortunately, that was exactly the problem with this customer VM: these metrics were not fully available, as the following screenshot shows.

Azure SQL VM - No data in the portal for disk usage

In the subsequent "Configure" overview, no data on the storage could be determined either, although all extensions were rolled out on the server; even after one or more reboots, no data was displayed. So I went looking for the cause…
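Before digging into the local logs, a quick check is whether the VM is registered with the IaaS Agent extension at all and in which management mode – here it was registered, so this alone would not have revealed the problem, but it is a cheap first step. A minimal sketch with the Az.SqlVirtualMachine module, assuming you are connected via Connect-AzAccount; the resource group and VM names are placeholders:

# Show the SQL virtual machine resource that the IaaS Agent extension registers (names are placeholders)
Get-AzSqlVM -ResourceGroupName 'rg-customer-sql' -Name 'sqlvm-prod-01' | Format-List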

Extension-Log and fn_trace_gettable

As the Microsoft documentation states, there are different modes for different requirements, and there is also a local error log that you can use for troubleshooting. This error log consists of an XE file and a trace file, neither of which can be read with Notepad, only with SQL Server Management Studio… so the easiest way is to access the trace file with a T-SQL system function, which revealed that the system user apparently has no or too few permissions. Usually, the extension is located directly on the C drive in the directory C:\SQLIaaSExtension, so you have to use this path accordingly in the SQL query.

SELECT * FROM fn_trace_gettable('C:\SQLIaaSExtension\Log\log.trc', default);  
GO  

By executing this T-SQL statement to examine the extension log file, I found, among other things, the following line:

alter server role [sysadmin] add member [NT Service\SQLIaaSExtensionQuery]

When searching for this login, I discovered that the customer had deleted it – for whatever reason (later I found out that this apparently did not happen "by accident"; the login is also missing on other SQL Servers in Azure).

USE [master]
GO

/****** Object:  Login [NT SERVICE\SqlIaaSExtensionQuery]    Script Date: 5/10/2021 3:35:08 PM ******/
DROP LOGIN [NT SERVICE\SqlIaaSExtensionQuery]
GO

/****** Object:  Login [NT SERVICE\SqlIaaSExtensionQuery]    Script Date: 5/10/2021 3:35:08 PM ******/
CREATE LOGIN [NT SERVICE\SqlIaaSExtensionQuery] FROM WINDOWS WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[us_english]
GO

ALTER SERVER ROLE [sysadmin] ADD MEMBER [NT SERVICE\SqlIaaSExtensionQuery]
GO

As a test, I recreated this login, granted it the permissions, and checked in the Azure portal whether anything changed in the display… Yes, the SQL Server IaaS Agent extension can now access the SQL Server again and read out the necessary and relevant data.

Azure SQL VM - Data in the portal for disk usage

Azure SQL – Change your Skills – DataSaturday #6 Malta

Yesterday I was able to give my talk as part of DataSaturday #6 in Malta. It was about the changes in the life of a DBA when he is confronted with the fact that his SQL Servers and/or databases are to be migrated to Microsoft Azure. With this session, I wanted to reduce the fears and show that Azure SQL is "only" a SQL Server or a SQL database.

In contrast to other events, Dennes (and his team) had come up with something new. The session did not have 60 minutes but 90 minutes. The first 15 minutes were used as an interview or introduction: the host introduced the speaker, a small video was produced for this, and individual questions were prepared based on the speaker's curriculum vitae. Then the talk itself took place over a full 60 minutes, followed by another 15 minutes for Q&A from the participants, or the host could ask the relevant questions himself. A successful format that made the talk feel more like a conversation, reduced the "pressure" a little, and made me (and hopefully also the participants) feel more part of it.

My host was Deepthi Goguri; I had a lot of fun holding this session with her. Even if this wasn't my first talk in English, I am always a little more nervous… will I find the right words at the right moment, will I pronounce everything correctly… Thanks to Deepthi's calm and friendly nature, I became more relaxed and could prepare myself a little better for the session. That didn't change the fact that I wanted to tell and show more than I had time for; I urgently have to work on that! Two years ago, I talked to Rob Sewell about wanting to leave my comfort zone => not only giving talks in German but starting in English – and asked whether he could support me in getting started, as a mentor, so to speak. And then Corona came, and everything turned out differently… I just jumped in and dared… which brings us back to the topic. 😉

It is just a SQL Server - don't be afraid of moving your databases into Azure

Just dare and try

Deepthi asked me yesterday what I would recommend to the participants if their client or employer wanted to start migrating to the cloud. I can only recommend that anyone interested should not be afraid just because the SQL Server is no longer in your own data center but now in a Microsoft data center. This does not mean that Microsoft will take over all the work and that you – as the database administrator – have nothing left to do… OK, you have to learn something new and think a little bit differently, but it's still a SQL Server with its databases!
The first steps could be the following:

  • create an Azure Free Account
  • just deploy a free Linux VM and try out how to install SQL Server there 
  • or deploy an Azure SQL Database to see which steps you have to take to achieve this (see the sketch after this list) 
  • and of course, all the great resources of Microsoft Learn – Learning Paths and Modules with lots of content and exercises
    Azure Fundamentals – AZ-900
    Azure Data Fundamentals – DP-900
  • If you have additional questions about particular topics, you can also read the handy documentation, which often helps me better understand specific issues.
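To give an idea of how few steps the Azure SQL Database experiment actually needs, here is a minimal sketch with the Az PowerShell module. It assumes you are logged in with Connect-AzAccount; the resource group, server, and database names are placeholders of mine, and the logical server name has to be globally unique.

# Create a resource group, a logical SQL server, and a small database (all names are placeholders)
New-AzResourceGroup -Name 'rg-sql-lab' -Location 'westeurope'

$cred = Get-Credential -Message 'SQL admin login for the logical server'
New-AzSqlServer -ResourceGroupName 'rg-sql-lab' -ServerName 'sql-lab-demo-01' -Location 'westeurope' -SqlAdministratorCredentials $cred

New-AzSqlDatabase -ResourceGroupName 'rg-sql-lab' -ServerName 'sql-lab-demo-01' -DatabaseName 'LabDB' -Edition 'Basic'

Afterwards, don't forget to delete the resource group again so the free account credit is not burned unnecessarily.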

Various learning guides for Azure SQL available

Microsoft – mainly in the persons of Anna Hoffman and Bob Ward – also provides many interactive learning materials in the form of workshops, labs, and YouTube videos! There is material for several weeks of continuous learning about SQL Server and/or Azure SQL… regardless of whether it is Azure SQL Database, the Managed Instance, or a SQL Server VM in Azure. You just have to start and try it out; don't be afraid… it's not magic. It's actually quite simple because you can put everything together yourself as you need it. 😉

  • Microsoft Learn: Azure SQL fundamentals learning path – aka.ms/azuresqlfundamentals
  • Select the Azure SQL workshop – aka.ms/sqlworkshops
  • How to choose the right tool – aka.ms/chooseazuresql
  • Azure SQL documentation – aka.ms/azuresqldocs
  • More videos from the team – aka.ms/dataexposed

Change your Skills - Learning - Resources Overview

Here are my slides from this talk:
Azure SQL – Change your skills to become a cloud DBA

And if you want to read something about Azure SQL Database, you can, of course, either do so here in my blog or in the relatively new blog post on Azure SQL by Deepthi. 😉

Aha effect when setting SQL instance parameters with dbatools

Last week, I installed several SQL Servers for one customer and had to install and configure them all identically. What could be more obvious than doing this with a PowerShell script? So I made the "effort" to implement the individual steps of the SQL Server configuration with dbatools. I do not want to go into all the steps of the configuration in this post, but only show a part of it that gave me a certain "aha effect".

As part of the SQL Server configuration, the default values of "MaxDoP" and "Cost Threshold for Parallelism" should be adjusted. Setting MaxDoP using PowerShell or dbatools is relatively easy, as there is a dedicated command for this; for the "Cost Threshold for Parallelism", however, there is only a "work-around" with dbatools – unfortunately, there is no dedicated command for it (yet).

Set-DbaMaxDop -SqlInstance sql2008, sql2012

This command line sets the value of MaxDoP (Maximum Degree of Parallelism) to the recommended value for the SQL Server instances sql2008 and sql2012. Closely related to this configuration parameter is the "Cost Threshold", which is still set to 5 by default.

All SQL Server instances at once …

To configure all SQL Server instances relatively quickly and easily, I used the command "Get-DbaRegisteredServer" (an alias of Get-DbaRegServer). As preparation, I created the necessary servers (here 2 servers with 3 instances each) as "Registered Servers" in SQL Server Management Studio and was then able to access them with PowerShell aka dbatools.
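If you don't want to click through SSMS for that, the registration itself can presumably also be done with dbatools. A minimal sketch, assuming the Add-DbaRegServer command writes to the local registered server groups; the instance names are the ones from this example:

# Register the instances in the local registered server store (instance names from this example)
'Server1\Instanz1', 'Server1\Instanz2', 'Server1\Instanz3',
'Server2\Instanz1', 'Server2\Instanz2', 'Server2\Instanz3' |
    ForEach-Object { Add-DbaRegServer -ServerName $_ }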

According to the dbatools documentation, Get-DbaRegServer retrieves a list of SQL Server objects stored in locally registered groups, both in Azure Data Studio and on a central management server.

Original from dbatools.io - many thanks to Chrissy

With this command and the possibility to pass the objects from the result down a pipeline, you can do beautiful things, such as configuring the value for "MaxDoP" on all servers or instances in a single command line…

Get-DbaRegisteredServer | Set-DbaMaxDop

However, now to my aha effect with another command line 😉

Cost Threshold For Parallelism

As indicated above, this adjustment is only possible as a work-around, not with a dedicated dbatools command, so I now use "Set-DbaSpConfigure". Of course, I could also configure "MaxDoP" with this command, but then I would first have to determine and calculate the respective value for MaxDoP – determine the existing cores, match them against the best practice, and pass the resulting value to the set command in a variable. I set the cost threshold to 40 on 98% of all instances (unless the customer or the application wants something else), so I do not need any logic here.
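As a side note: for the MaxDoP case, dbatools can do that calculation for you – Test-DbaMaxDop shows the current and the recommended value, which you could then feed into Set-DbaSpConfigure if you wanted to go that route. A minimal sketch with one of the instances from above:

# Show the current and the recommended MaxDoP for an instance (instance name from the example above)
Test-DbaMaxDop -SqlInstance 'Server1\Instanz1'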

Following my logic above and the documentation, I tried the following command line:

Get-DbaRegisteredServer | Set-DbaSpConfigure -Name 'CostThresholdForParallelism' -Value 40

This command gave me an (at first sight) incomprehensible error (even my attempt to pass the value as a string was not successful):

WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server1\Instanz1 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server2\Instanz1 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server1\Instanz2 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server2\Instanz2 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server1\Instanz3 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server2\Instanz3 ( <-> )

Unfortunately, I have not found the reason for this; maybe someone else can bring me closer to this phenomenon… maybe it is by design or even a "bug"…

However, I have been so successful with pipelining before that I used it here too… so let's first get all the SQL instances, then retrieve the current value of "Cost Threshold For Parallelism" on those instances, and then set it to the new value of 40.

Get-DbaRegisteredServer | Get-DbaSpConfigure -Name 'CostThresholdForParallelism' | Set-DbaSpConfigure -Value 40
ComputerName  : Server1
InstanceName  : Instanz1
SqlInstance   : Server1\Instanz1
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server2
InstanceName  : Instanz1
SqlInstance   : Server2\Instanz1
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server1
InstanceName  : Instanz2
SqlInstance   : Server1\Instanz2
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server2
InstanceName  : Instanz2
SqlInstance   : Server2\Instanz2
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server1
InstanceName  : Instanz3
SqlInstance   : Server1\Instanz3
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server2
InstanceName  : Instanz3
SqlInstance   : Server2\Instanz3
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

So once again I have found out something useful for myself and am richer by one more experience in dealing with dbatools!

I love this PowerShell module, which allows me to customize, optimize, and automate many things (almost anything!) around SQL Server. I like to use it (as you can see in my other blog posts), and meanwhile I use it with all my clients. THANKS to Chrissy and the many other contributors who take the trouble to build this community tool!

Photo by Ben White on Unsplash

Desired State Configuration #3 – Details about SQL Server installation with Murphy's law

EDIT:
If you are looking for a ready-made solution, please be patient… In this series I explain my project history – how I proceeded, when, and why… In the last blog post of the series, I will present the "finished" solution.

Many thanks for your understanding.

In the following post, I would like to go a little deeper and show you how I customized my scripts in order to copy and prepare everything for my Desired State Configuration SQL Server rollout. After explaining and showing the preparations and fundamentals of my deployment in the first two blog posts, today I would like to explain the actual installation as I implemented it (there is also another way that I would like to try later – as you know, there is never time for that in such projects ;-))

My scripts are in a state that you can call "working" as I write this post. They are currently able to create all the necessary folders on one (or more) servers from a central server, copy all the required files (ISO image, updates, PowerShell modules, etc.) to the target server, then install SQL Server 2017 including the latest patch level according to the specifications, and finally configure this SQL Server according to best-practice requirements.

As explained in the second part of the DSC series, I have already copied all the necessary files with my DSC configuration – including the current ISO image of SQL Server 2017 and the last cumulative update – to the destination server. (In the picture, only the TARGET state was checked; no actions were necessary!)

With the current state of the script, we can now finally start with the actual setup… for that we need the following:

  • the ISO-Image (en_sql_server_2017_developer_x64_dvd_11296168.iso)
  • the latest patch (SQLServer2017-KB4466404-x64.exe)
  • the ConfigurationFile.ini
  • and the necessary PowerShell modules (e.g. dbatools, SecurityPolicyDsc, SqlServerDsc, …)

In the file "ConfigData.psd1" I have written down all the necessary configuration variables, values, and roles that are needed during the setup. In my current example, all variables and values are identical for all servers, but they may differ in other environments.

@{
    AllNodes = @(
        @{
            NodeName                    = "*"
            PSDscAllowPlainTextPassword = $true
            PSDscAllowDomainUser        = $true
            SQLInstallPath              = "D:\SQL2017"
            InstanceName                = "MSSQLServer"
        },
        @{
            NodeName = "sqlvm01"
            Role     = "SqlServer"
        },
        @{
            NodeName = "sqlvm02"
            Role     = "SqlServer"
        },
        @{
            NodeName = "sqlvm03"
            Role     = "SqlServer"
        },
        @{
            NodeName = "sqlvm04"
            Role     = "SqlServer"
        }
    )
}
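For context, this is roughly how such a configuration data file is consumed when compiling the MOF files – a minimal sketch; "SqlServerInstallation" stands in for whatever the actual Configuration in my scripts is called.

# Load the configuration data and compile the MOF files for all nodes defined in it
$configData = Import-PowerShellDataFile -Path '.\ConfigData.psd1'
SqlServerInstallation -ConfigurationData $configData -OutputPath '.\MOF'

# Push the compiled configuration to the target nodes
Start-DscConfiguration -Path '.\MOF' -Wait -Verbose -Force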
Dreams are broken - Photo by freestocks.org

Murphy has also struck

But from the beginning… On my test machines sqlvm01 and sqlvm02 I had "played around" a little without really logging what I did and why… so what did I do – for whatever reason – on those two machines locally, in which sequence, and what made my scripts work??? On sqlvm03 and later on sqlvm04, my scripts did not work as desired… Actually, I was very satisfied with how my scripts ran on the first two machines: SQL Server 2017 was installed and finally configured by Desired State Configuration. All four SQL Servers were deployed identically in the same resource group, and I ran the same preparation scripts on all of them:

  • Network Discovery enabled
  • all disks were attached and formatted
  • joined the target domain

And then, from the domain controller, I pushed the configurations – as described before – to the target servers using DSC… but unfortunately, the Script resource did not like me… on 01/02 I was able to mount the ISO image without any problem. No matter what I did, no matter how I turned it, I could never get the ISO image mounted on 03/04, and therefore I wasn't able to access the drive in order to run "setup.exe".
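To illustrate what I mean by the Script resource approach, here is a minimal sketch of the kind of resource I was using – not my exact code; the paths, the marker file, and the configuration file name are placeholders based on the file list above.

# Sketch of a DSC Script resource that mounts the ISO and runs setup.exe (to be placed inside a Node block)
Script InstallSqlFromIso
{
    TestScript = {
        # A simple (placeholder) marker file decides whether the setup has already run
        Test-Path 'D:\SQL2017\setup_done.txt'
    }
    SetScript  = {
        $iso   = 'D:\SQL2017\en_sql_server_2017_developer_x64_dvd_11296168.iso'
        $image = Mount-DiskImage -ImagePath $iso -PassThru
        $drive = ($image | Get-Volume).DriveLetter

        Start-Process -FilePath "$($drive):\setup.exe" `
            -ArgumentList '/ConfigurationFile=D:\SQL2017\ConfigurationFile.ini' -Wait

        Dismount-DiskImage -ImagePath $iso
        New-Item -Path 'D:\SQL2017\setup_done.txt' -ItemType File | Out-Null
    }
    GetScript  = {
        @{ Result = (Test-Path 'D:\SQL2017\setup_done.txt') }
    }
}

Exactly this Mount-DiskImage step is what refused to work on sqlvm03 and sqlvm04.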

The new beginning – everything back to 0.25

Sometimes things go from bad to worse… of course, I didn't have a backup of any script (whether working or not)… I had at least created some kind of versioned folder structure whenever I reached a "milestone", but I did not have a proper backup and certainly had not put my work into version control. So it came as it had to come! I deleted the wrong folder with PowerShell, and everything was gone; only what was still open in Visual Studio Code (3 files) could be saved.
Actually – no idea where the error came from – I had deleted almost the entire C drive… no more internet browser, no more PowerShell, there was almost nothing left… so the easiest thing (if, like me, you have no backup) was to rebuild the server => it was only the domain controller 😮

I took these unpleasant events as an opportunity to clean up my entire development environment and set it up "a little differently"… actually, everything is the same, except that I will no longer develop on the domain controller itself and my scripts will no longer be located on that server; in other projects, I have already gone this way. So I set up a new Windows Server 2016 (Standard DS1 v2) and rolled out and configured the Active Directory and DNS roles. In addition, there is now a Windows 10 machine (Standard A4m v2) on which I will develop my scripts.

Learning from mistakes makes you stronger

In order to get closer to my goal – or the project's goal of an "automated installation of SQL Servers with Desired State Configuration" – I unfortunately had to start over again… but this time with a slightly different approach. 😉
My first lesson learned… create a backup of a "blank machine" to get back to the original server faster. My second lesson learned… do everything else with scripts (which I will publish in another blog post).

But back to the topic of Desired State Configuration – I am sticking with my initial activities: copying the PowerShell modules and the installation media to the respective target server; what has also remained is checking whether the necessary .NET Framework is installed and, if not, installing it. This time I oriented myself a bit (or a bit more) towards Chris Lumnah (Blog), who has built a script based on an introduction to DSC from the Microsoft Virtual Academy and his own knowledge, which I would like to adapt, since it does not work with the DSC Script resource but with the "proper" SQL Server DSC resource.
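To show the direction this is heading, here is a minimal sketch of the SqlSetup resource from the SqlServerDsc module – not my finished configuration; the configuration name, the paths, the sysadmin group, and the node name are placeholders.

# Sketch: installing the database engine with the SqlSetup resource instead of a Script resource
Configuration InstallSqlServer
{
    Import-DscResource -ModuleName SqlServerDsc

    Node 'sqlvm03'
    {
        SqlSetup 'InstallDefaultInstance'
        {
            InstanceName        = 'MSSQLSERVER'
            Features            = 'SQLENGINE'
            SourcePath          = 'D:\SQL2017\Media'       # extracted or mounted installation media
            SQLSysAdminAccounts = @('CONTOSO\SqlAdmins')    # placeholder AD group
            UpdateEnabled       = 'True'
            UpdateSource        = 'D:\SQL2017\Updates'      # folder containing the latest CU
        }
    }
}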

More to come in the next blog post, because I had to spend another day setting everything up again and starting over with my new DSC scripts…
Sorry, and thank you for your understanding!