Aha effect when setting SQL instance parameters with dbatools

In the last week, I installed several SQL Servers for one customer, all of which had to be installed and configured identically. What could be more natural than doing this with a PowerShell script, so I made the “effort” to implement the individual steps of the SQL Server configuration with dbatools. I do not want to go into all the steps of the configuration in this post, but only show the part that gave me a specific “aha effect”.

As part of the SQL Server configuration, the default values of “MaxDoP” and “Cost Threshold for Parallelism” should be adjusted. Setting MaxDoP using PowerShell or dbatools is relatively easy, as there is a separate command for this; for the “Cost Threshold for Parallelism”, however, dbatools only offers a “work-around”. Unfortunately, there is no direct command for it (yet).

Set-DbaMaxDop -SqlInstance sql2008, sql2012

This command line sets the value of Max DOP (Maximum Degree of Parallelism) to the recommended value for the SQL Server instances SQL2008 and SQL2012. Closely related to this configuration parameter is the “Cost Threshold for Parallelism”, which is still set to 5 by default.

All SQL Server instances at once …

To configure all SQL Server instances relatively quickly and easily, I used the command “Get-DbaRegisteredServer” (an alias of Get-DbaRegServer). As preparation, I created the necessary servers (here two servers with three instances each) as “Registered Servers” in SQL Server Management Studio on all servers and was then able to access them with PowerShell aka dbatools.

According to the dbatools documentation, this command retrieves a list of SQL Server objects stored in locally registered groups, in Azure Data Studio, or on a central management server.

Original from dbatools.io – many thanks to Chrissy

With this command and the possibility to pass the objects from the result as a pipeline, you can do beautiful things like configuring the value for “MaxDoP” on all servers and instances in a single command line…

Get-DbaRegisteredServer | Set-DbaMaxDop

However, now to my aha effect with another command line 😉

Cost Threshold For Parallelism

As indicated above, the adjustment is only possible via a work-around and not with a dedicated dbatools command, so I now use “Set-DbaSpConfigure”. Of course, I could also configure “MaxDoP” with this command, but then I would first have to determine and calculate the respective MaxDoP value myself: determine the existing cores, match them against best practice, and pass the resulting value to the set command. I set the Cost Threshold to 40 on 98% of all instances (unless the customer or the application wants something else), so I do not need any logic here.
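
Just to sketch what that logic might look like without the dedicated command – a sketch, assuming dbatools’ Test-DbaMaxDop with its RecommendedMaxDop property, and that Set-DbaSpConfigure accepts “MaxDegreeOfParallelism” as the config name:

# Determine the recommended MaxDoP value for the instance ...
$recommendation = Test-DbaMaxDop -SqlInstance sql2012 | Select-Object -First 1

# ... and pass it to the generic configuration command
Set-DbaSpConfigure -SqlInstance sql2012 -Name 'MaxDegreeOfParallelism' -Value $recommendation.RecommendedMaxDop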

Following my logic above and the documentation, I tried it with the following command line:

Get-DbaRegisteredServer | Set-DbaSpConfigure -Name 'CostThresholdForParallelism' -Value 40

This command brought me an (at first sight) incomprehensible error (even my attempt to pass the value as a string was not successful):

WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server1\Instanz1 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server2\Instanz1 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server1\Instanz2 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server2\Instanz2 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server1\Instanz3 ( <-> )
WARNING: [13:02:23][Set-DbaSpConfigure] Value out of range for Server2\Instanz3 ( <-> )

Unfortunately, I have not found the reason for this yet; maybe someone else can enlighten me about this phenomenon… maybe it is by design, or even a “bug”…

However, I had been so successful with pipelining before that I used it here too… so let’s first determine all the SQL instances, then read the current value of “Cost Threshold For Parallelism” on those instances, and then set it to the new value of 40.

Get-DbaRegisteredServer | Get-DbaSpConfigure -Name 'CostThresholdForParallelism' | Set-DbaSpConfigure -Value 40
ComputerName  : Server1
InstanceName  : Instanz1
SqlInstance   : Server1\Instanz1
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server2
InstanceName  : Instanz1
SqlInstance   : Server2\Instanz1
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server1
InstanceName  : Instanz2
SqlInstance   : Server1\Instanz2
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server2
InstanceName  : Instanz2
SqlInstance   : Server2\Instanz2
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server1
InstanceName  : Instanz3
SqlInstance   : Server1\Instanz3
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

ComputerName  : Server2
InstanceName  : Instanz3
SqlInstance   : Server2\Instanz3
ConfigName    : CostThresholdForParallelism
PreviousValue : 5
NewValue      : 40

Once again I have discovered something great for myself and gained more experience in dealing with dbatools!

I love this PowerShell module, which allows me to customize, optimize, and automate many things (almost anything!) around SQL Server. I like to use it (as you can see in my other blog posts) and meanwhile do so with all my clients. THANKS to @Chrissy and the many other contributors who take the trouble to build this community tool!

Photo by Ben White on Unsplash

Desired State Configuration #3 – Details about SQL Server Installation with Murphy’s law

EDIT:
If you are looking for a ready-made solution, please be patient… I am explaining my project history here, how and when I proceeded… In the last blog post of this series I will present the “finished” solution.

Many thanks for your understanding!

In the following post, I would like to go a little deeper and show you how I customized my scripts in order to copy and prepare everything for my Desired State Configuration SQL Server rollout. After explaining and showing the preparations and fundamentals of my deployment in the first two blog posts, today I would like to explain the actual installation as I implemented it (there is also another way that I would like to try later – as you know, there is never time for that in such projects ;-))

As I wrote this post, my scripts were in a state you could call “working”. They are currently able to create all the necessary folders on one (or more) servers from a central server; they copy all the required files (ISO image, updates, PowerShell modules, etc.) to the target server; and, according to the specifications, they install a SQL Server 2017 at the latest patch level and configure it according to best-practice requirements.

As explained in the second part of the DSC series, I copied all the necessary files with my DSC configuration – including the current ISO image of SQL Server 2017 and the latest cumulative update – to the destination server. (In the picture, only the TARGET state was checked; no actions were necessary!)

With the script in that state, we can finally start the real setup… for that we need the following:

  • the ISO-Image (en_sql_server_2017_developer_x64_dvd_11296168.iso)
  • the latest patch (SQLServer2017-KB4466404-x64.exe)
  • the ConfigurationFile.ini
  • and the necessary PowerShell modules (e.g. dbatools, SecurityPolicyDsc, SqlServerDsc …)

In the file “ConfigData.psd1” I have written down all the necessary configuration variables, values, and roles needed during the setup. In my current example, all variables and values are identical for all servers, but they may differ in other environments.

@{
    AllNodes = @(
        @{
            NodeName                    = "*"
            PSDscAllowPlainTextPassword = $true
            PSDscAllowDomainUser        = $true
            SQLInstallPath              = "D:\SQL2017"
            InstanceName                = "MSSQLServer"
        },
        @{
            NodeName = "sqlvm01"
            Role     = "SqlServer"
        },
        @{
            NodeName = "sqlvm02"
            Role     = "SqlServer"
        },
        @{
            NodeName = "sqlvm03"
            Role     = "SqlServer"
        },
        @{
            NodeName = "sqlvm04"
            Role     = "SqlServer"
        }
    )
}
Dreams are broken - Photo by freestocks.org

Murphy has also struck

But from the beginning… On my test machines sqlvm01 and sqlvm02, I “played around” a little bit without really logging what I did and why… so what did I – for whatever reason – do locally on those two machines, in which sequence, and what made my scripts work??? Actually, I was very satisfied with how my scripts ran on the first two machines: SQL Server 2017 was installed and finally configured by Desired State Configuration. On sqlvm03, and later on sqlvm04, however, my scripts did not work as desired… All four SQL Servers were deployed identically in the same resource group, and I ran the same preparation scripts on all of them:

  • Network Discovery enabled
  • all disks attached and formatted
  • joined to the target domain

Then, from the domain controller, I pushed the configurations – as described before – to the target servers using DSC… but unfortunately, the Script resource did not like me… on 01/02 I was able to mount the ISO image without any problem. No matter what I did, no matter how I turned it, I could never get the ISO image mounted on 03/04, and therefore I was not able to access the drive in order to run “setup.exe”.
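
For reference, the mounting step itself is unspectacular; here is a minimal sketch of what my (lost) Script resource essentially did – Mount-DiskImage and Get-Volume are standard Storage cmdlets, and the paths follow the folder structure shown above:

# Mount the ISO image and determine the drive letter it was assigned
$image = Mount-DiskImage -ImagePath 'D:\SQL2017\en_sql_server_2017_developer_x64_dvd_11296168.iso' -PassThru
$driveLetter = ($image | Get-Volume).DriveLetter

# Start the unattended setup from the mounted image
Start-Process -FilePath "$($driveLetter):\setup.exe" -ArgumentList '/ConfigurationFile=D:\SQL2017\ConfigurationFile.ini' -Wait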

The new beginning – everything back to 0.25

Sometimes things go from bad to worse… of course, I did not have a backup of any script (whether working or not)… I had at least created some kind of versioned folder structure whenever I reached a “milestone”, but I had no proper backup and certainly had not put my work into version control. So it came as it had to come! I deleted the wrong folder with PowerShell and everything was gone; only what was still open in Visual Studio Code (3 files) could be saved.
Actually – no idea where the error came from – I deleted almost the entire C: drive… no more Internet browser, no more PowerShell, there was almost nothing left… it was simply easiest to rebuild the server (if, like me, you have no backup) => and it was only the domain controller 😮

However, I took these unpleasant events as an opportunity to clean up my entire development environment and choose a “slightly different” setup… actually, everything stays the same, except that I will no longer develop on the domain controller itself and my scripts will no longer be located on that server; I have already followed this approach in other projects. So I just set up a new Windows Server 2016 (Standard DS1 v2) and rolled out and configured the Active Directory and DNS roles. In addition, there is now a Windows 10 machine (Standard A4m v2) on which I will develop my scripts.

Learning from mistakes makes you stronger

In order to come closer to my goal, or rather the project’s goal of “automated installation of SQL Servers with Desired State Configuration”, I unfortunately had to start over again… but this time with a slightly different approach. 😉
My first lesson learned… create a backup of a “blank machine” to get back to the original server faster. My second lesson learned… do everything else with scripts (which I will publish in another blog post).

But back to the topic of Desired State Configuration – I kept my initial activities, copying the PowerShell modules and the installation media to the respective target server; what has also remained is checking whether the necessary .NET Framework is installed, and installing it otherwise. This time I oriented myself a bit (or a bit more) on Chris Lumnah (Blog), who has built a script based on an introduction to DSC from the Microsoft Virtual Academy and his own knowledge. I would like to adapt his script, since it does not work with the DSC Script resource but with the “proper” SQL Server DSC resource.
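
The .NET check mentioned above can also be expressed declaratively; a small sketch using the built-in WindowsFeature DSC resource (assuming a Windows Server target, where the feature is named NET-Framework-45-Core):

# Ensure the .NET Framework 4.5 feature is present before the SQL Server setup runs
WindowsFeature NetFramework45
{
    Name   = 'NET-Framework-45-Core'
    Ensure = 'Present'
}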

More to come in the next blog post, because I had to spend another day setting everything up again and starting over with my new DSC scripts…
Sorry, and thank you for your understanding!

Desired State Configuration #2 – Getting Started with SQL Server Installation

After my first post – Preparing to use Desired State Configuration (DSC) – we can now take the first steps towards installing a SQL Server. So what prerequisites do we need to install a SQL Server on a new Windows Server?

About my test environment… For developing my scripts, I am using a development environment in Azure consisting of at least two Windows servers. I use one server as a domain controller and the second machine as a SQL Server, to test whether my scripts are functional. I can and will roll out one or more additional servers as needed, to show the scripts in my demos or to develop and test them in a “larger” environment.

my dev environment for Desired State Configuration built in Azure – Overview – New Environment

Installation Requirements

To perform a simple SQL Server installation – in the sense of a customer project – on a new server, we need three things:

  • .NET Framework 4.5 (minimum)
  • the installation media for SQL Server 2017 Standard Edition
  • and the latest update, i.e. the most recent cumulative update

To deploy the installation media to the new server, we need at least one folder on some drive. Now there are several possibilities to provide this folder…

  • Manual creation of the folder
  • Creation of the folder with Desired State Configuration
  • Creation of the folder using PowerShell

Since this project and blog series strive to automate everything, manual creation of the folder is of course eliminated. Depending on how you want to proceed, which requirements you must or would like to meet, or in which order you want to work, you can now choose between Desired State Configuration and PowerShell. In this project, I would like to illustrate how to first create a directory on the target server and then get the installation media there with very simple steps.

Create the folder with PowerShell

To realize my project, theoretically I would first have to create a Desired State Configuration for the folder structure, implement and start the copy action, and then start the actual installation. I will try that in a second step; for now, I am starting with the “simpler” approach. For this I use “Invoke-Command” and check whether the folder exists on the target server; if not, the folder is newly created.

Invoke-Command -ComputerName $NodeName -ScriptBlock {
    param ($LocalInstallFolder)
    if (!(Test-Path -Path $LocalInstallFolder)) {
        # Folder does not exist yet - create it
        New-Item -ItemType Directory -Path $LocalInstallFolder | Out-Null
    } else {
        # Folder already exists - remove it and re-create it empty
        Remove-Item -Recurse -Force $LocalInstallFolder
        New-Item -ItemType Directory -Path $LocalInstallFolder | Out-Null
    }
} -ArgumentList "D:\$LocalInstallFolder\"

Why do I delete the destination directory first? If I run the script multiple times, the current installation media and the current update must be present afterwards; therefore, the directory is deleted once and then re-created. Once the target directories exist, the required files and directories can be copied.

Copy the installation media

Initially, I copied all my files from a network drive to the destination server with “Copy-Item”, but since I was developing quite a lot in the PowerShell ISE, I did not have a “progress bar”… so later I switched to “Start-BitsTransfer”.

Write-host "Copy SQL-Image to"$NodeName.ToUpper()
$DestinationPath = "\\$NodeName\d$\$LocalInstallFolder\"
Start-BitsTransfer -Source ..\SQL\* -Destination $DestinationPath -Description "..\SQL\* will be moved to $DestinationPath" -DisplayName "Copy SQL-Image" 
Start-BitsTransfer -Source ..\SQL\Updates\* -Destination "$DestinationPath\Updates\" -Description "..\SQL\Updates\* will be moved to $DestinationPath" -DisplayName "Copy SQL-Updates"

But that was somehow too much effort and too complicated for me, more like a “workaround”… but at least it worked 😉 I started the same way when copying the necessary PowerShell modules to the target server, but with every day and every further attempt to deal with Desired State Configuration, I learned new ways to approach the goal. With this learning, I changed over to copying the necessary files via Desired State Configuration.

Configuration CopyInstallationMedia
{
    Node $AllNodes.where{ $_.Role.Contains("SqlServer") }.NodeName
    {
        File InstallationFolder 
        {
            Ensure = 'Present'
            Type = 'Directory'
            SourcePath = "\\dc1\NetworkShare\SQL\"
            DestinationPath = "D:\SQL2017\"
            Recurse = $true
        }
    
        File PowershellModules 
        {
            Ensure = 'Present'
            Type = 'Directory'
            SourcePath = "\\dc1\NetworkShare\Modules\"
            DestinationPath = "C:\Windows\system32\WindowsPowerShell\v1.0\Modules\"
            Recurse = $true
        }
    }
}

Clear-Host
$OutputPath = "\\dc1\DSC-ConfigShare"
CopyInstallationMedia -ConfigurationData \\dc1\NetworkShare\scripts\configData.psd1 -OutputPath "$OutputPath\CopyInstallationMedia\"

Start-DscConfiguration -ComputerName sqlvm02 -Path \\DC1\DSC-ConfigShare\CopyInstallationMedia -Wait -Verbose -Force

Building my DSC configurations

For the sake of a better overview and simplified, step-by-step testing and traceability, I have split my Desired State Configuration into several sections. I can call each particular configuration specifically and individually and then “refine” or correct it.

Configuration CopyInstallationMedia
{
    Node $AllNodes.where{ $_.Role.Contains("SqlServer") }.NodeName
    {
        File InstallationFolder 
        {
            ...
        }

        File PowershellModules 
        {
            ...
        }
    }
}

Configuration SQLInstall
{
    ...
}

Configuration ConfigureSQL
{
    ...
}

Of course, the script also requires some parameters which cannot be ignored; these are defined at the beginning of the script. In my case, for the installation and the later configuration, I need at least the path where the installation media is stored centrally and the destination server to which the DSC configuration should be rolled out.

param
(
    # Path where all Install media will be located
    [Parameter(Mandatory=$true)]
    [String]
    $InstallPath,

    # Computer name to install SQL Server On
    [Parameter(Mandatory=$true)]
    [String]
    $ComputerName
)

Now that I have subdivided and defined the configuration, I can add the necessary scripts and modules and initiate the creation of the MOF files. Based on these MOF files, the actual configuration is then compared with the target configuration and corrected accordingly. Since something could have changed in the target configuration between the last time the MOF files were created and “now”, I always have the files re-created as a precaution, so that they can be rolled out directly to the target server. For a more detailed explanation of my script section… I call the respective configuration, assign it a configuration data file, and define a path for storing the MOF files. Finally, the respective target configuration is rolled out from the central storage location to the destination server.
As you can now understand, I am also able to execute the individual steps separately for debugging purposes, either manually one by one or only individual configurations, such as the mere configuration of a SQL Server.

Write-host "Starting DSC process on"$NodeName.ToUpper()
Import-Module $PSScriptRoot\ConfigureSQLServer.psm1 -Force

## Create MOF-Files
$OutputPath = "\\dc1\DSC-ConfigShare"
CopyInstallationMedia -ConfigurationData \\dc1\NetworkShare\scripts\configData.psd1 -OutputPath "$OutputPath\CopyInstallationMedia\"
SQLInstall -ConfigurationData \\dc1\NetworkShare\scripts\configData.psd1 -OutputPath "$OutputPath\SQLInstall\"
ConfigureSQL -ConfigurationData \\dc1\NetworkShare\scripts\configData.psd1 -OutputPath "$OutputPath\SQLConfig\"

## Use MOF-Files to establish desired state configuration
Start-DscConfiguration -ComputerName $Computername -Path \\DC1\DSC-ConfigShare\CopyInstallationMedia -Wait -Verbose -Force     
Start-DscConfiguration -ComputerName $Computername -Path \\DC1\DSC-ConfigShare\SQLInstall -Wait -Verbose -Force     
Start-DscConfiguration -ComputerName $Computername -Path \\DC1\DSC-ConfigShare\SQLConfig -Wait -Verbose -Force

This is what my folder and file structure looks like on the central server: there is a folder for each target configuration and a single MOF file for each target server.

Folder and File Structure for my first Desired State Configuration deployment

More about Desired State Configuration can, of course, be found in the Microsoft documentation.

Desired State Configuration – Simplify your SQL Server Administration with DSC

Two years ago, I first dealt with Desired State Configuration (DSC) when I created a demo for SQLSaturday #605 that continuously monitors and eventually corrects the configuration of a SQL Server. Now I have another lovely customer project, where I can choose between a complete implementation in PowerShell (via dbatools) or a little DSC in connection with dbatools.

Tasks in this scenario

At the push of a button, two new SQL Servers should be installed and configured to run multiple databases in Basic Availability Groups. So I started by expanding my Azure test environment, and here is how I began building that SQL Server environment in a fully automated way.

I will show you the following topics in the next few blog posts:

  • Preparation of the server
  • Installation of the SQL Server
  • Configuration of the SQL Server
  • Backup / Restore Automation
  • Creation of the BAGs from the backups
  • Activation and completion of the new server

Create preconditions for DSC

I will start here with the following assumptions:

The new servers are installed and fully configured at the operating-system level, joined to a domain, and have the current Windows Management Framework installed; ideally, Windows Remote Management (WinRM) is already installed and activated… if not, just install it before continuing.
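
If WinRM is not yet active on one of the machines, enabling it manually is quickly done; a minimal sketch using the standard cmdlet:

# Configure and start the WinRM service, including the default firewall exceptions
Enable-PSRemoting -Force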

Initially, I also wanted to join the servers to the domain with PowerShell, but I had a little trouble with that at the beginning (and then did it manually); I want to show you a screenshot of my initial problem. 😉

Ok, moving on… I wanted WinRM rolled out and configured automatically on all target servers, with the associated firewall ports opened… Google helped me a bit with the brainstorming, but my thoughts were already heading in the right direction => Domain Group Policy (aka Group Policies).

To do this, create a group policy in the Group Policy Management of your domain so that you can build a proper and traceable structure.

1.) Set up service

Add a new service and configure it

Computer Configuration > Preferences > Control Panel Settings > Services

2.) Allow access to Windows Remote Shell

At minimum, allow access to the service; should that not be enough, also allow remote shell access.

Computer Configuration > Policies > Administrative Templates > Windows Components > Windows Remote Management (WinRM) > WinRM Services

Now let’s continue with the holes in the firewall.

3.) Open the firewall for WinRM

This is the last step of the configuration, so that all servers in the domain are configured identically by this group policy and a SQL Server can be installed via DSC (Desired State Configuration) later on.

Computer Configuration > Policies > Windows Settings > Security Settings > Windows Firewall with Advanced Security > Windows Firewall with Advanced Security

Now create a new inbound rule, so that access from outside (here, from the domain controller to the SQL Server) is possible.

This rule is explicitly created for Windows Remote Management to allow all connections within the private domain network; the public network is deliberately left out as a precaution.
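
For a single server outside of group policy, an equivalent rule could also be created locally; a sketch (WinRM listens on TCP port 5985 for HTTP by default):

# Allow inbound WinRM (HTTP) traffic on the domain and private profiles only
New-NetFirewallRule -DisplayName 'WinRM (HTTP-In)' -Direction Inbound -Protocol TCP -LocalPort 5985 -Profile Domain, Private -Action Allow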

Now you can roll out this new group policy on all servers and should then be able to configure and administer all servers using DSC.

4.) Update Group Policy

Now bring the new server into the domain, and it should receive the newly created policies automatically. However, to make sure this is the case, you can manually update the group policies once on the server itself.
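
On the server itself this is a one-liner, and afterwards you can verify from the central server that WinRM answers (a sketch; sqlvm01 stands for one of the target servers):

# On the target server: apply the new group policies immediately
gpupdate /force

# From the central server: check that the WinRM endpoint is reachable
Test-WSMan -ComputerName sqlvm01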

Now nothing should stand in the way of configuring the servers with DSC; you can start to create configurations on the central administration server, roll them out, and see what happens. More about that in the next blog post. 😉
Source: How to Enable WinRM via Group Policy

Group Policy update due to errors in script execution

After I had finished the first post and was researching and testing for further posts as well as for the customer project, I unfortunately had to realize that, later in the script, I depend on WMI to determine specific values such as RAM and CPU. This is only possible if I can establish a connection from the central management server to the destination server. Unfortunately, the server’s firewall was missing a rule that allows inbound and outbound communication with the RPC server/service. So I had to make another adjustment to the group policy.
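
The kind of query I am talking about looks roughly like this (a sketch with Get-WmiObject, which uses DCOM/RPC and therefore needs that firewall rule; sqlvm01 again stands for a target server):

# Determine CPU and RAM of the target server remotely - this requires working RPC/WMI connectivity
Get-WmiObject -ComputerName sqlvm01 -Class Win32_ComputerSystem |
    Select-Object Name, NumberOfLogicalProcessors, TotalPhysicalMemory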

TSQL Tuesday #96: Community people who have influenced my way

T-SQL Tuesday is a recurring blog series started by Adam Machanic (b | t): each month a blogger hosts a SQL Server topic, and anyone can write a blog post on that particular topic. This month, Ewald Cress (b | t) is our host, and it is about the people who have influenced us in our lives with data in the community.

For a long time, I had no relation to any community… in my youth, I was a volunteer at the German Red Cross, and now I am on the road again with “you”.

Who influenced me in the last few years?

Actually, everything started with the PASSCamp 2013… ok, maybe a bit earlier, when I signed up for Twitter in mid-2012 and read a lot about a #sqlfamily.

Definitely worth mentioning is Brent Ozar, who has inspired me with his knowledge and blog posts for years and continues to enrich my daily DBA life. His posts on the topic of “Why should I run a blog?” are also among the reasons I keep mentioning. Unfortunately, I have not yet been able to meet him personally, but that is guaranteed to happen. His funny way of explaining complicated things is just great!

Then, in 2013, came my visit to the PASSCamp and thus my first direct contact with the German sqlfamily aka SQLPass. Here one can only list the usual suspects who are always present at such events => Oliver Engels, Tillmann Eitelberg, Kostja Klein, Niko Neugebauer and Andreas Wolter, just to name a few… I found the group dynamics awesome, and all that warmth between them… everyone knows each other, everyone laughs with each other, and everyone talks to everyone; no one is left behind!

I wanted to join in, somehow belong to them… but how?

Henning L. @hlhr_dev Jun 2 thanks to all the speakers and specially to @sql_williamd for this great event #SQLGrillen

So after this experience, I started to get more involved with PASS and its activities and found, among others, Cathrine Wilhelmsen, whose community activities also spurred me on and showed me how to start => more participation in local and national PASS activities. Then came SQLGrillen by William Durkin and Andre Kamann’s session on “PoSh meets Ola Hallengren”, as well as another meeting with Andre Essing, all of which motivated me to jump over my own shadow and become a speaker within PASS. During the following two SQLSaturdays (Rheinland and Munich 2016), I was on the road as a volunteer for the first time and was able to get a taste of the national activities of PASS Germany. As luck would have it, the SQLSaturday in Munich became the next step in my “community career” and brought me the Azure Meetup Hamburg.

In 2017, I gave my first public talks, some in-house, some in the PASS user group Hamburg, at SQLGrillen 2017 (thanks to William) and twice at SQLSaturday Rheinland 2017 (thanks to Olli, Tillmann and Kostja)… all this, in conjunction with my blog and Twitter activities, helped me to be awarded the MVP Award by Microsoft.

I would like to say thank you to the following people of the German #sqlfamily:

Oliver Engels
Tillmann Eitelberg
Kostja Klein
Andre Essing
William Durkin
Andre Kamann

A special thank-you goes to Gabi Münster for her support during my first public appearance; in the end, it is all about a “kick in the ass” to jump over one’s own shadow. Of course, many conversations and Twitter contacts with numerous other community members (Chrissy, Claudio, Rob, Dirk, Volker and, above all, Conny!) helped me to feel like a member of the (at least German) SQL family! THANK YOU SO MUCH! Further goals are already planned for 2018 😉

A very special thank-you goes to my team lead Thorsten Moeller, who supports me again and again in all my activities, and an even bigger “thank you” goes to my wife, who also supports these activities and always has my back!