TSQL Tuesday #96: Community people who have influenced my way

T-SQL Tuesday is a recurring blog series started by Adam Machanic (b | t): every month a blogger hosts a SQL Server topic, and anyone can write a blog post on that topic. This month, Ewald Cress (b | t) is our host, and it’s about the people in the community who have influenced our lives with data.

For a long time, I had no connection to any community… in my youth, I was a volunteer at the German Red Cross, and now I am on the road again with “you”.

Who influenced me in the last few years?

Actually, everything started with PASSCAMP 2013… ok, maybe a bit earlier, when I signed up for Twitter in mid-2012 and read a lot about the #sqlfamily.

Brent Ozar definitely deserves a mention: he has inspired me for years with his knowledge and blog posts, and continues to do so in my daily DBA life. His posts on the topic “Why should I run a blog?” are also among the reasons I keep mentioning. Unfortunately, I have not been able to meet him personally yet, but that is guaranteed to happen. His funny way of explaining complicated things is just great!

Then, in 2013, came my visit to the PASSCamp and thus my first direct contact with the German sqlfamily aka SQLPass. Here I can only list the usual suspects who are always present at such events => Oliver Engels, Tillmann Eitelberg, Kostja Klein, Niko Neugebauer and Andreas Wolter, just to name a few… I found the group dynamics awesome, and all that love between them… everyone knows each other, everyone laughs with each other, everyone talks to everyone, and no one is left behind!

I wanted to join in, somehow belong to them… but how?

Henning L. @hlhr_dev Jun 2 thanks to all the speakers and specially to @sql_williamd for this great event #SQLGrillen

So after this experience, I started to get more involved with PASS and its activities and found, among others, Cathrine Wilhelmsen, whose community activities also spurred me on and showed me how to start => more participation in local and national activities of PASS. Then came SQLGrillen, organized by William Durkin, with Andre Kamann’s session on “PoSh meets Ola Hallengren”, and another meeting with Andre Essing, which altogether motivated me to jump over my own shadow and become a speaker within PASS. During the following two SQLSaturdays (Rheinland and Munich 2016) I was on the road for the first time as a volunteer and got a taste of the national activities of PASS Germany. As luck would have it, the SQLSaturday in Munich was the next step in my “community career” and brought me the Azure Meetup Hamburg.

In 2017, I gave my first public talks: some in-house, some at the PASS user group Hamburg, one at SQLGrillen 2017 (thanks to William) and two at SQLSaturday Rheinland 2017 (thanks to Olli, Tillmann and Kostja)… all this, in conjunction with my blog and Twitter activities, helped me to be awarded the MVP Award by Microsoft.

I would like to say thank you to following people of the German #sqlfamily:

Oliver Engels
Tillmann Eitelberg
Kostja Klein
Andre Essing
William Durkin
Andre Kamann

A special thank you goes to Gabi Münster for her support during my first public appearance; in the end, it’s all about a “kick in the ass” to jump over your own shadow. Of course, many conversations and Twitter contacts with numerous other community members (Chrissy, Claudio, Rob, Dirk, Volker and, above all, Conny!) helped me to feel like a member of the (at least German) SQL family! THANK YOU SO MUCH! Further goals are planned for 2018 😉

A very special thank you goes to my team lead Thorsten Moeller, who supports me again and again in all my activities, and an even bigger “thank you” goes to my wife, who also supports these activities and always has my back!

Björn works as a database administrator and Head of Competence for MS SQL and MySQL in Hamburg (Germany). He regularly participates in the PASS regional group meetings and PASS events such as SQLSaturday and SQLGrillen, and he organizes the Azure Meetup group in Hamburg. Besides topics like SQL Server, PowerShell and Azure, he is interested in science fiction, snowboarding, baking and cycling.

TempDB-Performance-Tuning out of the box – T-SQL Tuesday #87

T-SQL Tuesday is a recurring blog series which was launched by Adam Machanic (b | t). Every month a blogger hosts a topic around SQL Server, and anyone can write a blog post on this particular topic.

This month, Matt Gordon (b | t) is our host, and the topic is new features in SQL Server (starting with the 2014 version) that fix old issues; see more details in his announcement post.

My topic for this TSQL2sDay is the new option to configure TempDB as part of the SQL Server installation.

Much easier implementation of the best practices for TempDB

Previously, after you had finished a SQL Server installation, you had to implement the best practice recommendations from Microsoft in a more or less complicated way. Either you approached the topic conventionally and adjusted the values manually via SQL Server Management Studio, or you made it a little easier for yourself and automated it with a T-SQL or PowerShell script. But still, you had to apply these configuration adjustments after every installation to achieve optimized TempDB performance.
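
To illustrate that scripted approach, here is a minimal T-SQL sketch of such a post-installation adjustment. The file count, sizes and the T:\TempDB path are my assumptions (a 4-core instance with a dedicated TempDB drive), not values to copy blindly:

-- Minimal post-installation TempDB sketch (assumed: 4 cores, dedicated T:\TempDB drive)
-- Resize the existing first data file and the log file to a fixed initial size
ALTER DATABASE [tempdb] MODIFY FILE (NAME = N'tempdev', SIZE = 1024MB, FILEGROWTH = 10%);
ALTER DATABASE [tempdb] MODIFY FILE (NAME = N'templog', SIZE = 512MB, FILEGROWTH = 10%);
-- Add one data file per additional core, all with identical size and growth settings
ALTER DATABASE [tempdb] ADD FILE (NAME = N'tempdev2', FILENAME = N'T:\TempDB\tempdev2.ndf', SIZE = 1024MB, FILEGROWTH = 10%);
ALTER DATABASE [tempdb] ADD FILE (NAME = N'tempdev3', FILENAME = N'T:\TempDB\tempdev3.ndf', SIZE = 1024MB, FILEGROWTH = 10%);
ALTER DATABASE [tempdb] ADD FILE (NAME = N'tempdev4', FILENAME = N'T:\TempDB\tempdev4.ndf', SIZE = 1024MB, FILEGROWTH = 10%);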

Essentially, TempDB performance starts with the hardware configuration, in this case the physical separation between user databases and TempDB by moving TempDB onto high-performance storage. Microsoft recommends the strict separation of user databases and TempDB, not only through separate folder structures or separate partitions, but on separate (fast) disks. There is no need for a big RAID configuration; a RAID 1 or RAID 10 would be great but is not a real necessity, because TempDB is always re-created when SQL Server restarts, so it would not be a drama if it became corrupt or the disks below it crashed. (Ok, here I am ignoring the possible loss of data and time.)

Overview of Best Practice recommendations

Not many adjustments are necessary to get better TempDB performance; let me briefly summarize them:

  • Physical separation of user databases and TempDB
  • Splitting the TempDB into individual data files according to the number of processors
  • Autogrowth parameter adjustments
  • Trace flag depending on usage / requirement

Once this physical optimization has been implemented on the storage side, you can make the logical adjustments in the configuration. Depending on the processor/core count, Microsoft also recommends splitting the data into several separate files instead of one large data file, in order to optimize the performance of the IO streams (parallelization). Likewise, depending on the application or requirements, the data files can initially be set to a fixed size so that the internal structure of the data files is optimized and all files are the same size (similar to trace flags 1117/1118). Otherwise, the autogrowth parameter should be set to 10% so that the data files, in conjunction with trace flags 1117 and 1118, contribute to an optimized, uniform structure of TempDB.
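
For the trace flag part of these recommendations, a short sketch (note: the flags are only needed up to SQL Server 2014; starting with SQL Server 2016 this allocation behavior is built into TempDB by default):

-- Enable even file growth (1117) and uniform-extent allocation (1118) globally
DBCC TRACEON (1117, -1);
DBCC TRACEON (1118, -1);
-- To make this survive a restart, add -T1117 and -T1118 as SQL Server startup parameters instead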

SQLServer2016 - Configure TempDB during Installation

SQL Server 2016 brings integrated simplifications

SQL Server 2016 now brings this whole logical optimization right into the installation process… you can now tell SQL Server, within the installation steps, how to set the number and distribution of the TempDB data files and their autogrowth parameters. Microsoft has finally integrated its long-published best practice recommendations into the installation process. So you only have to provide the hardware prerequisites for TempDB and can start installing.
Ok, here too you have to think about which values you want to set, but at least you do not have to worry about these configurations afterward!

THANKS to Microsoft for this improvement.

During the installation process, you can set how many data files you want to create, their initial size and the autogrowth parameter. Theoretically, you are even able to specify a separate path for each individual data file (divided by the number of cores) or even a separate (SSD) disk. Everything is displayed on a clear screen and can be used even by an inexperienced DBA.
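
If you want to verify afterward what the setup actually created, a quick metadata query is enough; a small sketch:

-- Check number, size and growth settings of the TempDB files
SELECT name,
       physical_name,
       size / 128 AS size_mb,   -- size is stored in 8-KB pages
       CASE is_percent_growth
            WHEN 1 THEN CAST(growth AS varchar(10)) + ' %'
            ELSE CAST(growth / 128 AS varchar(10)) + ' MB'
       END AS growth_setting
FROM tempdb.sys.database_files;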

Thanks to this new “toy”, some old “problems” can be reduced… 😉


Become a Speaker at a PASS Event or other Meetups – share your knowledge

This month Andy Yun is our TSQL2sDay #84 host, and the subject is “Growing New Speakers”. He wants us to talk about presenting and speaking at any SQL Server, PASS event or other meetup, and how to prepare, submit and speak at these events – sharing knowledge with others and getting in touch with other speakers and the community aka SQLFamily.
T-SQL Tuesday is a recurring blog party started by Adam Machanic (b|t). Each month a blogger hosts the party, and everyone who wants to can write a blog post about a specific subject.

Overcome yourself and submit a session abstract to your preferred event

My own story: although I have spoken only once at a public event (our regional PASS group), I have spoken more often at my company… and I am planning to speak much more often in 2017 😉
But I can still remember my first time… It was hard for me to choose the right topic, then write/create a nicely designed but not too colorful technical presentation and distill an interesting abstract from it…

The biggest effort was the decision itself to give a presentation in front of an “unknown” group of people (to become a speaker), but I have not regretted it so far. In 2015 I made a general decision about my upcoming career: I want to join more of our PASS meetings, attend more PASS events and become a part of the German SQLFamily.

SQL PASS – The community around SQL Server – Logo

So I attended my first SQLSaturday in Sankt Augustin this year and enjoyed it so much that I decided to (try to) visit all the other PASS events in Germany this year (and in the coming years 😉 ). So I went to SQLGrillen – an event of the regional PASS user group Emsland in Lingen – where I also met several German speakers and had some barbecue and beer with them.
After convincing my employer, I also attended the new SQLSaturday #555 in Munich as a volunteer and got in touch with even more people from the European PASS family. All these great experiences were so much fun and convinced me that it is not at all bad to be a speaker at such an event.
So I want to become a speaker next year too!

Find the right topic and prepare your presentation

But what should you speak about? What might be interesting enough for the audience? What are the current topics in the community?
These are the hardest questions before starting your speaker career, but do not make it too hard on yourself. As Brent Ozar wrote in one of his blog posts… it doesn’t matter if anyone else has spoken about your preferred topic before; it is more important that you speak about it. You might tell it in a different way, or to a different or Level-100 audience, so it might be easier for you to gain speaker experience and keep your self-confidence.

Become a Speaker like Brent Ozar at SQLSaturday in Portugal

After you have found your topic, I recommend preparing your presentation as a draft and then showing it to your wife or your friends for a first review. After this first review-and-edit phase, present your session in front of colleagues you know and listen carefully to their questions and remarks. That will help you very much to identify the most important missing information in your presentation.

My summary of a good presentation

  • use PowerPoint (or similar software)
  • build your own “corporate design” and make sure you use this design on all slides
  • use few colors and only one font (in different sizes)
  • do not use too many graphics
  • don’t go too deep into written details (just a header and some words, not a complete description)
  • don’t use too many animations/transitions

Submit your topic and abstract to your first “Call for Speakers” event

So now you’re prepared for your first event… Look for upcoming events in your region, like PASS events or meetup groups, and submit your topic and abstract to the organizer or via the website. For bigger events there is usually a “Call for Speakers” period and a closing date by which you have to submit your session.

After you have submitted your session, there is only one thing to do… wait 😉 After the closing date, the organizers will announce the schedule and contact all speakers for the upcoming event.

You will see: presenting at a SQL Server community event is a great experience and a lot of fun.
The first time will surely be very exciting, but you’ll see that no one “bites” and everyone is very pleased that you’re sharing your knowledge.

You’ll grow as a speaker with each session and become known in the SQLFamily. At every event you’ll be able to talk to other speakers and share your experiences with them, and they’ll share theirs with you. I think sharing knowledge and presenting at every event – no matter the topic, event or size – will teach you even more.

 

“Share your knowledge. It is a way to achieve immortality.”
― Dalai Lama XIV


SQL Server 2016 – TSQL2sDay – Backup to Azure Cloud

T-SQL Tuesday #82 – To the cloud… And beyond!!!

T-SQL Tuesday is a recurring blog party started by Adam Machanic (b | t). Each month a blogger hosts the party, and everyone who wants to can write a blog post about a specific subject.

This month Jeffrey Verheul is our TSQL2sDay host and the subject is “To the cloud… And beyond!!!”.

My post on “to the cloud… and beyond!!!” will be about how to use the Azure cloud as backup storage and how to implement this with SQL Server 2016.

However, I won’t write about the general need for SQL Server backups, why backups are so important, or how to create them. I would rather explain the various ways SQL Server can back up into the Azure cloud and how PowerShell might help us.

There is one important point I would briefly like to mention, because you might not always think about it when creating a solution: working backups are absolutely necessary and very important, and there are several ways to create secure backups and have them available when you need them. Normally, the fastest and easiest solution is a local “backup to disk”, but what about those backups if that disk fails? Better (or more failsafe) is a “backup to share”, better known as backup to a network device (where the network share should, of course, be configured as a RAID). But we all know small companies that are not able to host large file servers where the last ~5 days of SQL Server backups can be stored (additionally!). What could be easier than storing these backups on other (“cheaper”) storage like an Azure Storage Account?

Options for combining SQL Server with Azure

Since SQL Server 2012, Microsoft has provided the option of “SQL Server Backup to URL”, which creates the various kinds of SQL Server backups as usual. But this way the backup is not located on a local disk or local file share; it is pumped to a URL within the Azure cloud. This feature was optimized in SQL Server 2016 with additional options (block blobs, Shared Access Signatures and striping) to create faster and more reliable backups in the cloud.
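
For the classic “Backup to URL” (SQL Server 2012/2014 style) this looks roughly like the following sketch; the storage account name, container and access key are placeholders you have to replace:

-- Backup to URL with a storage-account-key credential (SQL Server 2012/2014 style)
CREATE CREDENTIAL [AzureBackupCred]
WITH IDENTITY = 'mystorageaccount',       -- placeholder: name of your storage account
     SECRET = '<storage access key>';     -- placeholder: primary or secondary access key

BACKUP DATABASE [AdventureWorks2014]
TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2014.bak'
WITH CREDENTIAL = 'AzureBackupCred';

With SQL Server 2016 and a Shared Access Signature, the WITH CREDENTIAL clause is dropped, as shown later in this post.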

With SQL Server 2014, Microsoft introduced “SQL Server Managed Backup to Microsoft Azure”. This feature offers the DBA simplified and faster operation, because SQL Server decides itself when to create a backup. The DBA does not have to worry about the backup strategy and backup scripts. Of course, the options allow intervention by the DBA, but it is probably even simpler with the onboard tools of SQL Server. For SQL Server running in Azure, this feature is the recommended approach.
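
Enabling it is a single procedure call; a hedged sketch using the SQL Server 2016 procedure (SQL Server 2014 used the older msdb.smart_admin procedures instead), with placeholder values:

-- Enable Managed Backup for a single database (SQL Server 2016 syntax, values are placeholders)
EXEC msdb.managed_backup.sp_backup_config_basic
     @database_name = 'AdventureWorks2014',
     @enable_backup = 1,
     @container_url = 'https://mystorageaccount.blob.core.windows.net/backups',
     @retention_days = 7;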

New in SQL Server 2016 is the feature “File-Snapshot Backups for Database Files in Azure”. It allows a SQL Server 2016 instance whose database files are stored in Azure to create a very, very fast backup of the data files and even perform a quick restore.
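
If the database files already live in Azure Blob Storage, the backup statement only needs the additional FILE_SNAPSHOT option; a minimal sketch with placeholder URLs:

-- File-snapshot backup: near-instant, requires the database files themselves to be stored in Azure blobs
BACKUP DATABASE [AdventureWorks2014]
TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2014_snap.bak'
WITH FILE_SNAPSHOT;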

SQL Server Backup and Restore with Azure Blob Storage

Backup to Azure Blob Storage

What do we need for deployment and connectivity of our SQL Servers to an Azure Blob storage?

Since PowerShell became more and more important with SQL Server 2012 and 2014, I would like to contribute a little guide here. First, we start with the installation of the Azure plugin for PowerShell. Instructions and a download link can be found here => https://azure.microsoft.com/de-de/documentation/articles/powershell-install-configure/
Of course, you’ll need an account for the Azure cloud (there is a trial with €170 credit), which you might get, for example, as part of Visual Studio licensing, MPN, or as a BizTalk customer.

Create an Access Policy and a Shared Access Signature

#
# This script uses the Azure Resource model and creates a new ARM storage account.
# Modify this script to use an existing ARM or classic storage account
# using the instructions in comments within this script
#
# Define global variables for the script
$prefixName = '' # used as the prefix for the name for various objects
$subscriptionName='' # the name of subscription name you will use
$locationName = '' # the data center region you will use
$storageAccountName= $prefixName + 'storage' # the storage account name you will create or use
$containerName= $prefixName + 'container' # the storage container name to which you will attach the SAS policy with its SAS token
$policyName = $prefixName + 'policy' # the name of the SAS policy

#
# Using Azure Resource Manager deployment model
# Comment out this entire section and use the classic storage account name to use an existing classic storage account
#

# Set a variable for the name of the resource group you will create or use
$resourceGroupName=$prefixName + 'rg'

# adds an authenticated Azure account for use in the session
Login-AzureRmAccount

# set the tenant, subscription and environment for use in the rest of
Set-AzureRmContext -SubscriptionName $subscriptionName

# create a new resource group - comment out this line to use an existing resource group
New-AzureRmResourceGroup -Name $resourceGroupName -Location $locationName

# Create a new ARM storage account - comment out this line to use an existing ARM storage account
New-AzureRmStorageAccount -Name $storageAccountName -ResourceGroupName $resourceGroupName -Type Standard_RAGRS -Location $locationName

# Get the access keys for the ARM storage account
$accountKeys = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName

# Create a new storage account context using an ARM storage account
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value

#
# Using the Classic deployment model
# Use the following four lines to use an existing classic storage account
#
#Classic storage account name
#Add-AzureAccount
#Select-AzureSubscription -SubscriptionName $subscriptionName #provide an existing classic storage account
#$accountKeys = Get-AzureStorageKey -StorageAccountName $storageAccountName
#$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys.Primary

# The remainder of this script works with either the ARM or classic sections of code above

# Creates a new container in blob storage
$container = New-AzureStorageContainer -Context $storageContext -Name $containerName
$cbc = $container.CloudBlobContainer

# Sets up a Stored Access Policy and a Shared Access Signature for the new container
$permissions = $cbc.GetPermissions();
$policyName = $policyName
$policy = new-object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$policy.SharedAccessStartTime = $(Get-Date).ToUniversalTime().AddMinutes(-5)
$policy.SharedAccessExpiryTime = $(Get-Date).ToUniversalTime().AddYears(10)
$policy.Permissions = "Read,Write,List,Delete"
$permissions.SharedAccessPolicies.Add($policyName, $policy)
$cbc.SetPermissions($permissions);

# Gets the Shared Access Signature for the policy
$policy = new-object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$sas = $cbc.GetSharedAccessSignature($policy, $policyName)
Write-Host 'Shared Access Signature= '$($sas.Substring(1))''

# Outputs the Transact SQL to the clipboard and to the screen to create the credential using the Shared Access Signature
Write-Host 'Credential T-SQL'
$tSql = "CREATE CREDENTIAL [{0}] WITH IDENTITY='Shared Access Signature', SECRET='{1}'" -f $cbc.Uri,$sas.Substring(1)
$tSql | clip
Write-Host $tSql

Source: https://msdn.microsoft.com/en-us/library/dn466438.aspx

To use this script, we first have to fill in the information defined at the top of the script (prefix, subscription name, location). The subscription name, for example, can be listed with:

Get-AzureSubscription | Format-Table

“North Europe” is the best location for Germany; you’ll have to choose the country/datacenter that fits you best.

But to be able to select “North Europe”, we first have to log into our Azure account, where we can display a list of the available locations:

Login-AzureRmAccount
Get-AzureRmLocation | Format-Table Location, DisplayName

Location       DisplayName     
--------       -----------     
[...]
southcentralus South Central US
northeurope    North Europe    
westeurope     West Europe     
japanwest      Japan West      
[...]

With these values inserted, executing the script should result in the following message:

Shared Access Signature=  sv=2015-04-05&sr=c&si=backupdemopolicy&sig=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx 
Credential T-SQL
CREATE CREDENTIAL [https://backupdemostorage.blob.core.windows.net/backupdemocontainer] WITH IDENTITY='Shared Access Signature', SECRET='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

This T-SQL statement can then easily be executed on the SQL Server that is to be backed up. With

SELECT * from sys.credentials

you can check whether your new credential was created successfully – in case you don’t trust the message in SQL Server Management Studio. 😉

Now we are able to create our first backup directly into Azure Storage. For this purpose, just run a backup-to-URL (replace all values appropriately):

BACKUP DATABASE AdventureWorks2014 TO URL = 'https://<storageaccount>.blob.core.windows.net/<container>/AdventureWorks2014_onprem.bak'

I was very impressed by the speed of taking a SQL Server backup this way. Of course, the backup speed depends mostly on the network infrastructure and the internet connection – e.g. whether it is a DSL line or a leased line (in my case a DSL line with 100 Mbit down / 50 Mbit up).
A ~43 MB transaction log backup took 12.7 seconds with my configuration.

BACKUP LOG successfully processed 13569 pages in 12.696 seconds (8.349 MB/sec).

To validate and compare some data, I also created a second transaction log backup directly to my local disk.

BACKUP LOG successfully processed 13569 pages in 1.746 seconds (60.652 MB/sec).

That is much faster, of course, but also involves a degree of uncertainty. In this post, I just wanted to give a short impression of the new backup features of SQL Server 2016. Each of you has to decide for yourself whether you want (and will be able) to use this cool feature, or whether your customers won’t let you implement this stuff.
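
And since a backup is only as good as its restore: restoring from the blob works the same way as from disk, as this minimal sketch shows (the SAS credential from above must exist on the target instance; URL values are placeholders again):

-- Restore the database directly from the Azure blob
RESTORE DATABASE [AdventureWorks2014]
FROM URL = 'https://<storageaccount>.blob.core.windows.net/<container>/AdventureWorks2014_onprem.bak'
WITH REPLACE;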

Edit:
There is a little mistake in Microsoft’s script for creating the blob storage…
Line 38 says:
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys.[0].Value

which should be

$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKeys[0].Value

My first attempt at a PowerShell-based implementation of a new SQL Server 2016 feature surprised me a little: it took me only 20 minutes to implement it and successfully run a backup.


TSQL2sDay #80 – my open point list for more SQL Server knowledge

For T-SQL Tuesday #80, Chris Yates (b|t) asks “Is there something you’ve been wanting to do for a long time in SQL, but haven’t done it yet?”.

Yes, of course 😉
There are so many topics I want to learn, talk and present about in the (near) future. Just to give you a quick look at my open point list (see the blog post picture) of SQL Server topics to learn and write about:
SQL Server 2016 is really awesome software that includes so many new and very exciting topics I want to get in touch with… ok, not all of my open points are new in SQL Server 2016 – some are from earlier versions but still very interesting – and I want to learn them and then share my new knowledge with my colleagues.

  • SQL Server 2014 / 2016 – ColumnStore
  • SQL Server 2016 – Encrypted / Always Encrypted
  • SQL Server 2016 – Temporal Tables
  • SQL Server 2016 – Analytics with R

Or just the new tools around SQL Server like Power BI (Desktop).

Actually, I’m working on my Azure skills… it’s an awful lot… I just finished an MVA course which opened up several new topics I had to read about and get deeper into.
Last month I visited SQLSaturday Rheinland #525 and learned new stuff from Andre Kamann (b|t) about Desired State Configuration (DSC), which was also very, very interesting and might be helpful for my daily business… So you can see (read), I’ll need more than 24 hours a day 😉

Also, Andreas Wolter spoke about a topic which could help me in the future at my job => the new SQL Server 2016 Query Store…

So, who can tell me where to start with all those hot topics? How should I tackle all this?

Finally, I would like to thank the community, which always shares its knowledge and new information with us.

THANK YOU !!!

TSQL Tuesday Logo

PS: It is the birthday of our current host Chris Yates… Happy birthday to you, I wish you all the best!
