TSQL2sday #94 – daily database copy using Powershell – dbatools
T-SQL Tuesday is a recurring blog party started by Adam Machanic (b | t). Each month a blogger hosts the party, and everyone who wants to can write a post about a specific subject.
This month Rob Sewell is our TSQL2sDay host, and his subject is “Let’s get all Posh!”.
As I wrote in a former post, I was inspired by Andre Kamann to start using PowerShell to manage our SQL Server environments, and since this year I’m a major contributor to dbatools – a multifunctional PowerShell module for DBAs.
I use the functions from the dbatools module more and more every day, and I try to write about some of those tasks here on my blog – like this post 😉
So one of our service managers asked me to write a job that refreshes the test environment every day – only 3 databases (out of 12). So I wrote a PowerShell script which copies those databases from the production to the test environment.
First, we start with the requirements:
- Currently approx. 280 GB
- Copy-only backups
- Database copy created by an automated job
- Runs every morning at 4:00
My first thought was to create a SQL Server Agent job with the following steps:
- check that the shared destination folder is available (see the sketch after this list)
- delete/clear the destination folder content
- shrink all transaction log files
- back up all databases from a given list
- restore each backup file from that folder
- check for orphaned users
- delete/clear the destination folder content
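The folder check and the folder cleanup never show up in the snippets further down, so here is a minimal sketch of how those two steps could look – the share path matches the variable defined later in the post, everything else is an assumption:
#Sketch only – not the original job code: check the share and clear old backups
$Network_Transfer_Folder = '\\DestinationShare\Backup'
if (-not (Test-Path -Path $Network_Transfer_Folder)) {
    throw "Destination share $Network_Transfer_Folder is not reachable."
}
#Remove leftover backup files from the previous run
Get-ChildItem -Path $Network_Transfer_Folder -Filter *.bak -File | Remove-Item -Force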
A year or two ago I would have built this with a lot of normal T-SQL Agent jobs; now I’m doing it with dbatools, which makes it very easy and fast (and in one step).
I build such scripts in a very simple way, in order to make it easy to understand what a script is doing… so I’m not using any complex one-liners 😉
This time I need a job log and dbatools, so I start by importing those functionalities:
. E:\SQL_Admin_Skripte\Function-Write-Log.ps1
$Network_Transfer_Folder = '\\DestinationShare\Backup'
$Local_Transfer_Folder = 'E:\BackupPath\'
if (-not (Get-Module -Name dbatools)) {
Import-Module E:\SQL_Admin_Skripte\dbatools-master\dbatools.psd1
}
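The dot-sourced Write-Log function doesn’t appear again in the snippets below; just to give an idea of how the job log gets written, a call could look roughly like this (the log path and message are assumptions, and the parameter names follow Jason Wasser’s gallery script):
#Sketch only – log path and message are assumptions, not part of the original script
$LogFile = 'E:\SQL_Admin_Skripte\Logs\Daily_DB_Copy.log'
Write-Log -Message 'Starting daily database copy' -Path $LogFile -Level Info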
Claudio Silva (b | t) helped me a little with the following “SHRINK” command, which in a first stage was a plain combination of PowerShell’s “Invoke-Sqlcmd” and T-SQL; now it is a dbatools function called “Invoke-DbaDatabaseShrink”, which is a little bit tricky if you only want to shrink log files… but it works.
#Shrink TLogs
Invoke-DbaDatabaseShrink -SqlInstance Src-InstanceName -Database DB1,DB2,DB3 -LogsOnly -ShrinkMethod TruncateOnly
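For comparison, that first-stage Invoke-Sqlcmd variant could have looked roughly like this – a sketch with assumed logical log file names, not Claudio’s actual code:
#Sketch of the earlier Invoke-Sqlcmd approach (the *_log file names are assumptions)
foreach ($db in 'DB1','DB2','DB3') {
    Invoke-Sqlcmd -ServerInstance 'Src-InstanceName' -Database $db -Query "DBCC SHRINKFILE (N'$($db)_log', TRUNCATEONLY);"
}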
Now I just had to take a backup of those databases and restore the backups on the destination instance… no real magic 😉
#Backup named databases
Backup-DbaDatabase -SqlInstance Src-InstanceName -Databases DB1,DB2,DB3 -Type Full -FileCount 32 -CopyOnly -BackupDirectory $Network_Transfer_Folder
#Restore all databases in given folder
Restore-DbaDatabase -SqlServer Dest-InstanceName -Path $Local_Transfer_Folder -WithReplace -UseDestinationDefaultDirectories
Last but not least… I have to check for orphaned users and clean everything up…
#Repair orphaned users
Repair-SqlOrphanUser -SqlServer Dest-InstanceName
#Cleanup after Restoring
Get-ChildItem -Path $Local_Transfer_Folder -Include *.bak -File -Recurse | foreach { $_.Delete() }
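If you’d rather preview the cleanup before letting it loose, the same pipeline works with Remove-Item and its -WhatIf switch – an alternative sketch, not the line from the original job:
#Alternative cleanup sketch: show what would be deleted, then drop -WhatIf when you're happy
Get-ChildItem -Path $Local_Transfer_Folder -Include *.bak -File -Recurse | Remove-Item -WhatIf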
Now I’m triggering the script every day with a SQL Server Agent job, which makes it even easier for me as a DBA. (But be careful – dbatools only runs with PowerShell version 3 or higher, which means SQL Server 2014 or later if you use a PowerShell job step.)
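If your instance is older than that, a CmdExec job step that simply calls powershell.exe is one way around the Agent’s built-in PowerShell version – the script path and name below are assumptions, not the actual file from this job:
#Hypothetical CmdExec job step command – the script path and name are assumptions
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "E:\SQL_Admin_Skripte\Daily_DB_Copy.ps1"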
The job runs for about 12 minutes, including importing the dbatools module, backing up the three databases (~280 GB) and restoring them on the test server, which I think is a good runtime!
At the end, I have more time to read books, tweets or other blog posts 😉
My former blog post about another database copy job can be found here: Copy Database with Rename using dbatools
Special Thanks to Jason Wasser @wasserja for his great logging function!
https://gallery.technet.microsoft.com/scriptcenter/Write-Log-PowerShell-999c32d0
and to Derik Hammer for his list of PowerShell versions in SQL Server and for showing how to run PowerShell in Agent job steps…
https://www.sqlhammer.com/running-powershell-in-a-sql-agent-job/
Björn continues to work from Mexico as a Senior Consultant – Microsoft Data Platform and Cloud – for Kramer&Crew in Cologne. He also stays committed to the community from his new home, getting involved at Data Saturdays and in various forums. Besides everything around SQL Server, PowerShell and Azure SQL, he is interested in science fiction, baking 😉 and cycling.
Do you realize you’re backing up to $Network_Transfer_Folder, but restoring (and cleaning up) from $Local_Transfer_Folder?
Yes, of course. Both folders point to the same location… on the destination server it’s a local folder which is shared. I did it that way due to authentication requirements.