Everyone who has ever installed a SQL Server has also set up a backup procedure (hopefully :))

But, besides backing up your valuable data, how many of you are actually testing what has been backed up? Do you have a disaster recovery procedure?

If you do – how often do you run it, and how much of an engineer's time does it take?

Recently I started working with a new team, where I found out that Operations have to go through a “database restore” procedure for each client at least two times per year – great 🙂 and a good thing to do. But it is all done manually, and it takes at least 2 hours per check 🙁.

PowerShell automation to the rescue.

Since we have two environments – Production and Acceptance – that are not connected in any way (when it comes to networking), I had to find a way of moving all those big backups from one to the other.

Since we have an Azure subscription, I decided to create the environment I will use for this in Azure. All I needed was one Service Principal (we want to be secure) and Blob storage.

I will not go through creating these two here, but I do need to tell you which roles you need to assign to the SPN (a quick sketch of the role assignment follows the list):

  • Storage Account Key Operator Service Role
  • Storage Blob Data Contributor
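
For reference, here is a minimal sketch of how those two role assignments could be done with the Az module – the scope and application ID below are placeholders, not values from my environment:

# Placeholders – replace with your subscription, resource group, storage account and SPN application ID
$scope = "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"

New-AzRoleAssignment -ApplicationId "<spn-application-id>" -RoleDefinitionName "Storage Account Key Operator Service Role" -Scope $scope
New-AzRoleAssignment -ApplicationId "<spn-application-id>" -RoleDefinitionName "Storage Blob Data Contributor" -Scope $scope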

Also, here are some prerequisites that the servers must have – this is what the script below relies on:

  • The Az PowerShell module (Connect-AzAccount and the Az.Storage cmdlets)
  • The CredentialManager module (Get-StoredCredential)
  • AzCopy, available on the PATH

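In case you need them, here is a minimal sketch of how you could install the modules and store the SPN secret – the credential target name and the SPN values are just examples:

# Install the PowerShell modules the script relies on (run from an elevated session)
Install-Module -Name Az -Scope AllUsers
Install-Module -Name CredentialManager -Scope AllUsers

# Store the SPN application ID and secret in Windows Credential Manager,
# so the script can pick them up later with Get-StoredCredential
New-StoredCredential -Target "NAME OF THE STORED CRED" -UserName "<spn-application-id>" -Password "<spn-secret>" -Persist LocalMachine

# AzCopy is a standalone executable – download it from Microsoft and make sure azcopy.exe is on the PATH
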
And finally, the script itself. It is well commented, but in case you need some help, just ask 😉

<#
.Synopsis
   Copies data to Blob storage
.DESCRIPTION
   It uses SPN to connect to Azure subscription and then moves local data to a target blob storage
.EXAMPLE
   Copy-Local2Blob
.NOTES
	Author:            Milos Katinski
    Twitter:           @MilosKatinski
    Email:             milos.katinski@outlook.com
    Blog:              http://tech-trainer.info/
#>
function Copy-Local2Blob {
    [CmdletBinding()]
    param 
    (
        # In this example the parameters are not mandatory because I hardcoded the default values, but you can always change that
        # Subscription ID
        [Parameter(Mandatory = $false,
            Position = 1,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [ValidateNotNullOrEmpty()]
        [Alias("Subscription")]
        [string]$subscriptionId = "YOUR SUBSCRIPTION ID",
        # Name of the resource group
        [Parameter(Mandatory = $false,
            Position = 2,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [ValidateNotNullOrEmpty()]
        [Alias("Name")]
        [string]$ResourceGroup = "NAME OF THE RG WHERE STORAGE ACC IS",
        # Name of the Storage Account
        [Parameter(Mandatory = $false,
            Position = 3,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [ValidateNotNullOrEmpty()]
        [Alias("Storage")]
        [string]$storageAccountName = "STORAGE ACCOUNT NAME",
        # Name of the Container
        [Parameter(Mandatory = $false,
            Position = 4,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [ValidateNotNullOrEmpty()]
        [Alias("Container")]
        [string]$storageContainerName = "CONTAINER NAME"
    )

    BEGIN {
        # Here I am using the CredentialManager module - listed in the server prerequisites
        $appCred = Get-StoredCredential -Target "NAME OF THE STORED CRED"
        Connect-AzAccount -ServicePrincipal -SubscriptionId $subscriptionId -Tenant "YOUR TENANT ID" -Credential $appCred
        
        # Number of unique databases that we have in backup location
        $DBcount = 2

        # Here you should enter your local environment data - where you keep backups
        $backupLocation = "D:\Backup"
        $tempbackup = "D:\Temp\backup"
        If (!(Test-Path $tempbackup)) {
            New-Item -ItemType Directory -Force -Path $tempbackup
        }

        # I like to log most of the things that my scripts do
        $logLocation = "C:\Scripts\Logs"
        If (!(Test-Path $logLocation)) {
            New-Item -ItemType Directory -Force -Path $logLocation
        }

        $LogFile = "C:\Scripts\Logs\BackupJob-$(Get-Date -Format "M-dd-yyyy").Log"

        # Here we are creating the SAS token so we can actually access the Blob storage
        $storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $ResourceGroup -AccountName $storageAccountName).Value[0]
 
        $destinationContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
 
        $containerSASURI = New-AzStorageContainerSASToken -Name $storageContainerName -Context $destinationContext -ExpiryTime (Get-Date).AddSeconds(3600) -FullUri -Permission rlw

    }

    PROCESS {
        # Filter the backup location for .bak files created in the last 24 hours that have a specific string in the name
        $statuslocal = @(Get-ChildItem $backupLocation -Recurse -Include "*.bak" | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) -and $_.Name -like 'INSERT NAME*' })
        if ($statuslocal.Count -lt $DBcount) {
            Write-Verbose "Backup issue found on $env:computername"
            $ErrorMessage = "SQL Backup files are not found on $env:computername. Please review SQL backup and log files."
            # Send-EmailSendGrid is our internal helper function - If you don't manage to find example online, send me a request and I will email it to you
            Send-EmailSendGrid -EmailTo "milos.katinski@outlook.com" -Subject "[Warning] SQL Backup" -Body $ErrorMessage
            Add-Content -Path $LogFile -Value "$(Get-Date) - One or more backup files not found in $backupLocation."
            Add-Content -Path $LogFile -Value "$(Get-Date) - *** Logging end ***"
            exit 0
        }

        else {
            foreach ($file in $statuslocal) {
                Copy-Item -Path $file.FullName -Destination $tempbackup
            }
            Write-Verbose "Moving local data to blob"
            azcopy copy $tempbackup $containerSASURI --recursive
            Add-Content -Path $LogFile -Value "$(Get-Date) - Backups are copied to Blob storage."
        }
    }

    END {
        # Cleanup
        Remove-Item -Path $tempbackup -Recurse -Force
        Add-Content -Path $LogFile -Value "$(Get-Date) - Temp Backup Storage is cleaned up."
        Add-Content -Path $LogFile -Value "$(get-date) - *** Logging end ***"
    }
    
} # Function end
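
And here is a quick sketch of how the function can be called – the script path and the parameter values below are placeholders, not my real ones:

# Dot-source the script that holds the function, then call it with your own values
. C:\Scripts\Copy-Local2Blob.ps1

Copy-Local2Blob -Subscription "<subscription-id>" `
                -Name "<resource-group-name>" `
                -Storage "<storage-account-name>" `
                -Container "<container-name>" `
                -Verbose

You can then wrap this call in a scheduled task, so the copy runs right after the nightly backup job finishes.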

In the next post, I will cover how to download this data to the Acceptance environment.

Stay tuned …

Cheers!
