Welcome back to the resolution of every admin’s nightmare – the primary site is down and the backup files are useless.

Today we will go through the short procedure of getting the data from Blob storage to a local drive.

The script is very similar to the previous one, with a few specifics that are well commented 😉

Don’t forget the prerequisites for the servers, which were mentioned in the previous post.
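
A quick reminder sketch of those prerequisites, assuming the same setup as last time: the Az PowerShell module, the CredentialManager module that provides Get-StoredCredential, the Send-EmailSendGrid helper from the previous post, and the standalone azcopy binary reachable from the script (the module names below are the usual public ones, adjust if your setup differs):

Install-Module -Name Az -Scope CurrentUser                 # Connect-AzAccount, Get-AzStorageAccountKey, Get-AzStorageBlob ...
Install-Module -Name CredentialManager -Scope CurrentUser  # Get-StoredCredential for the stored SPN secret
azcopy --version                                           # azcopy is a separate download and must be on the PATH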

<#
.Synopsis
   Copies data from Blob storage
.DESCRIPTION
   It uses SPN to connect to Azure subscription and then copies data from Blob to local storage
.EXAMPLE
   Copy-Blob2Local
.NOTES
	Author:            Milos Katinski
    Twitter:           @MilosKatinski
    Email:             milos.katinski@outlook.com
    Blog:              http://tech-trainer.info/
#>
function Copy-Blob2Local {
    [CmdletBinding()]
    param 
    (
        # Subscription ID
        [Parameter(Mandatory = $false,
            Position = 1,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [ValidateNotNullOrEmpty()]
        [Alias("Subscription")]
        [string]$subscriptionId = "YOUR SUBSCRIPTION ID",
        # Name of the resource group
        [Parameter(Mandatory = $false,
            Position = 2,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [ValidateNotNullOrEmpty()]
        [Alias("Name")]
        [string]$ResourceGroup = "NAME OF THE RG WHERE STORAGE ACC IS",
        # Name of the Storage Account
        [Parameter(Mandatory = $false,
            Position = 3,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [ValidateNotNullOrEmpty()]
        [Alias("Storage")]
        [string]$storageAccountName = "STORAGE ACCOUNT NAME",
        # Name of the Container
        [Parameter(Mandatory = $false,
            Position = 4,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [ValidateNotNullOrEmpty()]
        [Alias("Container")]
        [string]$storageContainerName = "CONTAINER NAME"
    )

    BEGIN {
        $TenantId = "YOUR TENANT ID"
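        # Get-StoredCredential (here assumed to come from the CredentialManager module) pulls the stored SPN secret from Windows Credential Manager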
        $appCred = Get-StoredCredential -Target "NAME OF THE STORED CRED"
        Connect-AzAccount -ServicePrincipal -SubscriptionId $subscriptionId -Tenant $TenantId -Credential $appCred

        # Number of databases that we should have in Blob
        $DBcount = 2

        $restoreLocation = "C:\Temp\DBs"

        # I like to log most of the things that my scripts do
        $logLocation = "C:\Scripts\Logs"
        if (!(Test-Path $logLocation)) {
            New-Item -ItemType Directory -Force -Path $logLocation | Out-Null
        }

        $LogFile = "$logLocation\BackupJob-$(Get-Date -Format 'M-dd-yyyy').Log"

        # Here we are creating the SAS token so we can actually access the Blob storage
        $storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $ResourceGroup -AccountName $storageAccountName).Value[0]
 
        $destinationContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
 
        $containerSASURI = New-AzStorageContainerSASToken -Name $storageContainerName -Context $destinationContext -ExpiryTime (Get-Date).AddSeconds(3600) -FullUri -Permission rlw
        # Little hack for filtering with azcopy - execute the lines and find the difference between $containerSASURI versions
        $containerSASURI = $containerSASURI.Replace("?sv", "/*.bak?sv")
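        # For illustration, the replace turns the container URI into a wildcard pattern (placeholder values, not a real token):
        #   Before: https://<account>.blob.core.windows.net/<container>?sv=...&sig=...
        #   After:  https://<account>.blob.core.windows.net/<container>/*.bak?sv=...&sig=...
        # so azcopy only downloads the .bak blobs instead of the whole container.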
    }

    PROCESS {
        $statuslocal = @(Get-AzStorageBlob -Container $storageContainerName -Context $destinationContext -Blob "*.bak" | Where-Object { $_.LastModified -ge $(Get-Date).AddDays(-1) })
        # Here I have implemented a waiting period for the data to become available in the Blob. The one-minute timeout is just an example
        $timeout = New-TimeSpan -Minutes 1
        $sw = [diagnostics.stopwatch]::StartNew()
        while ($sw.Elapsed -lt $timeout) {
            if ($statuslocal.Count -lt $DBcount) {
                Write-Verbose "Backup not found on $containerSASURI"
                $ErrorMessage = "SQL Backup files are not found on $containerSASURI. Please review SQL backup and log files."
                Add-Content -Path $LogFile -Value "$(Get-Date) - One or more backup files not found in container $storageContainerName."
                Start-Sleep -Seconds 10
                $statuslocal = @(Get-AzStorageBlob -Container $storageContainerName -Context $destinationContext -Blob "*.bak" | Where-Object { $_.LastModified -ge $(Get-Date).AddDays(-1) })
            }
            else {
                # All expected backup files are in the container - stop waiting
                break
            }
        }
        if ($statuslocal.Count -lt $DBcount) {
            Send-EmailSendGrid -EmailTo "milos.katinski@outlook.com" -Subject "[Warning] SQL Backup" -Body $ErrorMessage
            Add-Content -Path $LogFile -Value "$(Get-Date) - *** Logging end ***"
        }
        else {
            Write-Verbose "Moving blob data to local"
            $cp2l = azcopy copy $containerSASURI $restoreLocation --recursive
        }

    }

    END {
        # Cleanup - remove the blobs only if the azcopy download actually ran and succeeded
        if ($cp2l -and $LASTEXITCODE -eq 0) {
            Get-AzStorageBlob -Container $storageContainerName -Context $destinationContext | Remove-AzStorageBlob
        }
    }
    
} # Function end
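
For completeness, calling the function with explicit values looks something like this (the parameter aliases come from the param block above, and all of the values here are just placeholders for your own IDs and names):

Copy-Blob2Local -Subscription "00000000-0000-0000-0000-000000000000" `
    -Name "rg-sql-backups" `
    -Storage "sqlbackupstorage01" `
    -Container "sqlbackups" `
    -Verbose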

In the next post, I will show you how I automated the SQL restore and the database checks.

Stay tuned …

Cheers!
