Welcome to the final step of your future automatization 🙂 Quick reminder what we have done previously.

This one forced me to learn new stuff – writing SQL queries and grabbing the results. It was fun to do 🙂

So the general idea was to create a script that will connect to the SQL server, restore the database(s) and execute some checks that we use. Easy, right? 🙂

Here comes the script.

/**
function Test-SQLRestore {
    <#
    .Synopsis
       Restores Database to a SQL Server.
    .DESCRIPTION
       Reads from restore location and creates new DBs under specific names. It then executes test queries.
    .EXAMPLE
       Test-SQLRestore -restoreLocation $FolderPath
    .NOTES
        It takes $SqlQueryPath from New-SqlQuery.ps1 function that it calls during process.
    .NOTES
	Author:            Milos Katinski
    Twitter:           @MilosKatinski
    Email:             milos.katinski@outlook.com
    Blog:              http://tech-trainer.info/
    #>
    [CmdletBinding()]
    param (
        # Restore folder location
        [Parameter(Mandatory = $False,
            Position = 1,
            ValueFromPipeline = $true)]
        [ValidateNotNullOrEmpty()]
        [string]$restoreLocation,
        # Query path
        [Parameter(Mandatory = $False,
            Position = 2,
            ValueFromPipeline = $true)]
        [ValidateNotNullOrEmpty()]
        [string]$SqlQueryPath
    )
    
    begin {
        # We need this module in orther to execute some SQl commands
        Install-Module SQLSERVER -AllowClobber -Force

        # Again, I like to log what the script does :)
        $logLocation = "C:\Scripts\Logs"
        If (!(test-path $logLocation)) {
            New-Item -ItemType Directory -Force -Path $logLocation
        }
        $LogFile = "C:\Scripts\Logs\TestBRJob-$(Get-Date -Format "M-dd-yyyy").Log"

        # The location where we copied our data from the Blob
        $restoreLocation = "C:\Temp\DBs"
        If (!(test-path $restoreLocation)) {
            Add-Content -Path $LogFile -Value "$(get-date) - There is no restore point. Check Blob2Local process."
            $ErrorMessage = "Please review SQL restore process for $dbName."
            Send-EmailSendGrid -EmailTo "milos.katinski@outlook.com" -Subject "[Warning] OA SQL Restore procedure" -Body $ErrorMessage
            exit
        }

    }
    
    process {
        # Here we are finding if we have more than one SQl instance installed
        $SQLInstances = Invoke-Command -ComputerName $env:COMPUTERNAME {
            (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server').InstalledInstances
        }
        
        $BackupFiles = (Get-ChildItem $restoreLocation\* -Recurse -Include *.bak)

        foreach ($file in $BackupFiles.FullName) {
            foreach ($inst in $SQLInstances) {
                $name = (($file.split('\'))[-1]).split('.')[0]
                # New-SqlQuery is a helper function that will generate wanted queries
                New-SqlQuery -backupfile $file -QueryName "$name"
                $path = "$Script:SqlQueryPath.sql"
                Invoke-Sqlcmd -InputFile $path
            }
        }
        #Get all restored DBs on server and run check
        [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
        $s = New-Object ('Microsoft.SqlServer.Management.Smo.Server') "LOCALHOST"
        # Here I had to filter DBs by name, but maybe you don't have to
        $dbs = $s.Databases | Where-Object { $_.name -like '*NAME PATERN' }
        foreach ($db in $dbs) {
            try {
                $dbName = $db.name
                $resultfile = "C:\Scripts\Logs\$dbName-$(Get-Date -Format "M-dd-yyyy").Log"
                Write-Verbose "Processing $db"

                # This is the actual DB check - "DBCC checkdb" command
                $query = -join("USE", ' "' ,$dbName,'"',"; DBCC checkdb")

                # Important part - we are writing results of a query to a file
                Invoke-Sqlcmd -Query $query -Verbose 4>$resultfile
                # If all is OK with DB you should find this line in report ;)
                $patern = "CHECKDB found 0 allocation errors and 0 consistency errors in database"
                $paterncheck = Select-String -Path $resultfile -Pattern $patern
                if (!$paterncheck) {
                    Write-Output "Error with database restore for $dbName"
                    $ErrorMessage = "SQL DB restore has issues. Please review SQL restore process for $dbName."
                    Send-EmailSendGrid -EmailTo "milos.katinski@outlook.com" -Subject "[Warning] OA SQL Restore procedure" -Body $ErrorMessage
                }
                else { 
                    Add-Content -Path $LogFile -Value "$(get-date) - Database $db is OK."
                    Add-Content -Path $LogFile -Value "$(get-date) - *** Logging end ***" 
                    }
            }

            catch {
                #create an error message
                $errormsg = "Database $db check failed. $($_.Exception.Message)"
                $errorlog = Join-Path $logLocation 'errors.txt'
                Write-Error $errormsg
                Write-Verbose "Logging errors to $errorlog"
                "$(Get-Date): $errormsg" | Out-File -FilePath $Errorlog -Append
            }

            
        }
    }

    end {
        # Cleanup
        Remove-Item -Path "C:\Temp\" -Recurse -Force
        foreach ($db in $dbs) {
            Invoke-Sqlcmd -Query "DROP DATABASE $db;"
        }
    }
} #Function end
 */

Since I’m not using any SQL credentials in the script, the trick is to start it within the account that has R/W access to Server.

And here is the helper function – the one that is actually very important.

It creates the query for each Database that we are going to test. If you are not familiar with SQL query language, this will be all very confusing, but try to go line by line. Also, fill free to ask anything.

/**
function New-SqlQuery {
    <#
    .Synopsis
       Creates new SQL query file.
    .DESCRIPTION
       Creates SQL query file based on template and returns query path.
    .EXAMPLE
       $file = "'C:\Data4Import\MK-SQL-01_Test_FULL_20190912_120655.bak'"
       New-SqlQuery -backupfile $file -QueryName "$name"
    .NOTES
       Used as a helper function.
    .NOTES
	Author:            Milos Katinski
    Twitter:           @MilosKatinski
    Email:             milos.katinski@outlook.com
    Blog:              http://tech-trainer.info/
    #>
    [CmdletBinding()]
    param (
        # Restore folder location
        [Parameter(Mandatory = $True,
            Position = 1,
            ValueFromPipeline = $true)]
        [ValidateNotNullOrEmpty()]
        [string]$backupfile,
        # Query name
        [Parameter(Mandatory = $True,
            Position = 2,
            ValueFromPipeline = $true)]
        [ValidateNotNullOrEmpty()]
        [string]$QueryName)
    
    begin {
        # This is the location where we are going to store our new queries
        $location = "C:\Temp\Query"
                If (!(test-path $location)) {
                    New-Item -ItemType Directory -Force -Path $location
                }
    }
    
    process {
        <# This Table is created so I could pull some data from it. After that we are grabing the backup file and pulling the DB name out of it. 
        Try to figure out the rest of the query. If it gets hard, fill free to ask.
        Important notice - CONCAT is introduced in SQL 2012; DBPath depends on your SQL version #>
        Add-Content $location\$QueryName.sql "DECLARE @fileListTable TABLE (
            [LogicalName]           NVARCHAR(128),
            [PhysicalName]          NVARCHAR(260),
            [Type]                  CHAR(1),
            [FileGroupName]         NVARCHAR(128),
            [Size]                  NUMERIC(20,0),
            [MaxSize]               NUMERIC(20,0),
            [FileID]                BIGINT,
            [CreateLSN]             NUMERIC(25,0),
            [DropLSN]               NUMERIC(25,0),
            [UniqueID]              UNIQUEIDENTIFIER,
            [ReadOnlyLSN]           NUMERIC(25,0),
            [ReadWriteLSN]          NUMERIC(25,0),
            [BackupSizeInBytes]     BIGINT,
            [SourceBlockSize]       INT,
            [FileGroupID]           INT,
            [LogGroupGUID]          UNIQUEIDENTIFIER,
            [DifferentialBaseLSN]   NUMERIC(25,0),
            [DifferentialBaseGUID]  UNIQUEIDENTIFIER,
            [IsReadOnly]            BIT,
            [IsPresent]             BIT,
            [TDEThumbprint]         VARBINARY(32) -- remove this column if using SQL 2005
        )       
        INSERT INTO @fileListTable EXEC('RESTORE FILELISTONLY FROM DISK = ''$backupfile''')
        DECLARE @DBName NVARCHAR(50)
        DECLARE @LogName NVARCHAR(50)
        SET @DBName = (SELECT [LogicalName] FROM @fileListTable WHERE [type] = 'D')
        SET @LogName = (SELECT [LogicalName] FROM @fileListTable WHERE [type] = 'L')

        PRINT @DBName
        PRINT @LogName

        DECLARE @DBTest NVARCHAR(150)
        SET @DBTest = (CONCAT (@DBName, '_OAtest'))
        DECLARE @DBTestLog NVARCHAR(150)
        SET @DBTestLog = (CONCAT (@DBName, '_OAtest_log'))
        DECLARE @DBPath NVARCHAR(150)
        SET @DBPath = (CONCAT ('C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\', @DBTest, '.mdf'))
        DECLARE @LogPath NVARCHAR(150)
        SET @LogPath = (CONCAT ('C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\', @DBTestLog, '.ldf'))
        RESTORE DATABASE @DBTest
        FROM DISK = '$backupfile'
        WITH MOVE @DBName TO @DBPath,
        MOVE @LogName TO @LogPath

        GO
        "
    }
    
    end {
        # Here we are returning the valuable data that will be used in main script.
        $Script:SqlQueryPath = "$location\$QueryName"
    }
}
 */

Now, when we have all the scripts in place, we just need to create few Scheduled tasks and enjoy our engineer’s life 😉

That’s all folks 😀

Cheers!

Please follow and like us:

Leave a Reply

Your email address will not be published. Required fields are marked *

This site uses Akismet to reduce spam. Learn how your comment data is processed.