How do you check and rebuild indexes for Azure SQL databases in an elastic pool? This question becomes even more interesting if only a Private Endpoint is enabled for connecting to the SQL database. 

Let's get started! 

  1. Requirements
    • Azure SQL DBs in Elastic Pool
    • Only Private Endpoint
    • Simple and easy index maintenance
    • Passwordless authentication (service account)
  2. Solutions
    • Elastic Database Jobs
    • Azure Automation
    • Windows Task Scheduler – the old way

Elastic Database Jobs

So first, we need to check Elastic Database Jobs. 
Elastic Database Jobs (preview) – Azure SQL Database | Microsoft Learn 

Unfortunately, there is currently one limitation: private endpoints aren't supported. 
So this option is not going to work. 

Azure Automation

Azure Automation – Cloud Automation Service | Microsoft Azure 
Azure Automation in a hybrid environment – Azure Architecture Center | Microsoft Learn 
Run Azure Automation runbooks on a Hybrid Runbook Worker | Microsoft Learn

Theoretically, we can deploy a runbook in combination with a Hybrid Runbook Worker. Why a Hybrid Worker? Because that option is required to connect to a private endpoint. 
After another look at the architecture, it is quite a big solution with many Azure resources, and we would still have to deploy a virtual machine (VM) to host the Hybrid Worker. After a meeting with the customer, this solution was deemed too complex, so we moved on to the next option.

PowerShell Script executed via Windows Task Scheduler

So let’s check the old way – PowerShell Script executed via Windows Task Scheduler. 
As you may already expect from the result, it works like a charm. 

As mentioned in the requirements, we must authenticate without a hard-coded password in the PowerShell script. For that, we can use an Azure managed identity. 
Managed identities for Azure resources – Microsoft Entra | Microsoft Learn
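Enabling the identity and granting the read permission can be sketched with the Az PowerShell module, for example (a sketch, not the exact commands used in the project; the resource names, including "my-maintenance-vm", are placeholders):

```powershell
# Enable the system-assigned managed identity on the VM
$vm = Get-AzVM -ResourceGroupName "my-cool-resource-group" -Name "my-maintenance-vm"
Update-AzVM -ResourceGroupName "my-cool-resource-group" -VM $vm -IdentityType SystemAssigned

# Re-read the VM so the identity's PrincipalId is populated
$vm = Get-AzVM -ResourceGroupName "my-cool-resource-group" -Name "my-maintenance-vm"

# Grant the identity read access on the SQL server so it can
# enumerate the databases running in the elastic pool
New-AzRoleAssignment -ObjectId $vm.Identity.PrincipalId `
    -RoleDefinitionName "Reader" `
    -Scope (Get-AzSqlServer -ResourceGroupName "my-cool-resource-group" `
            -ServerName "azep-db-server").ResourceId
```

The "Reader" role on the server scope is enough for `Get-AzSqlElasticPoolDatabase`; the SQL-side permissions are granted separately inside the databases.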

Step-by-Step: 

  • Create the Azure VM 
  • Enable the managed identity in Azure for the VM (system assigned) 
  • Grant read permission on the Azure SQL server to the VM's identity
    • This is required to query all database names running in the elastic pool
  • Create the login for the VM's identity on the SQL instance 
  • Create the user for the VM's identity on every database 
  • Grant the permissions required for the VM's identity to get the work done
    • In our case we granted db_owner – please check the least privileges needed in your environment
  • Create and adjust the PowerShell script 
  • Create the scheduled task entry 
  • All done 🙂 now you can query your databases and maintain them
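The login and user creation steps above can be sketched in T-SQL like this (a sketch under the assumption that the VM is named "my-maintenance-vm" – the name must match the Azure VM resource name of the managed identity):

```sql
-- In the master database: create a login for the VM's managed identity
CREATE LOGIN [my-maintenance-vm] FROM EXTERNAL PROVIDER;

-- In each user database: create a user for the identity and grant permissions
CREATE USER [my-maintenance-vm] FROM EXTERNAL PROVIDER;
ALTER ROLE db_owner ADD MEMBER [my-maintenance-vm];
```

Again, db_owner is what we used; a tighter role may be sufficient depending on the maintenance statements you run.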

In our script example, we query Azure to get all database names running in the elastic pool, then loop through each database and execute a query to get the index statistics. 
After that, the script creates a file called table.csv with all the index information. 
Of course, you can replace the SELECT query and execute any other statement.

#region info
#https://dbatools.io/offline/
#Get-ChildItem -Recurse C:\Temp\dbatools.1.1.145\ | Unblock-File
#$env:PSMODULEPATH
#C:\Program Files\WindowsPowerShell\Modules\dbatools\1.1.145


#to view log in GV:
#Import-Csv -LiteralPath $path_out_table_info -Delimiter ";" | Out-GridView

#endregion

#region functions
function add-logentry-to-json
{
    param(
        [Parameter(Mandatory=$true)]
        [AllowEmptyString()]
        [System.String]$logvariable

        ,[Parameter(Mandatory=$true)]
        [ValidateNotNull()]
        [System.String]$timestamp

        ,[Parameter(Mandatory=$true)]
        [ValidateSet("INFO", "WARN", "ERROR", "DEBUG")]
        [System.String]$loglevel

        ,[Parameter(Mandatory=$true)]
        [ValidateNotNull()]
        [System.String]$logtext

        ,[Parameter(Mandatory=$true)]
        [AllowNull()]
        [System.Collections.Hashtable]$content
    )

    $out = @()

    $tmp = @{
        timestamp = $timestamp;
        loglevel = $loglevel;
        logtext = $logtext;
        content = $content;
    }

    if(![string]::IsNullOrEmpty($logvariable))
    {
        $out += $logvariable | ConvertFrom-Json
    }

    $out += $tmp
    
    return ($out | ConvertTo-Json)
}
#endregion

#region vars
$TenantId = "XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
$Subscription = "my-cool-subscription"
$ResourceGroupName = "my-cool-resource-group"
$AzSqlElasticPoolDatabase_server = "azep-db-server"
$ElasticPoolName = "elastic-pool-name"
$SqlInstance = "azep-db-server,1433"

$path_log = "C:\temp\"
$path_out_log = Join-Path $path_log "log.json"
$path_out_table_info = Join-Path $path_log "table.csv"
#endregion


#region init vars
$json_log_content = ""
$table = @()
#endregion

$ErrorActionPreference = "STOP"

try
{
    $json_log_content = add-logentry-to-json -logvariable $json_log_content `
                        -timestamp $(Get-Date -Format o | ForEach-Object { $_ -replace ":", "." }) `
                        -loglevel "INFO" -logtext "Script Start" -content $null
    
    #import dbatools module with required version
    Import-Module -Name dbatools -RequiredVersion 1.1.145

    #connect to azure
    Connect-AzAccount -TenantId $TenantId -Subscription $Subscription -Identity

    #get all dbs from pool
    $AzSqlElasticPoolDatabase = Get-AzSqlElasticPoolDatabase -ResourceGroupName $ResourceGroupName -ServerName $AzSqlElasticPoolDatabase_server -ElasticPoolName $ElasticPoolName

    #get access token
    $token = New-DbaAzAccessToken -Type ManagedIdentity -Subtype AzureSqlDb

    #loop all dbs and fill table variable with index infos
    #sequential to lower dtu consumption
    $AzSqlElasticPoolDatabase | ForEach-Object {
        try
        {
            $Database = $_.DatabaseName

            $json_log_content = add-logentry-to-json -logvariable $json_log_content `
                            -timestamp $(Get-Date -Format o | ForEach-Object { $_ -replace ":", "." }) `
                            -loglevel "INFO" -logtext "Processing DB: $Database ..." -content $null

            #connect to server
            $server = Connect-DbaInstance -SqlInstance $SqlInstance -Database $Database -AccessToken $token -TrustServerCertificate

            $tsql="
                SELECT DB_Name() as 'DBName'
                    ,OBJECT_SCHEMA_NAME(ips.OBJECT_ID) 'Schema'
                    ,OBJECT_NAME(ips.OBJECT_ID) 'Table'
                    ,i.NAME
                    ,ips.index_id
                    ,index_type_desc
                    ,avg_fragmentation_in_percent
                    ,avg_page_space_used_in_percent
                    ,page_count
                FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'SAMPLED') ips
                INNER JOIN sys.indexes i ON (ips.object_id = i.object_id)
                        AND (ips.index_id = i.index_id)
                ORDER BY avg_fragmentation_in_percent DESC
            "
        
            #execute query and fill table variable
            $table += Invoke-DbaQuery -SqlInstance $server -Query $tsql
        }
        catch
        {
            $json_log_content = add-logentry-to-json -logvariable $json_log_content `
                                -timestamp $(Get-Date -Format o | ForEach-Object { $_ -replace ":", "." }) `
                                -loglevel "ERROR" -logtext "Script runtime error" `
                                -content @{Message = $($_.Exception.Message);
                                            ScriptLineNumber = $( $_.invocationinfo.ScriptLineNumber);
                                            Line = $($_.invocationinfo.Line);}
        }
        finally
        {
            #disconnect server
            $server | Disconnect-DbaInstance
        }
    }

    #after the loop, export the table variable to disk
    $table | Sort-Object avg_fragmentation_in_percent,page_count -Descending | Export-Csv -LiteralPath $path_out_table_info -Delimiter ";" -NoTypeInformation -Encoding Default
    #Import-Csv -LiteralPath $path_out_table_info -Delimiter ";" | Out-GridView

    $json_log_content = add-logentry-to-json -logvariable $json_log_content `
                        -timestamp $(Get-Date -Format o | ForEach-Object { $_ -replace ":", "." }) `
                        -loglevel "INFO" -logtext "Script End" -content $null
}
catch
{
    $json_log_content = add-logentry-to-json -logvariable $json_log_content `
                        -timestamp $(Get-Date -Format o | ForEach-Object { $_ -replace ":", "." }) `
                        -loglevel "ERROR" -logtext "Script runtime error" `
                        -content @{Message = $($_.Exception.Message);
                                    ScriptLineNumber = $( $_.invocationinfo.ScriptLineNumber);
                                    Line = $($_.invocationinfo.Line);}
}
finally
{
    [system.io.file]::WriteAllText($path_out_log,$json_log_content)
}

That's it! 🙂 Please leave a comment if you have any questions, or contact us to get professional support.