1. Overview

In this tutorial, we’ll explore how to execute SQL scripts using Azure DevOps pipelines. We’ll cover different approaches for running SQL scripts against Azure SQL Database, including using built-in tasks, PowerShell cmdlets, and authentication methods.

Modern DevOps practices require automated database deployments alongside application code. Azure DevOps provides several ways to execute SQL scripts as part of our continuous integration and deployment pipelines. Whether we’re deploying schema changes, running data migrations, or executing maintenance scripts, automating these tasks ensures consistency and reduces manual errors.

2. Prerequisites

Before we begin, let’s ensure we have the necessary components in place.

2.1. Azure Resources

We need the following Azure resources:

  • An Azure SQL Database hosted on an existing logical server
  • An Azure DevOps organization with a project
  • An Azure subscription with appropriate permissions
  • A service connection between Azure DevOps and Azure

2.2. Repository Structure

Let's organize our SQL scripts into a structured folder layout:

/SQLScripts
  /Tables
    CreateCustomersTable.sql
    CreateOrdersTable.sql
  /StoredProcedures
    GetCustomerOrders.sql
  /Views
    CustomerOrdersView.sql
  /Data
    InitialData.sql

This hierarchical structure helps us control the execution order of our scripts. Tables need to be created before views or stored procedures that reference them. By organizing scripts into folders, we can easily process them in the correct sequence during deployment.
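
For reference, here's a minimal sketch of what one of these scripts, such as CreateCustomersTable.sql, might contain; the table definition itself is purely illustrative:

-- Hypothetical contents of /SQLScripts/Tables/CreateCustomersTable.sql
IF OBJECT_ID('dbo.Customers', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.Customers (
        CustomerId INT IDENTITY(1,1) PRIMARY KEY,
        FirstName  NVARCHAR(100) NOT NULL,
        LastName   NVARCHAR(100) NOT NULL,
        Email      NVARCHAR(255) NOT NULL,
        CreatedAt  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
    );
END;

Guarding the CREATE TABLE statement with an OBJECT_ID check keeps the script idempotent, so re-running the pipeline doesn't fail on objects that already exist.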

3. Using Azure SQL Database Deployment Task

The most straightforward approach is using the built-in SqlAzureDacpacDeployment task, which supports both DACPAC deployments and SQL script execution.

3.1. Basic SQL Script Execution

Let’s create a pipeline that executes a single SQL script:

trigger:
- main

pool:
  vmImage: 'windows-latest'

variables:
  azureSubscription: 'AzureServiceConnection'
  serverName: 'myserver.database.windows.net'
  databaseName: 'mydatabase'
  sqlUsername: 'sqladmin'
  sqlPassword: '$(SqlPassword)' # Stored as secret variable

steps:
- task: SqlAzureDacpacDeployment@1
  displayName: 'Execute SQL Script'
  inputs:
    azureSubscription: '$(azureSubscription)'
    ServerName: '$(serverName)'
    DatabaseName: '$(databaseName)'
    SqlUsername: '$(sqlUsername)'
    SqlPassword: '$(sqlPassword)'
    deployType: 'SqlTask'
    SqlFile: '$(Build.SourcesDirectory)/SQLScripts/Tables/CreateCustomersTable.sql'

In this configuration, we're using the SqlTask deployment type, which is specifically designed for executing SQL scripts. The task handles connection management and error reporting automatically. Notice that we store the SQL password as a secret variable, which prevents it from appearing in the logs. The $(Build.SourcesDirectory) variable automatically resolves to the root of our repository, making our paths portable across different agents.
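
The secret value itself isn't defined in the YAML file; it typically lives as a secret pipeline variable in the pipeline settings or in a variable group, optionally backed by Azure Key Vault. As a sketch, assuming a hypothetical variable group named SQL-Secrets that holds SqlPassword, the variables block could reference it like this:

variables:
- group: 'SQL-Secrets' # hypothetical variable group containing the SqlPassword secret
- name: serverName
  value: 'myserver.database.windows.net'
- name: databaseName
  value: 'mydatabase'

With the group linked, $(SqlPassword) resolves at runtime, and Azure DevOps masks the value in the pipeline logs.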

3.2. Executing Multiple SQL Scripts

When we need to run multiple scripts in sequence, we can use the InlineSqlTask option:

- task: SqlAzureDacpacDeployment@1
  displayName: 'Execute Multiple SQL Scripts'
  inputs:
    azureSubscription: '$(azureSubscription)'
    ServerName: '$(serverName)'
    DatabaseName: '$(databaseName)'
    SqlUsername: '$(sqlUsername)'
    SqlPassword: '$(sqlPassword)'
    deployType: 'InlineSqlTask'
    InlineSql: |
      :r $(Build.SourcesDirectory)/SQLScripts/Tables/CreateCustomersTable.sql
      :r $(Build.SourcesDirectory)/SQLScripts/Tables/CreateOrdersTable.sql
      :r $(Build.SourcesDirectory)/SQLScripts/Views/CustomerOrdersView.sql

The :r command is a SQLCMD directive that reads and executes an external SQL file. This approach is particularly useful when we have a specific execution order for our scripts. The scripts run in a single database session, which means they can share variables and temporary tables. However, if one script fails, the entire task fails, so it’s important to ensure each script includes proper error handling.
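
Since everything runs in one SQLCMD session, we can also define a SQLCMD variable once and let every included file reference it. Here's a rough sketch; the SchemaName variable and its use inside the scripts are assumptions, not part of the sample repository:

- task: SqlAzureDacpacDeployment@1
  displayName: 'Execute Scripts with a Shared SQLCMD Variable'
  inputs:
    azureSubscription: '$(azureSubscription)'
    ServerName: '$(serverName)'
    DatabaseName: '$(databaseName)'
    SqlUsername: '$(sqlUsername)'
    SqlPassword: '$(sqlPassword)'
    deployType: 'InlineSqlTask'
    InlineSql: |
      :setvar SchemaName dbo
      :r $(Build.SourcesDirectory)/SQLScripts/Tables/CreateCustomersTable.sql
      :r $(Build.SourcesDirectory)/SQLScripts/Tables/CreateOrdersTable.sql

Inside the included files, the variable is then referenced with the usual SQLCMD syntax, for example $(SchemaName).Customers. It's worth making sure such a name doesn't clash with an Azure DevOps pipeline variable, since the pipeline expands its own $(...) macros before SQLCMD ever sees the text.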

4. PowerShell Approach with Invoke-Sqlcmd

For more flexibility, we can use PowerShell with the Invoke-Sqlcmd cmdlet. This approach gives us better control over error handling and logging.

4.1. Setting Up PowerShell Task

Let’s create a PowerShell script that executes SQL files:

- task: PowerShell@2
  displayName: 'Install SqlServer Module'
  inputs:
    targetType: 'inline'
    script: |
      if (-not (Get-Module -ListAvailable -Name SqlServer)) {
        Install-Module -Name SqlServer -Force -AllowClobber
      }

- task: PowerShell@2
  displayName: 'Execute SQL Scripts'
  inputs:
    targetType: 'inline'
    script: |
      $serverInstance = "$(serverName)"
      $database = "$(databaseName)"
      $username = "$(sqlUsername)"
      $password = "$(sqlPassword)"
      
      # Execute single script
      Invoke-Sqlcmd -InputFile "$(Build.SourcesDirectory)/SQLScripts/Tables/CreateCustomersTable.sql" `
        -ServerInstance $serverInstance `
        -Database $database `
        -Username $username `
        -Password $password `
        -QueryTimeout 3600 `
        -Verbose

The first task ensures the SqlServer PowerShell module is available on the agent. Microsoft-hosted agents don’t always have this module pre-installed, so we check and install it if necessary. The -Force and -AllowClobber parameters prevent installation conflicts.

In the second task, we use Invoke-Sqlcmd with several important parameters. The -QueryTimeout of 3600 seconds (1 hour) prevents long-running scripts from timing out. The -Verbose flag provides detailed output, which is invaluable for troubleshooting failed deployments. This approach gives us full control over the execution process and allows us to add custom logic between script executions.
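
For instance, a minimal sketch of such custom logic might run a quick verification query after a script completes before the pipeline moves on; the check against the Customers table is just an assumption based on our sample schema:

- task: PowerShell@2
  displayName: 'Execute SQL Script and Verify'
  inputs:
    targetType: 'inline'
    script: |
      # Run the schema script first
      Invoke-Sqlcmd -InputFile "$(Build.SourcesDirectory)/SQLScripts/Tables/CreateCustomersTable.sql" `
        -ServerInstance "$(serverName)" -Database "$(databaseName)" `
        -Username "$(sqlUsername)" -Password "$(sqlPassword)" `
        -QueryTimeout 3600 -ErrorAction Stop
      
      # Custom logic between executions: confirm the table exists before continuing
      $check = Invoke-Sqlcmd -Query "SELECT COUNT(*) AS Cnt FROM sys.tables WHERE name = 'Customers'" `
        -ServerInstance "$(serverName)" -Database "$(databaseName)" `
        -Username "$(sqlUsername)" -Password "$(sqlPassword)" -ErrorAction Stop
      
      if ($check.Cnt -eq 0) {
        Write-Error "Customers table was not created"
        exit 1
      }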

4.2. Executing Multiple Scripts from Directory

We can enhance our PowerShell script to process all SQL files in a directory:

- task: PowerShell@2
  displayName: 'Execute All SQL Scripts in Directory'
  inputs:
    targetType: 'inline'
    script: |
      $scriptsPath = "$(Build.SourcesDirectory)/SQLScripts"
      $executionOrder = @("Tables", "Views", "StoredProcedures", "Data")
      
      foreach ($folder in $executionOrder) {
        $folderPath = Join-Path $scriptsPath $folder
        if (Test-Path $folderPath) {
          Get-ChildItem -Path $folderPath -Filter "*.sql" | ForEach-Object {
            # Capture the file object, since $_ refers to the error record inside catch
            $scriptFile = $_
            Write-Host "Executing script: $($scriptFile.Name)"
            try {
              Invoke-Sqlcmd -InputFile $scriptFile.FullName `
                -ServerInstance "$(serverName)" `
                -Database "$(databaseName)" `
                -Username "$(sqlUsername)" `
                -Password "$(sqlPassword)" `
                -QueryTimeout 3600 `
                -ErrorAction Stop
              Write-Host "Successfully executed: $($scriptFile.Name)"
            }
            catch {
              Write-Error "Error executing $($scriptFile.Name): $_"
              exit 1
            }
          }
        }
      }

This script introduces several important concepts. The $executionOrder array defines the sequence in which folders are processed, ensuring database objects are created in the correct dependency order. We check if each folder exists before processing, making the script resilient to missing directories.

The try-catch block around each script execution allows us to capture and report specific errors while still failing the pipeline appropriately. Passing -ErrorAction Stop ensures that errors raised by Invoke-Sqlcmd become terminating errors the catch block can actually handle. By using exit 1 in the catch block, we ensure the pipeline stops on the first error, preventing cascading failures from dependent objects.

5. Azure Active Directory Authentication

For enhanced security, we can use Azure AD authentication instead of SQL authentication.

5.1. Using Managed Identity

Let’s configure our pipeline to use a managed identity:

- task: AzurePowerShell@5
  displayName: 'Execute SQL with AAD Token'
  inputs:
    azureSubscription: '$(azureSubscription)'
    ScriptType: 'InlineScript'
    azurePowerShellVersion: 'LatestVersion'
    Inline: |
      # Get access token for SQL Database
      $tokenResponse = Get-AzAccessToken -ResourceUrl "https://database.windows.net"
      $accessToken = $tokenResponse.Token
      
      # Execute SQL script using access token
      Invoke-Sqlcmd -ServerInstance "$(serverName)" `
        -Database "$(databaseName)" `
        -AccessToken $accessToken `
        -InputFile "$(Build.SourcesDirectory)/SQLScripts/Tables/CreateCustomersTable.sql" `
        -QueryTimeout 3600

This approach eliminates the need to store SQL credentials. The Get-AzAccessToken cmdlet retrieves a token for the SQL Database resource using the pipeline’s service connection identity. The token is valid for approximately one hour, which is usually sufficient for most deployment scenarios.

The key advantage here is that we’re using Azure AD for authentication, which provides better security through features like conditional access, multi-factor authentication for manual approvals, and detailed audit logs. The service principal used by the pipeline must be granted appropriate permissions on the SQL database beforehand.
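
Granting those permissions is a one-time T-SQL step, executed against the target database while connected as an Azure AD administrator. As a sketch, assuming the service connection's principal is named my-devops-pipeline-sp:

-- Run once against the target database, connected as an Azure AD admin
CREATE USER [my-devops-pipeline-sp] FROM EXTERNAL PROVIDER;
ALTER ROLE db_ddladmin ADD MEMBER [my-devops-pipeline-sp];
ALTER ROLE db_datawriter ADD MEMBER [my-devops-pipeline-sp];

The roles to grant depend on what the scripts do; schema changes need db_ddladmin, while data scripts typically need db_datawriter or db_datareader.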

5.2. Service Principal Authentication

We can also use a service principal for authentication:

- task: AzurePowerShell@5
  displayName: 'Get AAD Token for Service Principal'
  inputs:
    azureSubscription: '$(azureSubscription)'
    ScriptType: 'InlineScript'
    azurePowerShellVersion: 'LatestVersion'
    Inline: |
      $context = [Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureRmProfileProvider]::Instance.Profile.DefaultContext
      $sqlToken = [Microsoft.Azure.Commands.Common.Authentication.AzureSession]::Instance.AuthenticationFactory.Authenticate(
        $context.Account, 
        $context.Environment, 
        $context.Tenant.Id.ToString(), 
        $null, 
        [Microsoft.Azure.Commands.Common.Authentication.ShowDialog]::Never, 
        $null, 
        "https://database.windows.net"
      ).AccessToken
      
      Write-Host "##vso[task.setvariable variable=SQLTOKEN;]$sqlToken"

- task: SqlAzureDacpacDeployment@1
  displayName: 'Deploy using Access Token'
  inputs:
    azureSubscription: '$(azureSubscription)'
    AuthenticationType: 'connectionString'
    ConnectionString: 'Data Source=$(serverName);Initial Catalog=$(databaseName);'
    deployType: 'SqlTask'
    SqlFile: '$(Build.SourcesDirectory)/SQLScripts/Tables/CreateCustomersTable.sql'
    SqlAdditionalArguments: '-AccessToken $(SQLTOKEN)'

This more complex approach gives us fine-grained control over the authentication process. We're manually extracting the authentication context from the Azure PowerShell session and using it to generate a SQL-specific access token. The token is then stored as a pipeline variable using the special ##vso[task.setvariable] syntax; adding issecret=true to that command also masks the token value in the logs.

The second task uses the token with a connection string that notably lacks any password. This pattern is particularly useful when integrating with tasks that don’t natively support Azure AD authentication but do support access tokens through additional arguments.

6. Best Practices

Beyond simply executing scripts, a few practices help keep our SQL deployments safe, repeatable, and easy to troubleshoot.

6.1. Transaction Management

Let’s wrap our SQL scripts in transactions for atomicity:

- task: PowerShell@2
  displayName: 'Execute SQL Scripts with Transaction'
  inputs:
    targetType: 'inline'
    script: |
      $sqlScripts = Get-ChildItem -Path "$(Build.SourcesDirectory)/SQLScripts/Tables" -Filter "*.sql"
      
      # Create transaction wrapper
      $transactionScript = @"
      BEGIN TRANSACTION
      BEGIN TRY
      "@
      
      foreach ($script in $sqlScripts) {
        $content = Get-Content $script.FullName -Raw
        $transactionScript += "`n$content`n"
      }
      
      $transactionScript += @"
      COMMIT TRANSACTION
      PRINT 'All scripts executed successfully'
      END TRY
      BEGIN CATCH
        ROLLBACK TRANSACTION
        THROW
      END CATCH
      "@
      
      # Execute the wrapped script
      Invoke-Sqlcmd -Query $transactionScript `
        -ServerInstance "$(serverName)" `
        -Database "$(databaseName)" `
        -Username "$(sqlUsername)" `
        -Password "$(sqlPassword)" `
        -QueryTimeout 3600 `
        -ErrorAction Stop

This approach creates a single atomic operation from multiple scripts. If any script fails, all changes are rolled back, maintaining database consistency. The BEGIN TRY…CATCH block ensures proper error handling within SQL Server, while THROW re-raises the error for the pipeline to catch.

This pattern is particularly valuable for related changes that must succeed or fail together, such as creating a table and its associated indexes. However, we need to be cautious with long-running transactions as they can cause blocking issues in production databases.
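
One small hardening step worth considering is prefixing the wrapper with SET XACT_ABORT ON, which makes SQL Server roll back the whole transaction on most runtime errors, including some (such as query timeouts) that never reach the CATCH block. Only the beginning of the here-string changes; the rest of the task stays the same:

      # Optional hardening: abort and roll back on any runtime error
      $transactionScript = @"
      SET XACT_ABORT ON;
      BEGIN TRANSACTION
      BEGIN TRY
      "@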

6.2. Detailed Logging

We can enhance our pipeline with detailed logging for troubleshooting:

- task: PowerShell@2
  displayName: 'Execute SQL with Detailed Logging'
  inputs:
    targetType: 'inline'
    script: |
      $logFile = "$(Build.ArtifactStagingDirectory)/sql-execution.log"
      
      function Execute-SqlScript {
        param($ScriptPath)
        
        $startTime = Get-Date
        Add-Content -Path $logFile -Value "[$startTime] Starting execution of: $ScriptPath"
        
        try {
          $output = Invoke-Sqlcmd -InputFile $ScriptPath `
            -ServerInstance "$(serverName)" `
            -Database "$(databaseName)" `
            -Username "$(sqlUsername)" `
            -Password "$(sqlPassword)" `
            -QueryTimeout 3600 `
            -Verbose 4>&1
          
          $endTime = Get-Date
          $duration = $endTime - $startTime
          Add-Content -Path $logFile -Value "[$endTime] Completed in $duration"
          Add-Content -Path $logFile -Value "Output: $output`n"
          
          Write-Host "##vso[task.uploadfile]$logFile"
        }
        catch {
          Add-Content -Path $logFile -Value "ERROR: $_"
          throw
        }
      }
      
      # Execute all scripts
      Get-ChildItem -Path "$(Build.SourcesDirectory)/SQLScripts" -Filter "*.sql" -Recurse | 
        ForEach-Object { Execute-SqlScript -ScriptPath $_.FullName }

This logging approach creates a comprehensive audit trail of the deployment. The 4>&1 redirection captures verbose output that would normally go to the verbose stream. Recording execution times helps identify performance bottlenecks in deployment scripts.

The ##vso[task.uploadfile] command attaches the log file to the pipeline run, making it available for download even after the agent is cleaned up. This is invaluable for post-mortem analysis of failed deployments, especially in production environments where immediate debugging might not be possible.

6.3. Variable Groups

Let’s use variable groups to manage environment-specific settings:

stages:
- stage: Development
  variables:
  - group: 'SQL-Common-Variables'
  - group: 'SQL-Dev-Variables'
  jobs:
  - job: DeployToDev
    steps:
    - template: sql-deployment-template.yml
      parameters:
        environment: 'Dev'

- stage: Production
  dependsOn: Development
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  variables:
  - group: 'SQL-Common-Variables'
  - group: 'SQL-Prod-Variables'
  jobs:
  - job: DeployToProd
    steps:
    - template: sql-deployment-template.yml
      parameters:
        environment: 'Prod'

This structure promotes reusability and reduces configuration errors. Common variables like connection timeouts or retry policies go in the shared SQL-Common-Variables group, while environment-specific settings like server names and database names go in the SQL-Dev-Variables and SQL-Prod-Variables groups. Since variable group names are resolved at compile time, we reference the environment-specific group explicitly in each stage rather than building its name from a runtime variable.

The production stage only runs for the main branch and requires the development stage to succeed first. This creates a natural promotion path for database changes, ensuring they’re tested in lower environments before reaching production. The template approach keeps the actual deployment logic DRY (Don’t Repeat Yourself).
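
The referenced sql-deployment-template.yml isn't shown above; a minimal sketch of what it could look like, reusing the Invoke-Sqlcmd approach from Section 4, might be:

# sql-deployment-template.yml (hypothetical sketch)
parameters:
- name: environment
  type: string

steps:
- task: PowerShell@2
  displayName: 'Deploy SQL scripts to ${{ parameters.environment }}'
  inputs:
    targetType: 'inline'
    script: |
      Invoke-Sqlcmd -InputFile "$(Build.SourcesDirectory)/SQLScripts/Tables/CreateCustomersTable.sql" `
        -ServerInstance "$(serverName)" `
        -Database "$(databaseName)" `
        -Username "$(sqlUsername)" `
        -Password "$(sqlPassword)" `
        -QueryTimeout 3600 `
        -ErrorAction Stop

Here, $(serverName), $(databaseName), and the credentials are expected to come from the variable groups linked to each stage, so the same template deploys to both environments without modification.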

6.4. Script Validation

We should also validate SQL scripts before execution:

- task: PowerShell@2
  displayName: 'Validate SQL Scripts'
  inputs:
    targetType: 'inline'
    script: |
      $hasErrors = $false
      Get-ChildItem -Path "$(Build.SourcesDirectory)/SQLScripts" -Filter "*.sql" -Recurse | 
        ForEach-Object {
          $content = Get-Content $_.FullName -Raw
          
          # Check for common issues
          if ($content -match 'DROP\s+DATABASE') {
            Write-Error "Dangerous operation found in $($_.Name): DROP DATABASE"
            $hasErrors = $true
          }
          
          if (-not $content.Trim()) {
            Write-Warning "Empty SQL file: $($_.Name)"
          }
          
          # Validate syntax (basic check)
          if ($content -notmatch ';') {
            Write-Warning "No semicolon found in $($_.Name)"
          }
        }
      
      if ($hasErrors) {
        exit 1
      }

This validation step acts as a safety net, catching potentially dangerous operations before they reach the database. The regex patterns can be customized based on your organization’s policies. For example, we might want to prevent TRUNCATE operations in production or ensure all scripts include proper header comments.

While this basic validation can’t catch all SQL syntax errors, it provides a quick sanity check that runs in seconds. For more comprehensive validation, consider using SQL Server Data Tools (SSDT) or running scripts against a disposable test database first.
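
The check is also easy to extend. For example, a sketch of an extra rule that blocks TRUNCATE statements when a (hypothetical) Environment variable is set to Prod could be added inside the same ForEach-Object block:

          # Additional rule: no TRUNCATE in production deployments (Environment variable is assumed)
          if ("$(Environment)" -eq 'Prod' -and $content -match 'TRUNCATE\s+TABLE') {
            Write-Error "TRUNCATE TABLE is not allowed in production scripts: $($_.Name)"
            $hasErrors = $true
          }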

7. Conclusion

In this article, we’ve explored various methods for executing SQL scripts using Azure DevOps pipelines. We covered the built-in Azure SQL Database deployment task, PowerShell approaches with Invoke-Sqlcmd, and different authentication methods, including Azure AD.

The sample SQL scripts are available over on GitHub.