Azure Pipelines is a cloud-native CI/CD service that enables you to build, test, and deploy applications to any platform. Whether you’re working with containers, microservices, or traditional applications, Azure Pipelines provides the flexibility and power you need for modern DevOps practices.

Why Azure Pipelines?

Azure Pipelines stands out in the crowded CI/CD landscape:

  • Multi-platform Support: Build on Windows, Linux, and macOS
  • Any Language, Any Platform: Support for .NET, Java, Node.js, Python, Go, PHP, Ruby, and more
  • Cloud-native: Fully managed service with elastic scaling
  • YAML Pipelines: Infrastructure as Code approach for version control
  • Rich Ecosystem: Integrates with GitHub, Azure Repos, Bitbucket, and more
  • Free Tier: 1,800 Microsoft-hosted minutes/month for private projects, and unlimited minutes for public open-source projects

Getting Started: Your First Pipeline

Let’s create a simple CI pipeline for a Node.js application:

# azure-pipelines.yml
trigger:
  branches:
    include:
    - main
    - develop
  paths:
    exclude:
    - docs/*
    - README.md

pool:
  vmImage: 'ubuntu-latest'

variables:
  NODE_VERSION: '18.x'
  NPM_CACHE_FOLDER: $(Pipeline.Workspace)/.npm

steps:
- task: NodeTool@0
  displayName: 'Install Node.js'
  inputs:
    versionSpec: $(NODE_VERSION)

- task: Cache@2
  displayName: 'Cache npm packages'
  inputs:
    key: 'npm | "$(Agent.OS)" | package-lock.json'
    restoreKeys: |
      npm | "$(Agent.OS)"
    path: $(NPM_CACHE_FOLDER)

- script: |
    npm ci
  displayName: 'Install dependencies'

- script: |
    npm run lint
  displayName: 'Run linter'

- script: |
    npm run test:coverage
  displayName: 'Run tests with coverage'

- task: PublishTestResults@2
  displayName: 'Publish test results'
  condition: succeededOrFailed()
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/test-results.xml'
    failTaskOnFailedTests: true

- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/coverage/cobertura-coverage.xml'

- script: |
    npm run build
  displayName: 'Build application'
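
# The publish task below uploads $(Build.ArtifactStagingDirectory), so stage the
# build output there first. This sketch assumes the build writes to 'dist';
# adjust SourceFolder to match your project's output directory.
- task: CopyFiles@2
  displayName: 'Stage build output'
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)/dist'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'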

- task: PublishBuildArtifacts@1
  displayName: 'Publish artifacts'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
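
This pipeline assumes your package.json defines lint, test:coverage, and build scripts, and that the test runner writes JUnit results to test-results.xml and Cobertura coverage to coverage/cobertura-coverage.xml; with Jest, for example, the jest-junit reporter and the cobertura coverage reporter produce exactly those files.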

Multi-Stage Pipelines: Build, Test, Deploy

For production-grade pipelines, use a multi-stage approach that builds the image once and promotes the same tag through each environment:

# azure-pipelines-multistage.yml
trigger:
  branches:
    include:
    - main
    - develop   # the DeployDev stage below only runs for the develop branch

variables:
  - group: production-secrets
  - name: vmImageName
    value: 'ubuntu-latest'
  - name: dockerRegistryServiceConnection
    value: 'myACR'
  - name: imageRepository
    value: 'myapp'
  - name: dockerfilePath
    value: '$(Build.SourcesDirectory)/Dockerfile'
  - name: tag
    value: '$(Build.BuildId)'

stages:
- stage: Build
  displayName: 'Build and Push Docker Image'
  jobs:
  - job: Build
    displayName: 'Build Job'
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: Docker@2
      displayName: 'Build and push image to ACR'
      inputs:
        command: buildAndPush
        repository: $(imageRepository)
        dockerfile: $(dockerfilePath)
        containerRegistry: $(dockerRegistryServiceConnection)
        tags: |
          $(tag)
          latest
    
    # Assumes the Aqua Security scanner extension is installed from the marketplace;
    # the exact task name and inputs depend on the extension version in your organization
    - task: AquaSec@2
      displayName: 'Scan image for vulnerabilities'
      inputs:
        image: '$(imageRepository):$(tag)'
        scanType: 'local'
        exitCodeThreshold: 'critical'
    
    - task: CopyFiles@2
      displayName: 'Copy manifests'
      inputs:
        SourceFolder: '$(Build.SourcesDirectory)/k8s'
        Contents: '**/*.yml'
        TargetFolder: '$(Build.ArtifactStagingDirectory)/manifests'
    
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Kubernetes manifests'
      inputs:
        targetPath: '$(Build.ArtifactStagingDirectory)/manifests'
        artifact: 'manifests'

- stage: DeployDev
  displayName: 'Deploy to Development'
  dependsOn: Build
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/develop'))
  jobs:
  - deployment: DeployDev
    displayName: 'Deploy to Dev Environment'
    pool:
      vmImage: $(vmImageName)
    environment: 'development'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              artifactName: 'manifests'
              downloadPath: '$(System.ArtifactsDirectory)'
          
          - task: KubernetesManifest@0
            displayName: 'Deploy to AKS'
            inputs:
              action: 'deploy'
              kubernetesServiceConnection: 'aks-dev'
              namespace: 'development'
              manifests: |
                $(System.ArtifactsDirectory)/deployment.yml
                $(System.ArtifactsDirectory)/service.yml
              containers: |
                $(imageRepository):$(tag)

- stage: DeployStaging
  displayName: 'Deploy to Staging'
  dependsOn: Build
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  jobs:
  - deployment: DeployStaging
    displayName: 'Deploy to Staging Environment'
    pool:
      vmImage: $(vmImageName)
    environment: 'staging'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              artifactName: 'manifests'
              downloadPath: '$(System.ArtifactsDirectory)'
          
          - task: KubernetesManifest@0
            displayName: 'Deploy to AKS Staging'
            inputs:
              action: 'deploy'
              kubernetesServiceConnection: 'aks-staging'
              namespace: 'staging'
              manifests: |
                $(System.ArtifactsDirectory)/deployment.yml
                $(System.ArtifactsDirectory)/service.yml
              containers: |
                $(imageRepository):$(tag)
          
          - task: Bash@3
            displayName: 'Run smoke tests'
            inputs:
              targetType: 'inline'
              script: |
                echo "Running smoke tests..."
                curl -f https://staging.example.com/health || exit 1
                echo "Smoke tests passed!"

- stage: DeployProduction
  displayName: 'Deploy to Production'
  dependsOn: DeployStaging
  condition: succeeded()
  jobs:
  - deployment: DeployProduction
    displayName: 'Deploy to Production Environment'
    pool:
      vmImage: $(vmImageName)
    environment: 'production'
    strategy:
      canary:
        increments: [10, 25, 50, 100]
        preDeploy:
          steps:
          - script: echo "Starting canary deployment"
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              artifactName: 'manifests'
              downloadPath: '$(System.ArtifactsDirectory)'
          
          - task: KubernetesManifest@0
            displayName: 'Deploy to AKS Production'
            inputs:
              action: 'deploy'
              kubernetesServiceConnection: 'aks-production'
              namespace: 'production'
              strategy: 'canary'
              percentage: '$(strategy.increment)'
              manifests: |
                $(System.ArtifactsDirectory)/deployment.yml
                $(System.ArtifactsDirectory)/service.yml
              containers: |
                $(imageRepository):$(tag)
        postDeploy:
          steps:
          - script: |
              echo "Monitoring canary deployment..."
              sleep 60
              # Add monitoring checks here
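
The canary strategy also exposes on: success and on: failure lifecycle hooks that run after the increments complete, which is the natural place to promote or roll back the canary objects created by the KubernetesManifest task. A minimal sketch of how the production deployment's strategy could be extended, reusing the same service connection and manifests as above:

strategy:
  canary:
    increments: [10, 25, 50, 100]
    deploy:
      steps:
      - script: echo "Deploying canary increment $(strategy.increment)%"
    on:
      failure:
        steps:
        - task: KubernetesManifest@0
          displayName: 'Reject canary and keep the stable version'
          inputs:
            action: 'reject'
            kubernetesServiceConnection: 'aks-production'
            namespace: 'production'
            strategy: 'canary'
            manifests: '$(System.ArtifactsDirectory)/deployment.yml'
      success:
        steps:
        - task: KubernetesManifest@0
          displayName: 'Promote canary to the stable version'
          inputs:
            action: 'promote'
            kubernetesServiceConnection: 'aks-production'
            namespace: 'production'
            strategy: 'canary'
            manifests: '$(System.ArtifactsDirectory)/deployment.yml'
            containers: '$(imageRepository):$(tag)'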

Reusable Templates

Create modular, reusable pipeline components:

# templates/build-template.yml
parameters:
  - name: nodeVersion
    type: string
    default: '18.x'
  - name: workingDirectory
    type: string
    default: '.'
  - name: runTests
    type: boolean
    default: true

steps:
- task: NodeTool@0
  displayName: 'Install Node.js ${{ parameters.nodeVersion }}'
  inputs:
    versionSpec: ${{ parameters.nodeVersion }}

- script: |
    npm ci
  displayName: 'Install dependencies'
  workingDirectory: ${{ parameters.workingDirectory }}

- script: |
    npm run build
  displayName: 'Build application'
  workingDirectory: ${{ parameters.workingDirectory }}

- ${{ if eq(parameters.runTests, true) }}:
  - script: |
      npm test
    displayName: 'Run tests'
    workingDirectory: ${{ parameters.workingDirectory }}

Use the template:

# azure-pipelines.yml
trigger:
  - main

jobs:
- job: BuildFrontend
  displayName: 'Build Frontend'
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: templates/build-template.yml
    parameters:
      nodeVersion: '18.x'
      workingDirectory: 'frontend'
      runTests: true

- job: BuildBackend
  displayName: 'Build Backend'
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: templates/build-template.yml
    parameters:
      nodeVersion: '18.x'
      workingDirectory: 'backend'
      runTests: true
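
Templates can also live in a dedicated repository so every project consumes a single, versioned copy; the enterprise example later in this post uses this pattern with build-jobs.yml@templates. A minimal sketch, assuming a DevOps/pipeline-templates repository that holds the build template above under a templates/ folder:

# azure-pipelines.yml in the consuming repository
resources:
  repositories:
  - repository: templates            # alias referenced after the '@' below
    type: git
    name: DevOps/pipeline-templates  # project/repository that holds the templates
    ref: refs/heads/main

jobs:
- job: Build
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: templates/build-template.yml@templates
    parameters:
      nodeVersion: '18.x'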

Integrating with Azure Key Vault

Securely manage secrets using Azure Key Vault:

# azure-pipelines-secrets.yml
trigger:
  - main

variables:
  - group: production-variable-group

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureKeyVault@2
  displayName: 'Fetch secrets from Key Vault'
  inputs:
    azureSubscription: 'Azure-Service-Connection'
    KeyVaultName: 'my-key-vault'
    SecretsFilter: 'DB-PASSWORD,API-KEY,SSH-PRIVATE-KEY'
    RunAsPreJob: true

- task: Bash@3
  displayName: 'Deploy with secrets'
  env:
    # Secret variables are not exposed to scripts automatically; map them explicitly
    DB_PASSWORD: $(DB-PASSWORD)
    API_KEY: $(API-KEY)
  inputs:
    targetType: 'inline'
    script: |
      echo "Using secrets securely..."
      # Mapped secrets are now available as environment variables
      echo "DB Password length: ${#DB_PASSWORD}"

      # Deploy application with secrets
      kubectl create secret generic app-secrets \
        --from-literal=db-password="$DB_PASSWORD" \
        --from-literal=api-key="$API_KEY" \
        --dry-run=client -o yaml | kubectl apply -f -

Advanced Features

Parallel Jobs and Matrix Strategy

Run tests across multiple configurations:

strategy:
  matrix:
    Node_18_Ubuntu:
      node_version: '18.x'
      imageName: 'ubuntu-latest'
    Node_18_Windows:
      node_version: '18.x'
      imageName: 'windows-latest'
    Node_20_Ubuntu:
      node_version: '20.x'
      imageName: 'ubuntu-latest'
    Node_20_MacOS:
      node_version: '20.x'
      imageName: 'macOS-latest'
  maxParallel: 4

pool:
  vmImage: $(imageName)

steps:
- task: NodeTool@0
  inputs:
    versionSpec: $(node_version)
- script: |
    npm ci
    npm test
  displayName: 'Install and test'

Conditional Execution

Execute tasks based on conditions:

steps:
- script: echo "This runs on main branch"
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')

- script: echo "This runs on pull requests"
  condition: eq(variables['Build.Reason'], 'PullRequest')

- script: echo "This runs on manual triggers"
  condition: eq(variables['Build.Reason'], 'Manual')

- script: echo "This runs only on weekdays"
  condition: and(succeeded(), in(variables['System.DayOfWeek'], 'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday'))
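
Conditions get easier to reuse when the expression is evaluated once into a runtime-expression variable, which is exactly what the enterprise example below does with isMain. A small sketch:

variables:
  isMain: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

steps:
- script: echo "Publishing release artifacts"
  displayName: 'Runs only on main'
  condition: and(succeeded(), eq(variables.isMain, true))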

Pipeline as Code: Complex Example

Complete enterprise-grade pipeline:

# azure-pipelines-enterprise.yml
name: $(Date:yyyyMMdd)$(Rev:.r)

trigger:
  batch: true
  branches:
    include:
    - main
    - release/*
    exclude:
    - feature/*

pr:
  branches:
    include:
    - main
  paths:
    exclude:
    - docs/*
    - '*.md'

schedules:
- cron: "0 2 * * *"
  displayName: Daily midnight build
  branches:
    include:
    - main
  always: true

resources:
  repositories:
  - repository: templates
    type: git
    name: DevOps/pipeline-templates
    ref: refs/heads/main
  containers:
  - container: postgres
    image: postgres:14
    env:
      POSTGRES_PASSWORD: testpassword
  - container: redis
    image: redis:7

variables:
  - group: global-variables
  - name: buildConfiguration
    value: 'Release'
  - name: isMain
    value: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

stages:
- stage: Validate
  displayName: 'Code Quality & Security'
  jobs:
  - job: CodeAnalysis
    displayName: 'Static Code Analysis'
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: SonarCloudPrepare@1
      inputs:
        SonarCloud: 'SonarCloud-Connection'
        organization: 'my-org'
        scannerMode: 'CLI'
        configMode: 'manual'
        cliProjectKey: 'my-project'
        cliSources: '.'
    
    - script: npm ci && npm run build
      displayName: 'Build for analysis'
    
    - task: SonarCloudAnalyze@1
    
    - task: SonarCloudPublish@1
      inputs:
        pollingTimeoutSec: '300'
    
    - task: SnykSecurityScan@1
      inputs:
        serviceConnectionEndpoint: 'Snyk'
        testType: 'app'
        severityThreshold: 'high'
        monitorWhen: 'always'
        failOnIssues: true

  - job: DependencyCheck
    displayName: 'Dependency Vulnerability Scan'
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: dependency-check-build-task@6
      inputs:
        projectName: 'MyApp'
        scanPath: '.'
        format: 'HTML,JSON'
        suppressionPath: 'dependency-check-suppressions.xml'

- stage: Build
  displayName: 'Build & Package'
  dependsOn: Validate
  condition: succeeded()
  jobs:
  - template: build-jobs.yml@templates
    parameters:
      buildConfiguration: $(buildConfiguration)

- stage: Test
  displayName: 'Testing'
  dependsOn: Build
  jobs:
  - job: UnitTests
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - script: npm ci && npm run test:unit
      displayName: 'Run unit tests'
  
  - job: IntegrationTests
    pool:
      vmImage: 'ubuntu-latest'
    # Service containers reference the container resources declared above
    services:
      postgres: postgres
      redis: redis
    steps:
    - script: npm ci && npm run test:integration
      displayName: 'Run integration tests'

- stage: DeployNonProd
  displayName: 'Deploy to Non-Production'
  dependsOn: Test
  condition: and(succeeded(), eq(variables.isMain, true))
  jobs:
  - deployment: DeployDev
    environment: dev
    pool:
      vmImage: 'ubuntu-latest'
    strategy:
      runOnce:
        deploy:
          steps:
          - template: deploy-steps.yml@templates

- stage: DeployProd
  displayName: 'Deploy to Production'
  dependsOn: DeployNonProd
  condition: succeeded()
  jobs:
  - deployment: DeployProd
    environment: production
    pool:
      vmImage: 'ubuntu-latest'
    strategy:
      # Deployment jobs support the runOnce, rolling, and canary strategies
      runOnce:
        deploy:
          steps:
          - template: deploy-steps.yml@templates

Monitoring & Observability

Integrate Application Insights:

- task: AzureCLI@2
  displayName: 'Log deployment metrics'
  inputs:
    azureSubscription: 'Azure-Service-Connection'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az monitor app-insights events show \
        --app "myapp-insights" \
        --type customEvents \
        --query "[?name=='Deployment']" \
        --output table
      
      # Log custom deployment event (iKey must be your Application Insights instrumentation key)
      curl -X POST https://dc.services.visualstudio.com/v2/track \
        -H "Content-Type: application/json" \
        -d '{
          "name": "Microsoft.ApplicationInsights.Event",
          "time": "'$(date -u +"%Y-%m-%dT%H:%M:%S")'.000Z",
          "iKey": "<instrumentation-key>",
          "data": {
            "baseType": "EventData",
            "baseData": {
              "name": "Deployment",
              "properties": {
                "BuildId": "$(Build.BuildId)",
                "Environment": "Production",
                "Status": "Success"
              }
            }
          }
        }'

Best Practices Summary

  • Use YAML pipelines for version control and code review
  • Implement multi-stage pipelines for clear separation of concerns
  • Leverage templates for reusability across projects
  • Secure secrets with Azure Key Vault integration
  • Enable caching to speed up builds
  • Run security scans (SAST, DAST, dependency checks)
  • Use service connections with minimal permissions
  • Implement approval gates for production deployments (see the sketch below)
  • Monitor pipeline performance and optimize bottlenecks
  • Document pipeline behavior with clear display names
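
Approval gates are typically configured as checks on the environment itself (Pipelines > Environments > Approvals and checks), but you can also pause a run from YAML with the ManualValidation task in an agentless job. A minimal sketch, with placeholder reviewers and stage names:

- stage: ApproveProduction
  displayName: 'Manual approval before production'
  dependsOn: DeployStaging
  jobs:
  - job: WaitForApproval
    displayName: 'Wait for sign-off'
    pool: server              # ManualValidation only runs in agentless (server) jobs
    timeoutInMinutes: 1440    # fail the stage if nobody responds within a day
    steps:
    - task: ManualValidation@0
      inputs:
        notifyUsers: 'release-approvers@example.com'   # placeholder reviewers
        instructions: 'Verify staging and approve the production rollout'
        onTimeout: 'reject'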

Troubleshooting Common Issues

# Enable system diagnostics
variables:
  system.debug: true

# Check agent capabilities
az pipelines agent list --pool-id <pool-id>

# View pipeline run details
az pipelines runs show --id <run-id> --output table

# List recent pipeline runs
az pipelines runs list --top 10 --output table

Conclusion

Azure Pipelines provides enterprise-grade CI/CD capabilities with the flexibility to handle any deployment scenario. By following these patterns and best practices, you’ll build reliable, secure, and maintainable pipelines that accelerate your software delivery.

Questions or feedback? Let me know in the comments!
