
Azure Pipelines – Object Parameters and Terraform

I’ve been using Terraform with Azure Pipelines for a while and have always passed pipeline parameters or variables to Terraform using the -var command-line option. This worked really well until I needed to pass more complex values into Terraform, which supports objects, maps and lists.

The Problem

When attempting to pass complex objects via -var, Azure Pipelines outputs errors like ‘object is not a string’. After trying a number of workarounds that failed, I ended up changing my Terraform code to take strings and then perform actions on them, e.g. passing an array as a string and using split in Terraform to re-create the array, as sketched below.
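As a rough sketch of that workaround (the variable and local names here are illustrative, not from my actual pipeline), the pipeline passes a comma-separated string via -var and Terraform rebuilds the list:

variable "network_source_addresses_string" {
  description = "Comma-separated list of source addresses"
  type        = string
}

locals {
  # Re-create the array from the delimited string passed in via -var
  network_source_addresses = split(",", var.network_source_addresses_string)
}

It works, but it pushes string-parsing noise into the Terraform code for every complex value.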

This led me to thinking “There has to be a better way”. Naturally, one option is to create a .tfvars.json file and then substitute the variables using the same technique I used in my previous article, Azure Pipelines – Parameters + JSON File Substitution. This would work for the most part but would not solve the problem for array types.

I started thinking: could I get a parameter into a script, somehow work out whether it was a complex object, and then write code to extract the value into something useful like JSON? This led me to a community post that mentioned a function called convertToJson.

A Solution

Based on convertToJson, combined with the technique from my previous article, I came up with a step that creates an HCL-formatted .auto.tfvars file. The only catch is that for objects the colons ‘:’ need converting to equals signs ‘=’.

- ${{ each item in parameters }}:
  - script: echo '${{ item.key }}=${{ replace(convertToJson(item.value), ':', '=') }}' >> parameters.auto.tfvars
    displayName: "JsonVar ${{ item.key }}"

The .auto.tfvars file is automatically loaded by Terraform, which removes the need to specify any -var or -var-file options.
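To illustrate, with the tags object and address list from the example pipeline below, the loop appends entries roughly like these to parameters.auto.tfvars (the exact whitespace depends on how convertToJson pretty-prints):

resourceGroup="terraform-test"
tags={
  "Project" = "Demo",
  "Environment" = "Dev"
}
network_source_addresses=[
  "192.168.1.20",
  "192.168.1.254"
]

One caveat worth noting: replace() swaps every colon in the JSON, so string values that themselves contain colons (URLs, IPv6 addresses, timestamps) would be mangled. The technique suits values without embedded colons.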

Example Pipeline

For my example pipeline I have used an object for tags and an array for a list of network addresses for use with a Network Security Group.
The initial pipeline sets up the parameters and the Azure Storage Account for my Terraform state files.

trigger: none
pr: none
parameters:
  - name: resourceGroup
    displayName: Resource Group
    type: string
    default: 'terraform-test'
  - name: resourceLocation
    displayName: Resource Location
    type: string
    default: 'uksouth'
  - name: projectName
    displayName: Project Tag Name
    type: string
    default: 'Demo'
  - name: tags
    type: object
    default:
      Project: "Demo"
      Environment: "Dev"
  - name: network_source_addresses
    displayName: Network Address List
    type: object
    default:
      - "192.168.1.20"
      - "192.168.1.254"
variables:
  subscription: 'My Subscription'
  terraformVersion: '0.14.6'
  terraformResourceGroup: 'test-deployment'
  terraformStorageName: 'demoterraformstore'
  terraformStorageSku: Standard_LRS
  terraformContainerName: 'terraform'
  terraformStateFilename: test.tfstate
pool:
  vmImage: "ubuntu-latest"
steps:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: $(subscription)
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      az group create --location ${{ parameters.resourceLocation }} --name $(terraformResourceGroup)
      az storage account create --name $(terraformStorageName) --resource-group $(terraformResourceGroup) --location ${{ parameters.resourceLocation }} --sku $(terraformStorageSku) --tags "project=${{ parameters.projectName }}"
      az storage container create --name $(terraformContainerName) --account-name $(terraformStorageName)
    addSpnToEnvironment: false
- template: deploy.yml
  parameters:
    resourceGroup: ${{ parameters.resourceGroup }}
    resourceLocation: ${{ parameters.resourceLocation }}
    tags: ${{ parameters.tags }}
    network_source_addresses: ${{ parameters.network_source_addresses }}
    secret_value: $(secret_value)

I separated the Terraform parts into a template so that the loop only iterates over the parameters Terraform needs, rather than any others defined in the main pipeline.
Note: I use the Microsoft Terraform Azure Pipelines Extension to deploy the Terraform scripts.

parameters:
  - name: resourceGroup
    type: string
  - name: resourceLocation
    type: string
  - name: tags
    type: object
  - name: network_source_addresses
    type: object
  - name: secret_value
    type: string
steps:
- ${{ each item in parameters }}:
  - script: echo '${{ item.key }}=${{ replace(convertToJson(item.value), ':', '=') }}' >> parameters.auto.tfvars
    displayName: "JsonVar ${{ item.key }}"
- bash: |
    cat parameters.auto.tfvars
  displayName: "Debug show new file"
- task: TerraformInstaller@0
  displayName: 'Install Terraform'
  inputs:
    terraformVersion: $(terraformVersion)
  
- task: TerraformTaskV1@0
  displayName: 'Terraform Init'
  inputs:
    backendServiceArm: $(subscription)
    backendAzureRmResourceGroupName: '$(terraformResourceGroup)'
    backendAzureRmStorageAccountName: '$(terraformStorageName)'
    backendAzureRmContainerName: '$(terraformContainerName)'
    backendAzureRmKey:  '$(terraformStateFilename)'
- task: TerraformTaskV1@0
  displayName: 'Terraform Plan'
  inputs:
    command: plan    
    environmentServiceNameAzureRM: $(subscription)
- task: TerraformTaskV1@0
  displayName: 'Terraform Apply'
  inputs:
    command: apply
    commandOptions: -auto-approve
    environmentServiceNameAzureRM: $(subscription)
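
For completeness, the Terraform side needs variable declarations whose names match the keys written to parameters.auto.tfvars. A minimal sketch (the specific types and the sensitive flag are my assumptions, not taken from the original scripts):

variable "resourceGroup" {
  type = string
}

variable "resourceLocation" {
  type = string
}

variable "tags" {
  type = map(string)
}

variable "network_source_addresses" {
  type = list(string)
}

variable "secret_value" {
  type      = string
  sensitive = true # sensitive input variables are supported from Terraform 0.14
}

The list can then be used directly, e.g. as source_address_prefixes on a Network Security Group rule, and the map as tags on any resource.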

Conclusion

I think this is a nice technique for passing complex types from Azure Pipelines to Terraform deployments, or to anything else that could benefit from the idea. This was a fun problem to solve, and I hope sharing it helps others who have encountered the same issue.