Azure Pipelines, IaC

Azure Pipelines – Complex Types and Terraform

I’ve been using Terraform with Azure Pipelines for a while and have always passed pipeline parameters or variables to Terraform using the -var command-line option. This worked really well until I needed to pass more complex values into Terraform, which supports objects, maps and lists.

The Problem

When attempting to pass complex objects via -var, Azure Pipelines outputs errors like ‘object is not a string’. After trying a number of workarounds that failed, I ended up changing my Terraform code to take strings and then transform them, e.g. passing an array as a delimited string and using split in Terraform to re-create the array.
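To sketch that workaround in Python terms (Terraform’s split() behaves much like Python’s str.split(); the names here are just for illustration):

```python
# Sketch of the old workaround: a list is flattened to a delimited
# string so it can be passed through -var, then split back into a
# list on the Terraform side via its split() function.
addresses = ["192.168.1.20", "192.168.1.254"]

# Pipeline side: a single string is all -var will accept.
as_string = ",".join(addresses)
print(as_string)  # 192.168.1.20,192.168.1.254

# Terraform side: split(",", var.addresses) re-creates the list.
recovered = as_string.split(",")
assert recovered == addresses
```

It works, but it forces the Terraform variables to be typed as strings, which is what I wanted to avoid.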

This led me to think, “There has to be a better way”. One option is to create a .tfvars.json file and then substitute the variables using the same technique from my previous article, Azure Pipelines – Parameters + JSON File Substitution. This would work for the most part but would not solve the problem for array types.

I started wondering whether I could get a parameter into a script, somehow work out whether it was a complex object, and then write code to extract its value into something useful like JSON. This led me to a community post that mentioned a function, convertToJson.

A Solution

Based on convertToJson, and combining it with the technique from my previous article, I came up with a step that creates an HCL-formatted .auto.tfvars file. The only catch is that for objects the colons ‘:’ need converting to equals signs ‘=’.

- ${{ each item in parameters }}:
     - script: echo '${{ item.key }}=${{ replace(convertToJson(item.value), ':', '=')}}' >> parameters.auto.tfvars
       displayName: "JsonVar ${{ item.key }}"

The .auto.tfvars file is automatically loaded by Terraform which removes the need to specify any -var or -var-file options.
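To get a feel for what that step writes out, here is a rough Python approximation, with json.dumps standing in for convertToJson (an assumption; the exact whitespace Azure Pipelines produces may differ):

```python
import json

def to_tfvars_line(key, value):
    """Approximate the pipeline step: serialise the parameter value
    to JSON, then swap colons for equals signs so HCL accepts it."""
    return f"{key}={json.dumps(value, indent=2).replace(':', '=')}"

tags = {"Project": "Demo", "Environment": "Dev"}
print(to_tfvars_line("tags", tags))
# tags={
#   "Project"= "Demo",
#   "Environment"= "Dev"
# }

# Caveat: the replace is global, so a colon inside a string value
# (e.g. a URL or an IPv6 address) would also be converted.
```

Lists pass through untouched, since a JSON array is already valid HCL.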

Example Pipeline

For my example pipeline I have used an object for tags and an array for a list of network addresses for use with a Network Security Group.
The initial pipeline sets up the parameters and the Azure Storage Account for my Terraform state files.

trigger: none

pr: none

parameters:
  - name: resourceGroup
    displayName: Resource Group
    type: string
    default: 'terraform-test'
  - name: resourceLocation
    displayName: Resource Location
    type: string
    default: 'uksouth'
  - name: projectName
    displayName: Project Tag Name
    type: string
    default: 'Demo'
  - name: tags
    type: object
    default:
      Project: "Demo"
      Environment: "Dev"
  - name: network_source_addresses
    displayName: Network Address List
    type: object
    default:
      - "192.168.1.20"
      - "192.168.1.254"

variables:
  subscription: 'My Subscription'
  terraformVersion: '0.14.6'
  terraformResourceGroup: 'test-deployment'
  terraformStorageName: 'demoterraformstore'
  terraformStorageSku: Standard_LRS
  terraformContainerName: 'terraform'
  terraformStateFilename: test.tfstate

pool:
  vmImage: "ubuntu-latest"

steps:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: $(subscription)
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      az group create --location ${{ parameters.resourceLocation }} --name $(terraformResourceGroup)
      az storage account create --name $(terraformStorageName) --resource-group $(terraformResourceGroup) --location ${{ parameters.resourceLocation }} --sku $(terraformStorageSku) --tags "project=${{ parameters.projectName }}"
      az storage container create --name $(terraformContainerName) --account-name $(terraformStorageName)
    addSpnToEnvironment: false
- template: deploy.yml
  parameters:
    resourceGroup: ${{ parameters.resourceGroup }}
    resourceLocation: ${{ parameters.resourceLocation }}
    tags: ${{ parameters.tags }}
    network_source_addresses: ${{ parameters.network_source_addresses }}
    secret_value: $(secret_value)

I separated the Terraform parts into a template so that the loop only iterates over the parameters needed by Terraform and not any others in the main pipeline.
Note: I use the Microsoft Terraform Azure Pipelines Extension to deploy the Terraform scripts.

parameters:
  - name: resourceGroup
    type: string
  - name: resourceLocation
    type: string
  - name: tags
    type: object
  - name: network_source_addresses
    type: object
  - name: secret_value
    type: string

steps:
- ${{ each item in parameters }}:
     - script: echo '${{ item.key }}=${{ replace(convertToJson(item.value), ':', '=')}}' >> parameters.auto.tfvars
       displayName: "JsonVar ${{ item.key }}"
- bash: |
    cat parameters.auto.tfvars
  displayName: "Debug show new file"
- task: TerraformInstaller@0
  displayName: 'Install Terraform'
  inputs:
    terraformVersion: $(terraformVersion)
  
- task: TerraformTaskV1@0
  displayName: 'Terraform Init'
  inputs:
    backendServiceArm: $(subscription)
    backendAzureRmResourceGroupName: '$(terraformResourceGroup)'
    backendAzureRmStorageAccountName: '$(terraformStorageName)'
    backendAzureRmContainerName: '$(terraformContainerName)'
    backendAzureRmKey:  '$(terraformStateFilename)'
- task: TerraformTaskV1@0
  displayName: 'Terraform Plan'
  inputs:
    command: plan    
    environmentServiceNameAzureRM: $(subscription)
- task: TerraformTaskV1@0
  displayName: 'Terraform Apply'
  inputs:
    command: apply
    commandOptions: -auto-approve
    environmentServiceNameAzureRM: $(subscription)

Conclusion

I think this is a nice technique for using complex types in Azure Pipelines, whether for Terraform deployments or anything else that would benefit from the idea. This was a fun problem to solve, and I hope sharing it helps others who have encountered the same issue.

Architecture, Azure Pipelines, Diagrams

Azure Pipelines – Architecture Diagrams as Code

Over the past few years I have drawn many architecture diagrams in a variety of tools like draw.io, Visio, Lucidchart, etc. and have always found that many hours were spent on rework and updates.

I generally draw diagrams using the C4 model for detailed architecture, but I would still draw them by hand, completely forgetting that there is a lot of support for building diagrams as code. The first time I looked at the C4 model I used code to draw diagrams, so why had I stopped? To be honest, I have no idea, but it’s definitely time to embrace it again.

If you are not familiar with the C4 model, I suggest you check out the website, and I recommend Simon Brown’s Software Architecture for Developers books, which are available on Leanpub.

Getting Started

So, I recently embarked on creating diagrams as code using the C4 model and Structurizr. The first step is to create a free account on Structurizr.

Next, I needed to decide what language to use for my diagram as code. Structurizr supports a number of authoring languages: Java, .NET, TypeScript, PHP, Python, Go and, of course, its own DSL. I chose the Structurizr DSL for my diagram as it looked easy enough. For editing the code I used Visual Studio Code with a syntax-highlighting extension by Ciaran Treanor. The DSL demo page and language reference were really helpful to get started, as were the examples.

Building architecture diagrams using the C4 model is great; the DSL made it easy to build my diagram quickly, and using the demo page I could see what my diagram was going to look like.
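For a flavour of the DSL, a minimal workspace (based on the getting-started examples, not my actual diagram) looks something like this:

```
workspace {
    model {
        user = person "User"
        softwareSystem = softwareSystem "Software System"
        user -> softwareSystem "Uses"
    }
    views {
        systemContext softwareSystem {
            include *
            autoLayout
        }
    }
}
```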

Publish

Having created my diagram as code and added it to source control, I now needed to push my diagram to my Structurizr workspace. Structurizr has a CLI that you can use to do this.

structurizr-cli push -id <my workspace> -key <workspace key> -secret <workspaceSecret> -workspace mydiagram.dsl

The details for the workspace can be found on your Structurizr page by selecting ‘Show more’.

The CLI has other features, including exporting to different formats, e.g. Mermaid (for more details on the supported outputs, see the website).

structurizr-cli export -workspace mydiagram.dsl -format mermaid

Having published my diagram to my workspace, it would be really good to automate this process so any changes to the diagram get pushed.

Currently I am using Azure Pipelines a lot, so it seemed fitting to create this process using Azure Pipelines YAML.

Building the Pipeline

This should be straightforward: I just need to perform the same steps I did locally to push my diagram to Structurizr.

The build pipeline automatically checks out my code from my repo, so that step is done for me. The Microsoft-hosted build agents already have Homebrew installed, which makes installing the CLI simple.

- bash: |
    brew install structurizr-cli
  displayName: 'Install Structurizr'

Pushing to Structurizr needs a number of parameters which should be kept secret so adding them as secure variables in the Pipeline is one solution.

- bash: |
    structurizr-cli push -id $(workspaceId) -key $(workspaceKey) -secret $(workspaceSecret) -workspace $(workspaceFile)
  displayName: 'Push Diagram to Structurizr'

Enhancing the Pipeline

The diagram has been updated in Structurizr, but I usually also need images to add to a presentation or documentation. I could go to Structurizr and export the images for each diagram by hand, but that takes time and is not helpful if someone else needs them.

Fortunately, there are examples on GitHub of how to do this using Puppeteer with Structurizr, and the export of private diagrams worked out of the box with no modification.

The pipeline can now be updated to install Puppeteer:

- bash: |
   npm install puppeteer 
  displayName: 'Install Puppeteer'

And get the example from the Structurizr/Puppeteer repo

- bash: |
    git clone 'https://github.com/structurizr/puppeteer.git'
  displayName: "Get Structurizr Puppeteer"    

Using the example and providing the required details, PNG files can now be exported.

- bash: |    
    cd $(Build.ArtifactStagingDirectory)
    node $(System.DefaultWorkingDirectory)/puppeteer/export-private-diagrams.js $(workspaceUrl) $(workspaceUsername) $(workspacePassword) png $(workspaceId)
  displayName: 'Export Diagram from Structurizr'

Or SVG files.

- bash: |    
    cd $(Build.ArtifactStagingDirectory)
    node $(System.DefaultWorkingDirectory)/puppeteer/export-private-diagrams.js $(workspaceUrl) $(workspaceUsername) $(workspacePassword) svg $(workspaceId)
  displayName: 'Export Diagram from Structurizr'

The images are written to the ArtifactStagingDirectory and can now be published as an artifact.

- publish: '$(Build.ArtifactStagingDirectory)'
  displayName: Publish Diagrams
  artifact: 'mydiagrams'

The artifacts will be individual files, so it might be easier to zip them with the ArchiveFiles task for easier download from Azure Pipelines.

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
    archiveFile: '$(Build.ArtifactStagingDirectory)/diagrams.zip'
- publish: '$(Build.ArtifactStagingDirectory)/diagrams.zip'
  displayName: Publish Diagrams
  artifact: 'mydiagrams'

Final Pipeline

Putting all that together, my final pipeline looks like this:

trigger: 
- main

pr: none

variables:
  workspaceFile: 'mydiagram.dsl'
  isMain: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

jobs:
  - job: UpdateArchitecture
    displayName: Update Architecture
    condition: and(succeeded(), eq(variables.isMain, true))
    pool:
      vmImage: ubuntu-18.04
    steps:
      - bash: |
          brew install structurizr-cli
          brew info structurizr-cli
        displayName: 'Install Structurizr'
      - bash: |
          npm install puppeteer 
        displayName: 'Install Puppeteer'
      - bash: |
          git clone 'https://github.com/structurizr/puppeteer.git'
        displayName: 'Get Structurizr Puppeteer'
      - bash: |
          structurizr-cli push -id $(workspaceId) -key $(workspaceKey) -secret $(workspaceSecret) -workspace $(workspaceFile)
        displayName: 'Push Diagram to Structurizr'
      - bash: |    
          cd $(Build.ArtifactStagingDirectory)
          node $(System.DefaultWorkingDirectory)/puppeteer/export-private-diagrams.js $(workspaceUrl) '$(workspaceUsername)' '$(workspacePassword)' png $(workspaceId)
        displayName: 'Export Diagram from Structurizr'
      - publish: '$(Build.ArtifactStagingDirectory)'
        displayName: Publish Diagrams
        artifact: 'mydiagrams'

Final Thoughts

I started on a journey to automate building architecture diagrams and exporting the images, and this satisfies today’s needs; in the future I may need to enhance the pipeline to push the images to another system or export them in another format.

I will certainly be using this article to remind me about diagrams as code. I hope you consider diagrams as code for your own needs and that sharing this has been useful.

Happy diagramming.