Azure, Bicep, DevOps, IaC, Security

Security Fundamentals with Bicep

Being a Snyk Ambassador has been fun so far, and last year I had this very article published on the Snyk Blog, so I thought I would also share it here on my personal blog.

If you want to know more about the Snyk Ambassador Program, head over to the website. It's a great way to meet like-minded people who have a passion for application security.

So on to the post: you can click the link to the Snyk Blog, or you can continue reading 🙂

Azure Bicep is getting more popular by the day and is rapidly becoming the replacement for ARM templates. In this post I am going to go over some security fundamentals when using Bicep.

If you are not familiar with Bicep then I recommend taking a look at the Microsoft Learn Documentation to find out more.

Keep Secrets out of Source Control

We all know we want to keep our secrets out of source control, but it is very easy to accidentally leave secrets in files, especially when testing out your Bicep configurations locally.

Some ways to avoid this are:

  • Pass parameters in via command line
  • Use a parameters JSON file that is ignored by source control, for example by adding it to your .gitignore file (if you are using Git); see the sketch after this list
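A minimal sketch of the first option, with hypothetical resource group and file names, passes the secret on the command line so it never lives in a file:

az deployment group create -g <resource-group> -f deploy.bicep --parameters adminPassword='<your-password>'

For the second option, a .gitignore entry such as *.parameters.local.json will keep a local parameters file out of Git.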

Secure Inputs

Passing in parameters from the outside is one thing, but how do you make sure secrets are kept secure and not displayed in logs or outputs?

Bicep provides an @secure() decorator for string and object type parameters, e.g.

@secure()
param adminPassword string

@secure()
param adminCredentials object

Be Careful of Outputs

Adding outputs to your Bicep modules is very useful but there are a few things to be aware of:

If you are setting an output that looks like a secret, then Bicep will provide a warning that you are exposing potential secrets.

The following output for a connection string to a Storage Account would trigger such a warning:

output connection string = 'DefaultEndpointsProtocol=https;AccountName=${storageaccount.name};EndpointSuffix=${environment().suffixes.storage};AccountKey=${listKeys(storageaccount.id, storageaccount.apiVersion).keys[0].value}'

However, if the value was added to a variable before being assigned to the output, then no warning would be shown, and this would be easy to miss:

var connectionString = 'DefaultEndpointsProtocol=https;AccountName=${storageaccount.name};EndpointSuffix=${environment().suffixes.storage};AccountKey=${listKeys(storageaccount.id, storageaccount.apiVersion).keys[0].value}'

output connection string = connectionString

Now let’s see what happens if a storage account resource is deployed to Azure using the following configuration

deploy.bicep

param location string = resourceGroup().location
param tags object = {}
param storageName string = 'stsecureteststore'
param sku string = 'Standard_LRS'

module storageModule 'modules/storage.bicep' = {
  name: 'StorageDeploy'
  params: {
    location: location
    storageName: storageName
    tags: tags
    sku: sku
  }
}

modules/storage.bicep

@description('The storage account name')
@minLength(3)
@maxLength(24)
param storageName string
@description('The storage account location')
param location string
@description('The tags for the storage account')
param tags object
@description('The storage account sku') 
@allowed([ 'Standard_LRS', 'Standard_GRS', 'Standard_GZRS', 'Standard_RAGRS', 'Standard_RAGZRS', 'Standard_ZRS', 'Premium_LRS', 'Premium_ZRS' ])
param sku string = 'Standard_LRS'
@description('The access tier for the blob services') 
@allowed([ 'Hot', 'Cool' ]) 
param accessTier string = 'Hot' 
@description('Allow public access to blobs') 
param allowBlobPublicAccess bool = false 

resource storageaccount 'Microsoft.Storage/storageAccounts@2022-05-01' = {
  name: storageName
  location: location
  kind: 'StorageV2'
  tags: tags
  sku: {
    name: sku
  }
  properties: {
    supportsHttpsTrafficOnly: true
    minimumTlsVersion: 'TLS1_2'
    accessTier: accessTier
    allowBlobPublicAccess: allowBlobPublicAccess
  }
}

var connectionString = 'DefaultEndpointsProtocol=https;AccountName=${storageaccount.name};EndpointSuffix=${environment().suffixes.storage};AccountKey=${listKeys(storageaccount.id, storageaccount.apiVersion).keys[0].value}'
output connection string = connectionString

Any outputs defined in Bicep can be seen under Deployments for the resource group the resources have been deployed to.

Looking at the StorageDeploy outputs, the connection is shown including the account key in plain text.

This means anyone with access to view the resources in the Azure Portal can see these outputs. To maintain a good security posture, it is recommended not to return secrets as outputs in Bicep.
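A safer pattern is to output only non-secret identifiers and let consumers look up the keys themselves (as shown in the next section). For example, returning just the account name from the storage module raises no warning and exposes nothing sensitive:

output storageAccountName string = storageaccount.name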

Hopefully Bicep will support the use of the @secure decorator for outputs in the future to make returning secrets safe and secure.

Secrets from Resources

If returning secrets from Bicep is a problem, then how do you get secrets from one module to another?

One option is to access an existing resource by using the existing keyword, e.g.

param storageName string

resource storageaccount 'Microsoft.Storage/storageAccounts@2022-05-01' existing = {
  name: storageName  
}

var connectionString = 'DefaultEndpointsProtocol=https;AccountName=${storageName};EndpointSuffix=${environment().suffixes.storage};AccountKey=${listKeys(storageaccount.id, storageaccount.apiVersion).keys[0].value}'

This connection string could then be used as an input for another resource.
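For instance, here is a minimal sketch where the consuming module and its storageConnectionString parameter are hypothetical (the receiving parameter should itself be declared with @secure()):

module appModule 'modules/app.bicep' = {
  name: 'AppDeploy'
  params: {
    storageConnectionString: connectionString
  }
}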

Secrets from Key Vault

Getting existing resources is one way of obtaining secrets, but there is also support for using a Key Vault to retrieve secrets.

Note: Make sure that the Key Vault Access Configuration allows access via “Azure Resource Manager for template deployment”
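If you manage the vault with the Azure CLI, this setting can be turned on like so (the vault name is a placeholder):

az keyvault update --name <your-key-vault> --enabled-for-template-deployment true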

Key Vaults are accessed in the same way as in the previous section, by use of the existing keyword. One caveat to note, however, is that the getSecret method can only be used when assigning to a module parameter that has the @secure() decorator, e.g.

deploy.bicep

param location string = resourceGroup().location
param tags object
param sqlServerName string
param keyVaultName string
param keyVaultResourceGroupName string
param subscriptionId string = subscription().subscriptionId

resource vaultResource 'Microsoft.KeyVault/vaults@2022-07-01' existing = {
  name: keyVaultName 
  scope: resourceGroup(subscriptionId, keyVaultResourceGroupName)
}

module sqlModule 'modules/sql.bicep' = {
  name: 'SqlDeploy'
  params: {
    location: location
    tags: tags
    sqlServerName: sqlServerName
    administratorLogin: vaultResource.getSecret('sqlUser')
    administratorLoginPassword: vaultResource.getSecret('sqlPassword')
  }  
}

modules/sql.bicep

@description('The resource location')
param location string
@description('The tags for the resources')
param tags object
@description('The name for the SQL Server')
param sqlServerName string
@secure()
@description('The SQL Administrator Login')
param administratorLogin string
@secure()
@description('The SQL Administrator password')
param administratorLoginPassword string

resource sqlServerResource 'Microsoft.Sql/servers@2022-05-01-preview' = {
  name: sqlServerName
  location: location
  tags: tags
  properties: {
    administratorLogin: administratorLogin
    administratorLoginPassword: administratorLoginPassword
  }
}

Scanning Bicep

Scanning of IaC (Infrastructure as Code) is becoming quite popular, and it is good to see that there is interest in finding security issues as early as possible.

Snyk has a free CLI that can be used to perform IaC scans locally against security and compliance standards. While it does not directly support the Bicep format, it does support scanning the ARM templates that Bicep compiles down to.

To compile Bicep to ARM you need to have the Bicep CLI installed. To get started with the Snyk CLI, create a free account and then install the Snyk CLI using npm. If you have Node.js installed locally, you can install it by running:

npm install snyk@latest -g
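Once the CLI is installed, you can link it to your Snyk account by running the following, which opens a browser to authenticate:

snyk auth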

With everything set up, you can then run the command

az bicep build -f {file_name}.bicep

This will produce a JSON file with the same name as the Bicep file, and then you can run the Snyk scan with the command

snyk iac test {file_name}.json
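Putting the two steps together with the deploy.bicep file from earlier, a full scan looks like this:

az bicep build -f deploy.bicep
snyk iac test deploy.json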

Final Thoughts

Security is something we all have to think about, and it's a constantly moving target, but the more we learn the more we can do to help secure our resources. I hope this post has been informative and has provided some insights into securing your Bicep configurations.

Azure, Azure Pipelines, DevOps, Security

Create Issues in Azure DevOps via Snyk API

Snyk is a great tool for scanning your code and containers for vulnerabilities. Snyk is constantly evolving and adding new features and integrations, so if you haven't checked out the Snyk website, I highly recommend you do so. There is also a free tier for you to get started.

One of the features is Jira integration, which allows you to create a Jira issue from within Snyk. If you use Jira then I can see the benefit of this, but what if you use Azure DevOps, or you want to automate the issue creation?

This post goes through using an Azure Logic App to create an issue in Azure DevOps when a new issue is discovered (Note: the process works for Jira too, just use the Azure Logic App Jira connector).

To use the Snyk API you will need to be on the Business Plan or above (at the time of writing), which then allows the ability to add a webhook to receive events.


So the flow of the Logic App is something like this:

On enable, the Logic App will register itself as a webhook with your Snyk account, and on disable it will unregister the webhook.

Let's build the Logic App. Step one: create a new Logic App in Azure.

Once that has been created, select a blank logic app and find the HTTP Webhook trigger

As detailed in the Snyk API documentation, set the subscribe method to POST and the URI to the webhooks endpoint with your organization Id (this can be found on your Snyk account under Org Settings -> General)
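Based on the unsubscribe expression used further below, the subscribe URI has this shape:

https://snyk.io/api/v1/org/<your org id>/webhooks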

Add the subscribe body, which includes a secret defined by you and the URL of the Logic App (for the URL select the Callback Url in dynamic content)
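Per the Snyk API documentation, the subscribe body is a small JSON object (values here are placeholders):

{
  "url": "<Callback Url from dynamic content>",
  "secret": "<your secret>"
}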

Now set the unsubscribe method to DELETE, use an expression for the URI, and leave the Body blank

concat('https://snyk.io/api/v1/org/<your org id>/webhooks/',triggerOutputs().subscribe.body.id)

Now add new parameters for Subscribe – Headers and Unsubscribe – Headers

The Authorization value in both headers should be set to your API token (this can be found on your Snyk account under Account Settings -> API Token)
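The Snyk v1 API uses the token scheme, so each Authorization header value looks something like:

Authorization: token <your api token>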

When registering the webhook, Snyk sends a ping event, which is identified by the X-Snyk-Event header. We really don't want to run the rest of the workflow when this happens, so we can add a condition to terminate.

Select New Step then find Control and select it

Then select Condition

For the value, use an expression to get the X-Snyk-Event header

@triggerOutputs()?['headers']?['X-Snyk-Event']

and then make the condition check that it contains the word ping (the actual value is ping plus a version, e.g. ping/v0)

Now, on the True side, add a Terminate action and set its status to Cancelled.

If the message is anything other than a ping, then we want to continue processing the response. We will want to validate that the message coming in is from Snyk and is intended for us: Snyk creates a signature using our custom secret and adds it to the header under X-Hub-Signature. To perform this validation we can use an Azure Function.

You can create an Azure Function via the Portal; I suggest you use Linux as the OS.

Using Visual Studio Code with the Functions Runtime installed, you can create and deploy the following function. If you are not sure how to do this, take a look at the Microsoft Docs; they are really helpful.

I named the function ValidateRequest and used some code from the Snyk API documentation to perform the validation and return either OK (200) or Bad Request (400)

const crypto = require('crypto');

module.exports = async function (context, req) {

    context.log('JavaScript HTTP trigger function processed a request.');

    // The secret forwarded by the Logic App and the signature Snyk attached
    const secret = req.headers['x-logicapp-secret'];
    const hubSignature = req.headers['x-hub-signature'];

    // Recompute the HMAC-SHA256 signature of the payload using our secret
    const hmac = crypto.createHmac('sha256', secret);
    const buffer = JSON.stringify(req.body);
    hmac.update(buffer, 'utf8');
    const signature = `sha256=${hmac.digest('hex')}`;

    // The request is only valid if the signatures match
    const isValid = signature === hubSignature;

    context.res = {
        status: isValid ? 200 : 400,
        body: isValid
    };
};

Now that the Function is deployed, we can add the next step to the Logic App.

Select the function app we created previously

And select the function we deployed previously

Now we need to pass the payload from the webhook into our ValidateRequest function

Add additional parameters for the method and headers

Set the method to POST and switch the headers to text mode

Then add the following expression, which passes through the request headers plus an additional one containing your secret

addProperty(triggerOutputs()['headers'], 'X-LogicApp-Secret', '<your secret>')

If the check is successful, the next step is to parse the JSON and loop through the new issues.

For the Content, add the payload as you did previously for the validate function, and for the Schema add the following

{
    "properties": {
        "group": {
            "properties": {},
            "type": "object"
        },
        "newIssues": {
            "type": "array"
        },
        "org": {
            "properties": {},
            "type": "object"
        },
        "project": {
            "properties": {},
            "type": "object"
        },
        "removedIssues": {
            "type": "array"
        }
    },
    "type": "object"
}
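For reference, a single entry in the newIssues array has a shape something like this, trimmed to just the fields used in the steps below (values are illustrative):

{
    "issueData": {
        "severity": "high",
        "title": "<issue title>",
        "description": "<issue description>"
    }
}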

Next we need to loop through the new issues: add a new action using the For each control and select newIssues.

For this I am only interested in the high severity issues, so we need to add another condition using the following expression for the value, checking it is equal to high (Note: I renamed the condition to Severity)

items('For_each')?['issueData']?['severity']

Now, if the severity is high, create a work item using the built-in Azure DevOps connector.

This will ask you to sign in

Once signed in you can set the details of the Azure DevOps organization, project and work item type

To add the details from the issue as the title and description use the following expressions

items('For_each')?['issueData']?['title']
items('For_each')?['issueData']?['description']

And that is the Logic App complete. A created work item, with the title and description fields from the payload, looks like this

Perhaps the formatting could do with some work, but the information is there and the workflow works.

Although I used Azure DevOps as the output, there is a Jira connector that will allow the creation of an issue in Jira.

Once you are happy everything is running, there is one last step: securing the secrets inside the Logic App so they are not visible in the run history.

For the webhook select the settings and turn on Secure Inputs and Secure Outputs

and for the ValidateRequest function turn on at least Secure Inputs.

I find Azure Logic Apps a great way to connect systems together for these types of workflows because it has so many connectors.

NOTE: If you run the Logic App via Run Trigger, it will fail when looking for the X-Snyk-Event header. Disable and enable the Logic App to register the connection with the API.

I hope this helps others integrate Snyk with their workflows, and I can't wait to see what other features the API will provide in the future.