DevOps, GitHub Actions, Security

Building Securely with Nuke.Build: Integrating Snyk Scans for .NET Projects

Introduction

As a .NET developer familiar with crafting CI/CD pipelines in YAML using platforms like Azure Pipelines or GitHub Actions, my recent discovery of Nuke.Build sparked considerable interest. This alternative offers the ability to build pipelines in the familiar C# language, complete with IntelliSense for auto-completion, and the flexibility to run and debug pipelines locally—a notable departure from the constraints of YAML pipelines, which often rely on remote build agents.

While exploring Nuke.Build’s developer-centric features, it became apparent that this tool not only enhances the developer experience but also provides an opportunity to seamlessly integrate security practices into the development workflow. As someone deeply invested in promoting developer-first application security, the prospect of incorporating security scans directly into the development lifecycle resonated strongly with me, aligning perfectly with my desire for rapid feedback on application security.

Given my role as a Snyk Ambassador, it was only natural to explore how I could leverage Snyk’s robust security scanning capabilities within the Nuke.Build pipeline to bolster the security posture of .NET projects.

In this blog post, I’ll demonstrate the creation of a pipeline using Nuke.Build and showcase the addition of Snyk scan capability, along with Software Bill of Materials (SBOM) generation. Through this proactive approach, we’ll highlight the ease of integrating a layer of security within the development lifecycle.

Getting Started

Following the Getting Started Guide on the Nuke.Build website, I swiftly integrated a build project into my solution. For the sake of simplicity, I opted to consolidate the .NET Restore, Build, and Test actions into a single target, along with adding a Clean target.

In Nuke.Build, a “target” is an individual build step; targets can be executed independently or chained in sequence through dependencies. By combining multiple actions into a single target, I aimed to streamline the build process and eliminate unnecessary complexity.
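For illustration only, here is a hypothetical standalone target (not part of the final build) showing the shape of a target declaration:

// A purely illustrative target: DependsOn makes Nuke run Clean first,
// and Executes holds the work the step performs.
Target Hello => _ => _
    .DependsOn(Clean)
    .Executes(() => Serilog.Log.Information("Hello from Nuke"));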

The resulting build code looks like this:

using Nuke.Common;
using Nuke.Common.IO;
using Nuke.Common.ProjectModel;
using Nuke.Common.Tools.DotNet;

class Build : NukeBuild
{
    public static int Main() => Execute<Build>(x => x.BuildTestCode);

    [Parameter("Configuration to build - Default is 'Debug' (local) or 'Release' (server)")]
    readonly Configuration Configuration = IsLocalBuild ? Configuration.Debug : Configuration.Release;

    [Solution(GenerateProjects = true)] readonly Solution Solution;

    AbsolutePath SourceDirectory => RootDirectory / "src";
    AbsolutePath TestsDirectory => RootDirectory / "tests";

    Target Clean => _ => _
        .Executes(() =>
        {
            SourceDirectory.GlobDirectories("*/bin", "*/obj").DeleteDirectories();
        });

    Target BuildTestCode => _ => _
        .DependsOn(Clean)
        .Executes(() =>
        {
            DotNetTasks.DotNetRestore(_ => _
                .SetProjectFile(Solution)
            );
            DotNetTasks.DotNetBuild(_ => _
                .EnableNoRestore()
                .SetProjectFile(Solution)
                .SetConfiguration(Configuration)
                .SetProperty("SourceLinkCreate", true)
            );
            DotNetTasks.DotNetTest(_ => _
                .EnableNoRestore()
                .EnableNoBuild()
                .SetConfiguration(Configuration)
                .SetTestAdapterPath(TestsDirectory / "*.Tests")
            );
        });
}

Running nuke from my Windows terminal built the solution and ran the tests.

Adding Security Scans

With the basic pipeline set up to build the code and run the tests, the next step is to integrate the Snyk scan into the pipeline. While Nuke supports a variety of CLI tools out of the box, Snyk is unfortunately not among them.

To begin, you’ll need to create a free account on the Snyk platform if you haven’t already done so. Once registered, you can install the Snyk CLI using npm. If you have Node.js installed locally, run:

npm install snyk@latest -g

Given that the Snyk CLI isn’t directly supported by Nuke, I turned to the Nuke documentation to explore possible solutions for running the CLI. Two options caught my attention: PowerShellTasks and DockerTasks.

To execute the necessary tasks for the Snyk scan, a few steps are required. These include authorizing a connection to Snyk, performing an open-source scan, potentially conducting a code scan, and generating a Software Bill of Materials (SBOM).

Let’s delve into each of these tasks using PowerShellTasks in Nuke. Firstly, let’s tackle authorization. The CLI command for authorization is:

snyk auth

Running this command typically opens a web browser to the Snyk platform, allowing you to authorize access. However, this method isn’t suitable for automated builds on a remote agent. Instead, we need to provide credentials. If you’re using a free account, your user will have an API Token available, which you can find on your account settings page under “API Token.” For enterprise accounts, you can create a service account specifically for this purpose.

To incorporate the Snyk Token into our application, let’s add a parameter to the code:

[Parameter("Snyk Token to interact with the API")] readonly string SnykToken;

Next, we’ll create a new target to execute the authorization command using PowerShellTasks and pass in the Snyk Token:

Target SnykAuth => _ => _
    .DependsOn(BuildTestCode)
    .Executes(() =>
    {          
        PowerShellTasks.PowerShell(_ => _
            .SetCommand("npm install snyk@latest -g")
        );
        PowerShellTasks.PowerShell(_ => _
            .SetCommand($"snyk auth {SnykToken}")
        );
    });

NOTE: This assumes that the build agent does not have the Snyk CLI installed; a guarded variant is sketched below.
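If some of your agents already ship with the Snyk CLI, a small guard avoids the redundant install. A minimal sketch, assuming PowerShell's Get-Command is available on the agent:

Target SnykAuth => _ => _
    .DependsOn(BuildTestCode)
    .Executes(() =>
    {
        // Only install the CLI when it isn't already on the PATH.
        PowerShellTasks.PowerShell(_ => _
            .SetCommand("if (-not (Get-Command snyk -ErrorAction SilentlyContinue)) { npm install snyk@latest -g }")
        );
        PowerShellTasks.PowerShell(_ => _
            .SetCommand($"snyk auth {SnykToken}")
        );
    });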

With authorization complete, our next task is to add a target for the Snyk Open Source scan, ensuring it depends on the Snyk Auth target:

Target SnykTest => _ => _
    .DependsOn(SnykAuth)
    .Executes(() =>
    {
        // Snyk Test
        PowerShellTasks.PowerShell(_ => _
          .SetCommand("snyk test --all-projects --exclude=build")
        );
    });

Including the --all-projects flag ensures that every project in the repository is scanned, which is good practice for .NET solutions. Additionally, I’ve added an exclusion for the build project to keep the scan focused on application issues. I typically rely on Snyk Monitor attached to my GitHub repo to detect issues across the entire repository, leaving this scan to concentrate solely on the application being deployed.

Finally, we need to update the Execute method to include the Snyk Test:

public static int Main() => Execute<Build>(x => x.BuildTestCode, x => x.SnykTest);

Running nuke again from the Windows terminal now prompts for Snyk authentication in the browser and, once authenticated, continues with the scan. To prevent this interactive prompt, we need to pass the API token value to nuke. It’s a good idea to set an environment variable for your API token, e.g. with PowerShell:

$env:snykApiToken = "<your api token>"
# or using the Snyk CLI
$env:snykApiToken = snyk config get api
# Run nuke passing in the parameter
nuke --snykToken $snykApiToken

Upon executing the scan, it promptly identified several issues. The flagged vulnerabilities caused snyk test to exit with a non-zero code, which in turn failed the SnykTest target.

To control whether the build fails based on the severity of vulnerabilities found, we can add another parameter:

 [Parameter("Snyk Severity Threshold (critical, high, medium or low)")] readonly string SnykSeverityThreshold = "high";

The value must be set before it is used, which we can enforce with .Requires(); and since Snyk expects the threshold in lowercase, the value is normalised with ToLowerInvariant():

Target SnykTest => _ => _
    .DependsOn(SnykAuth)
    .Requires(() => SnykSeverityThreshold)
    .Executes(() =>
    {
        // Snyk Test
        PowerShellTasks.PowerShell(_ => _
          .SetCommand($"snyk test --all-projects --exclude=build --severity-threshold={SnykSeverityThreshold.ToLowerInvariant()}")
        );
    });

Now, let’s address running Snyk Code for a SAST scan, which will also need a parameter to control the severity threshold:

[Parameter("Snyk Code Severity Threshold (high, medium or low)")] readonly string SnykCodeSeverityThreshold = "high";

We’ll create another target for the Code test:

Target SnykCodeTest => _ => _
    .DependsOn(SnykAuth)
    .Requires(() => SnykCodeSeverityThreshold)
    .Executes(() =>
    {
        PowerShellTasks.PowerShell(_ => _
            .SetCommand($"snyk code test --all-projects --exclude=build --severity-threshold={SnykCodeSeverityThreshold.ToLowerInvariant()}")
        );
    });

Update the Execute method to include the code test:

public static int Main() => Execute<Build>(x => x.BuildTestCode, x => x.SnykTest, x => x.SnykCodeTest);

With the severity set for both scans, SnykTest continues to find high-severity vulnerabilities, while SnykCodeTest passes.

To generate an SBOM (Software Bill of Materials) using Snyk and publish it as a build artifact, let’s add an output path:

AbsolutePath OutputDirectory => RootDirectory / "outputs";

And include a Produces entry to ensure the artifact is generated and stored in the specified directory:

Target GenerateSbom => _ => _
   .DependsOn(SnykAuth)
   .Produces(OutputDirectory / "*.json")
   .Executes(() =>
   {
       OutputDirectory.CreateOrCleanDirectory();
       PowerShellTasks.PowerShell(_ => _
           .SetCommand($"snyk sbom --all-projects --format spdx2.3+json --json-file-output={OutputDirectory / "sbom.json"}")
       );
   });

Lastly, update the Execute method to include the generation of the SBOM:

public static int Main() => Execute<Build>(x => x.BuildTestCode, x => x.SnykTest, x => x.SnykCodeTest, x => x.GenerateSbom);

Now, when executing Nuke, the SBOM will be generated and stored in the specified directory, ready to be published as a build artifact.
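If you want a quick sanity check that the SBOM actually contains packages before publishing it, a few extra lines at the end of the GenerateSbom target can parse the file. A sketch, assuming the SPDX 2.3 JSON layout with its top-level "packages" array:

// Hypothetical sanity check on the generated SBOM.
var sbomFile = OutputDirectory / "sbom.json";
using var sbom = System.Text.Json.JsonDocument.Parse(System.IO.File.ReadAllText(sbomFile));
var packageCount = sbom.RootElement.GetProperty("packages").GetArrayLength();
Serilog.Log.Information("SBOM contains {Count} packages", packageCount);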

Earlier, I mentioned that PowerShellTasks and DockerTasks were both viable options for integrating the Snyk CLI into the Nuke build. Here’s how you can achieve the same tasks using DockerTasks:

using Nuke.Common.Tools.Docker;
...
  Target SnykTest => _ => _
     .DependsOn(BuildTestCode)
     .Requires(() => SnykToken, () => SnykSeverityThreshold)
     .Executes(() =>
     {
         // Snyk Test
         DockerTasks.DockerRun(_ => _
             .EnableRm()
             .SetVolume($"{RootDirectory}:/app")
             .SetEnv($"SNYK_TOKEN={SnykToken}")
             .SetImage("snyk/snyk:dotnet")
             .SetCommand($"snyk test --all-projects --exclude=build --severity-threshold={SnykSeverityThreshold.ToLowerInvariant()}")
         );
     });
 Target SnykCodeTest => _ => _
     .DependsOn(BuildTestCode)
     .Requires(() => SnykToken, () => SnykCodeSeverityThreshold)
     .Executes(() =>
     {
         DockerTasks.DockerRun(_ => _
             .EnableRm()
             .SetVolume($"{RootDirectory}:/app")
             .SetEnv($"SNYK_TOKEN={SnykToken}")
             .SetImage("snyk/snyk:dotnet")
             .SetCommand($"snyk code test --all-projects --exclude=build --severity-threshold={SnykCodeSeverityThreshold.ToLowerInvariant()}")
         );
     });
 Target GenerateSbom => _ => _
     .DependsOn(BuildTestCode)
     .Produces(OutputDirectory / "*.json")
     .Requires(() => SnykToken)
     .Executes(() =>
     {
         OutputDirectory.CreateOrCleanDirectory();
         DockerTasks.DockerRun(_ => _
             .EnableRm()
             .SetVolume($"{RootDirectory}:/app")
             .SetEnv($"SNYK_TOKEN={SnykToken}")
             .SetImage("snyk/snyk:dotnet")
             .SetCommand($"snyk sbom --all-projects --format spdx2.3+json --json-file-output={OutputDirectory.Name}/sbom.json")
         );
     });

NOTE: Snyk auth is not required as a separate target because the SNYK_TOKEN environment variable passed to each container authenticates the CLI. Note also that the SBOM output path uses OutputDirectory.Name, a path relative to the mounted volume, since the host’s absolute path does not exist inside the container.
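Incidentally, the same environment variable works with the PowerShell approach, so the separate SnykAuth target could be dropped there too. A minimal sketch, assuming the CLI is already installed on the agent:

// Sketch: authenticate via SNYK_TOKEN for the duration of the command,
// instead of running snyk auth beforehand.
PowerShellTasks.PowerShell(_ => _
    .SetCommand($"$env:SNYK_TOKEN = '{SnykToken}'; snyk test --all-projects --exclude=build")
);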

Automating Nuke.Build with GitHub Actions: Generating YAML

Nuke comes with another useful feature: the ability to see a plan, which shows which targets are executed and in what order. Simply running nuke --plan produces an HTML visualisation of the plan.

With everything configured for local execution, it’s time to think about running this in a pipeline. Nuke supports various CI platforms, but for this demonstration I’ll be using GitHub Actions. Nuke provides attributes to automatically generate the workflow file that runs the build:

using Nuke.Common.CI.GitHubActions;

[GitHubActions(
    "continuous",
    GitHubActionsImage.UbuntuLatest,
    On = new[] { GitHubActionsTrigger.Push },
    ImportSecrets = new[] { nameof(SnykToken), nameof(SnykSeverityThreshold), nameof(SnykCodeSeverityThreshold) },
    InvokedTargets = new[] { nameof(BuildTestCode), nameof(SnykTest), nameof(SnykCodeTest), nameof(GenerateSbom) })]
class Build : NukeBuild
...

To pass in the parameters for GitHub Actions, we’ll need to designate the token as a Secret:

[Parameter("Snyk Token to interact with the API")][Secret] readonly string SnykToken;

Next, let’s remove the default values for the threshold parameters:

[Parameter("Snyk Severity Threshold (critical, high, medium or low)")] readonly string SnykSeverityThreshold;
[Parameter("Snyk Code Severity Threshold (high, medium or low)")] readonly string SnykCodeSeverityThreshold;

We’ll then set the values in the .nuke/parameters.json file:

{
  "$schema": "./build.schema.json",
  "Solution": "Useful.Extensions.sln",
  "SnykSeverityThreshold": "high",
  "SnykCodeSeverityThreshold": "high"
}

Running Nuke again produces the following auto-generated GitHub Actions YAML in the file .github/workflows/continuous.yml:

# ------------------------------------------------------------------------------
# <auto-generated>
#
#     This code was generated.
#
#     - To turn off auto-generation set:
#
#         [GitHubActions (AutoGenerate = false)]
#
#     - To trigger manual generation invoke:
#
#         nuke --generate-configuration GitHubActions_continuous --host GitHubActions
#
# </auto-generated>
# ------------------------------------------------------------------------------

name: continuous

on: [push]

jobs:
  ubuntu-latest:
    name: ubuntu-latest
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: 'Cache: .nuke/temp, ~/.nuget/packages'
        uses: actions/cache@v3
        with:
          path: |
            .nuke/temp
            ~/.nuget/packages
          key: ${{ runner.os }}-${{ hashFiles('**/global.json', '**/*.csproj', '**/Directory.Packages.props') }}
      - name: 'Run: BuildTestCode, SnykTest, SnykCodeTest, GenerateSbom'
        run: ./build.cmd BuildTestCode SnykTest SnykCodeTest GenerateSbom
        env:
          SnykToken: ${{ secrets.SNYK_TOKEN }}
          SnykSeverityThreshold: ${{ secrets.SNYK_SEVERITY_THRESHOLD }}
          SnykCodeSeverityThreshold: ${{ secrets.SNYK_CODE_SEVERITY_THRESHOLD }}
      - name: 'Publish: outputs'
        uses: actions/upload-artifact@v3
        with:
          name: outputs
          path: outputs

This YAML file is automatically generated by Nuke and is ready to be used in your GitHub Actions workflow. It sets up the necessary steps to run your build, including caching dependencies, executing targets, and publishing artifacts.

NOTE: When I first committed the Nuke build files, GitHub Actions gave me a permission denied error when running build.cmd. Running these commands and committing the result got past that problem:

git update-index --chmod=+x .\build.cmd
git update-index --chmod=+x .\build.sh

The output of the GitHub Actions run for this pipeline showed the Snyk scans failing at first; after fixing the vulnerabilities in my code, the workflow passed.

Here’s the full C# source code for both versions. First, the PowerShell version:

using Nuke.Common;
using Nuke.Common.CI.GitHubActions;
using Nuke.Common.IO;
using Nuke.Common.ProjectModel;
using Nuke.Common.Tools.DotNet;
using Nuke.Common.Tools.PowerShell;

[GitHubActions(
    "continuous",
    GitHubActionsImage.UbuntuLatest,
    On = new[] { GitHubActionsTrigger.Push },
    ImportSecrets = new[] { nameof(SnykToken), nameof(SnykSeverityThreshold), nameof(SnykCodeSeverityThreshold) },
    InvokedTargets = new[] { nameof(BuildTestCode), nameof(SnykTest), nameof(SnykCodeTest), nameof(GenerateSbom) })]
class Build : NukeBuild
{
    public static int Main() => Execute<Build>(x => x.BuildTestCode, x => x.SnykTest, x => x.SnykCodeTest, x => x.GenerateSbom);

    [Parameter("Configuration to build - Default is 'Debug' (local) or 'Release' (server)")]
    readonly Configuration Configuration = IsLocalBuild ? Configuration.Debug : Configuration.Release;

    [Parameter("Snyk Token to interact with the API")][Secret] readonly string SnykToken;
    [Parameter("Snyk Severity Threshold (critical, high, medium or low)")] readonly string SnykSeverityThreshold;
    [Parameter("Snyk Code Severity Threshold (high, medium or low)")] readonly string SnykCodeSeverityThreshold;

    [Solution(GenerateProjects = true)] readonly Solution Solution;

    AbsolutePath SourceDirectory => RootDirectory / "src";
    AbsolutePath TestsDirectory => RootDirectory / "tests";
    AbsolutePath OutputDirectory => RootDirectory / "outputs";

    Target Clean => _ => _
        .Executes(() =>
        {
            SourceDirectory.GlobDirectories("*/bin", "*/obj").DeleteDirectories();
        });

    Target BuildTestCode => _ => _
        .DependsOn(Clean)
        .Executes(() =>
        {
            DotNetTasks.DotNetRestore(_ => _
                .SetProjectFile(Solution)
            );
            DotNetTasks.DotNetBuild(_ => _
                .EnableNoRestore()
                .SetProjectFile(Solution)
                .SetConfiguration(Configuration)
                .SetProperty("SourceLinkCreate", true)
            );
            DotNetTasks.DotNetTest(_ => _
                .EnableNoRestore()
                .EnableNoBuild()
                .SetConfiguration(Configuration)
                .SetTestAdapterPath(TestsDirectory / "*.Tests")
            );
        });
    Target SnykAuth => _ => _
     .DependsOn(BuildTestCode)
     .Executes(() =>
     {
         PowerShellTasks.PowerShell(_ => _
             .SetCommand("npm install snyk@latest -g")
         );
         PowerShellTasks.PowerShell(_ => _
             .SetCommand($"snyk auth {SnykToken}")
         );
     });
    Target SnykTest => _ => _
        .DependsOn(SnykAuth)
        .Requires(() => SnykSeverityThreshold)
        .Executes(() =>
        {
            PowerShellTasks.PowerShell(_ => _
                .SetCommand($"snyk test --all-projects --exclude=build --severity-threshold={SnykSeverityThreshold.ToLowerInvariant()}")
            );
        });
    Target SnykCodeTest => _ => _
        .DependsOn(SnykAuth)
        .Requires(() => SnykCodeSeverityThreshold)
        .Executes(() =>
        {
            PowerShellTasks.PowerShell(_ => _
                .SetCommand($"snyk code test --all-projects --exclude=build --severity-threshold={SnykCodeSeverityThreshold.ToLowerInvariant()}")
            );
        });
    Target GenerateSbom => _ => _
        .DependsOn(SnykAuth)
        .Produces(OutputDirectory / "*.json")
        .Executes(() =>
        {
            OutputDirectory.CreateOrCleanDirectory();
            PowerShellTasks.PowerShell(_ => _
                .SetCommand($"snyk sbom --all-projects --format spdx2.3+json --json-file-output={OutputDirectory / "sbom.json"}")
            );
        });
}

And here is the Docker version:

using Nuke.Common;
using Nuke.Common.CI.GitHubActions;
using Nuke.Common.IO;
using Nuke.Common.ProjectModel;
using Nuke.Common.Tools.Docker;
using Nuke.Common.Tools.DotNet;

[GitHubActions(
    "continuous",
    GitHubActionsImage.UbuntuLatest,
    On = new[] { GitHubActionsTrigger.Push },
    ImportSecrets = new[] { nameof(SnykToken), nameof(SnykSeverityThreshold), nameof(SnykCodeSeverityThreshold) },
    InvokedTargets = new[] { nameof(BuildTestCode), nameof(SnykTest), nameof(SnykCodeTest), nameof(GenerateSbom) })]
class Build : NukeBuild
{
    public static int Main() => Execute<Build>(x => x.BuildTestCode, x => x.SnykTest, x => x.SnykCodeTest, x => x.GenerateSbom);

    [Parameter("Configuration to build - Default is 'Debug' (local) or 'Release' (server)")]
    readonly Configuration Configuration = IsLocalBuild ? Configuration.Debug : Configuration.Release;

    [Parameter("Snyk Token to interact with the API")][Secret] readonly string SnykToken;
    [Parameter("Snyk Severity Threshold (critical, high, medium or low)")] readonly string SnykSeverityThreshold;
    [Parameter("Snyk Code Severity Threshold (high, medium or low)")] readonly string SnykCodeSeverityThreshold;

    [Solution(GenerateProjects = true)] readonly Solution Solution;

    AbsolutePath SourceDirectory => RootDirectory / "src";
    AbsolutePath TestsDirectory => RootDirectory / "tests";
    AbsolutePath OutputDirectory => RootDirectory / "outputs";

    Target Clean => _ => _
        .Executes(() =>
        {
            SourceDirectory.GlobDirectories("*/bin", "*/obj").DeleteDirectories();
        });

    Target BuildTestCode => _ => _
        .DependsOn(Clean)
        .Executes(() =>
        {
            DotNetTasks.DotNetRestore(_ => _
                .SetProjectFile(Solution)
            );
            DotNetTasks.DotNetBuild(_ => _
                .EnableNoRestore()
                .SetProjectFile(Solution)
                .SetConfiguration(Configuration)
                .SetProperty("SourceLinkCreate", true)
            );
            DotNetTasks.DotNetTest(_ => _
                .EnableNoRestore()
                .EnableNoBuild()
                .SetConfiguration(Configuration)
                .SetTestAdapterPath(TestsDirectory / "*.Tests")
            );
        });
    Target SnykTest => _ => _
        .DependsOn(BuildTestCode)
        .Requires(() => SnykToken, () => SnykSeverityThreshold)
        .Executes(() =>
        {
            // Snyk Test
            DockerTasks.DockerRun(_ => _
                .EnableRm()
                .SetVolume($"{RootDirectory}:/app")
                .SetEnv($"SNYK_TOKEN={SnykToken}")
                .SetImage("snyk/snyk:dotnet")
                .SetCommand($"snyk test --all-projects --exclude=build --severity-threshold={SnykSeverityThreshold.ToLowerInvariant()}")
            );
        });
    Target SnykCodeTest => _ => _
        .DependsOn(BuildTestCode)
        .Requires(() => SnykToken, () => SnykCodeSeverityThreshold)
        .Executes(() =>
        {
            DockerTasks.DockerRun(_ => _
                .EnableRm()
                .SetVolume($"{RootDirectory}:/app")
                .SetEnv($"SNYK_TOKEN={SnykToken}")
                .SetImage("snyk/snyk:dotnet")
                .SetCommand($"snyk code test --all-projects --exclude=build --severity-threshold={SnykCodeSeverityThreshold.ToLowerInvariant()}")
            );
        });
    Target GenerateSbom => _ => _
        .DependsOn(BuildTestCode)
        .Produces(OutputDirectory / "*.json")
        .Requires(() => SnykToken)
        .Executes(() =>
        {
            OutputDirectory.CreateOrCleanDirectory();
            DockerTasks.DockerRun(_ => _
                .EnableRm()
                .SetVolume($"{RootDirectory}:/app")
                .SetEnv($"SNYK_TOKEN={SnykToken}")
                .SetImage("snyk/snyk:dotnet")
                .SetCommand($"snyk sbom --all-projects --format spdx2.3+json --json-file-output={OutputDirectory.Name}/sbom.json")
            );
        });
}

Final Thoughts

Nuke.Build is a great concept for build pipelines, and being able to run the pipeline locally to test it out, making sure paths and everything else are correct, really helps. Couple that with the ability to generate GitHub Actions workflows and configurations for other supported CI platforms, and the benefits add up quickly.

Adding security scans to catch issues early is another plus, and I am glad it’s possible to run those scans in multiple ways in Nuke. Hopefully, the Snyk CLI will be supported by Nuke directly in the future.

If you haven’t checked out Nuke yet, I would definitely recommend giving it a try to see the benefits for yourself.

Azure, Bicep, Platform Engineering

Building a Self-Service Portal with Port, Azure and Bicep

Introduction

Platform Engineering has become the talk of the town, and with it has come a rise in tooling aimed at helping teams create an IDP (Internal Developer Platform), so it makes sense to take a look at what’s on offer. One such tool is Port. Port has a lot of features, including a Software Catalog, Automation, RBAC, Self-Service Actions, Scorecards, etc., as well as integrations with tools such as Datadog, Jira, PagerDuty, GitHub, ArgoCD, etc. Port can also import your existing cloud resources from AWS, Azure and GCP.

Our current cloud provider is Azure, and our IaC is written in Bicep and deployed via Azure Pipelines; Port, however, does not support Bicep (or ARM) as a provider for importing resources from Azure. The question, then, is whether Port could be used to create a self-service portal using Bicep and Azure Pipelines.

The answer is yes. In this article, we are going to look at creating a small self-service portal using the technologies mentioned above.

Blank Slate

When you first log in to Port you are presented with a blank slate and the feeling of not being sure what to do next. Fortunately, there is a Live Demo site that shows how some of the pieces fit together, and there is plenty of documentation as well as an active community to help out.

Port can be configured from code and supports a number of providers; however, for this article we are just going to add blueprints via the UI.

First Blueprint

Let’s first create a Cloud Host blueprint we can use to store information such as locations for use with the cloud provider (in this case Azure).

To add a new blueprint, select the button on the right-hand side of the Builder screen.

Then fill in the name as Cloud Host and pick an icon.

Once this has been created, we can add some properties to associate with the provider.

Let’s start with some locations to use when creating resources.

At this point there is just a blueprint for the cloud host; to make it useful, we will need to add some details for the Cloud Hosts in the Catalog. As Bicep is not a supported ingestion method, we’ll have to add these manually.

After adding the Cloud Host details manually the catalog looks like this:

Note: In this demo we are just using the location but other information could be added for each configuration.

Resource Blueprint

Let’s head back to the Builder screen and, as before, add a new blueprint, this time for creating an Azure Resource.

Once created, we can add a relation linking it to the Cloud Host blueprint.

As with Cloud Host, we can also add properties to the Create Resource blueprint that we might want to see in the catalog, e.g. IaC used, Environment, App name, etc.

Actions with Azure DevOps

Next up is to add some actions to the blueprint that will actually create some resources.

For our new Action, let’s create a small Azure resource like an Azure Storage Account.

On the next screen we get to define what parameters we want the user to provide when the action is run; for this example we will add an AppName, Location, Environment, Sku and Access Tier.

Note: Location is going to be linked to the Cloud Host blueprint using the Entity Selection property type.

Note: The identifiers for the form fields must match the parameter names expected by the Bicep template, e.g. a field named Access Tier would default to the identifier access_tier, whereas the Bicep parameter might be accessTier.

The next part is to configure the backend. For this we are going to hook up Azure Pipelines; this page provides a client secret to add to Azure DevOps.

In an Azure DevOps project, set up a service connection of type “Incoming Webhook” for Port to connect to.

Fill in the details from the Port configuration, including the client secret and HTTP header.

Once saved, fill in the details back in Port and go to the final page, which is permissions.

For the purposes of this demo we will leave this as is with the default settings.

Pipeline

Now that everything is configured in Port, we need to add a pipeline to Azure DevOps that triggers on a selection from Port. The Backend page in the action setup gives an example starting pipeline, but additional steps are needed to support creating resources using Bicep, and we also didn’t think a multi-job configuration was necessary for this purpose.

The pipeline below is triggered by the configured webhook; it deploys Azure resources using a Bicep template and communicates with Port.io. Key steps include fetching an access token, creating a Bicep parameters file (an example of the generated file is shown after the pipeline), deploying the resources, and updating Port.io with deployment status and information. The pipeline also includes logging steps, displays outputs, and interacts with the Port.io API for entity upserts and status updates.

The goal is a single pipeline that can be reused for building different kinds of resources, instead of a separate pipeline per resource. After multiple runs and attempts, we finally arrived at this configuration:

trigger: none
pr: none

resources:
  webhooks:
    - webhook: incomingport
      connection: Port.io

variables:
  subscription: 'MySubscription'
  System.debug: true
  runId: ${{ parameters.incomingport.context.runId }}
  deployParametersFileName: 'deploy.bicepparam'
  deployFileName: deploy_${{ lower(replace(parameters.incomingport.action,'create_', '')) }}.bicep
  deployStatus: "FAILURE"
  deployStatusMessage: "Azure Resource Creation Failed"
  deployMessage: "Deployment Pipeline Failed"

stages:
  - stage: run_resource_creation
    displayName: 'Run Resource Creation'
    jobs:
    - job: fetch_port_access_token
      displayName: 'Create Resources'
      pool:
        vmImage: 'ubuntu-latest'
      steps:
        - script: |
            accessToken=$(curl -X POST \
            -H 'Content-Type: application/json' \
            -d '{"clientId": "$(PORT_CLIENT_ID)", "clientSecret": "$(PORT_CLIENT_SECRET)"}' \
            -s 'https://api.getport.io/v1/auth/access_token' | jq -r '.accessToken')
            echo "##vso[task.setvariable variable=accessToken;issecret=true]$accessToken"
            echo "runId=$(runId)"
          displayName: Fetch Access Token and Run ID
          name: getToken
        - template: templates/sendlogs.yml
          parameters:
            Message: "Create parameters file"
            AccessToken: $(accessToken)
            RunId: $(runId)
            conditionLevel: succeeded()
        - pwsh: |
            $obj = $env:payload | ConvertFrom-Json -AsHashtable
            $additionalObj = $env:entityPayload ?? @() | ConvertFrom-Json -AsHashtable
            $excludeList = @()
            $filename = "$env:deployParametersFileName"

            Out-File -FilePath $filename
            "using '$(deployFileName)'" | Out-File -FilePath $filename -Append
            "param runId = '$env:runId'" | Out-File -FilePath $filename -Append
            # Payload Properties
            ($obj.Keys | ForEach-Object { 
              if ($_ -notin $excludeList) { 
                if($($obj[$_]).GetType().Name -eq "String") {
                  "param $_ = '$($obj[$_])'"
                } 
                else {
                  "param $_ = $($obj[$_])"
                }
              }
            }) | Out-File -FilePath $filename -Append
            # Entity Payload Properties
            if($additionalObj.count -ne 0) {
              $entityExcludeList = @("iac","provider","appname")
              ($additionalObj.Keys | ForEach-Object {
                  if ($_ -notin $entityExcludeList) {
                    if($($additionalObj[$_]).GetType().Name -eq "String") {
                      "param $_ = '$($additionalObj[$_])'"
                    } 
                    else {
                      "param $_ = $($additionalObj[$_])"
                    }
                  }
                }) | Out-File -FilePath $filename -Append
                if($env:entityIdentifier -ne $null) {
                  "param parentName = '$env:entityIdentifier'" | Out-File -FilePath $filename -Append
                }
            }
          displayName: 'Create Parameters File'
          env:
            runId: $(runId)
            payload: ${{ convertToJson(parameters.incomingport.payload.properties) }}
            entityPayload: ${{ convertToJson(parameters.incomingport.payload.entity.properties) }}
            entityIdentifier: ${{ parameters.incomingport.payload.entity.identifier }}
            deployParametersFileName: $(deployParametersFileName)
        - bash: |
            cat $(deployParametersFileName)
          displayName: 'Show File'
          condition: and(succeeded(), eq(variables['System.debug'], 'true'))
        - template: templates/sendlogs.yml
          parameters:
            Message: "Deploying Resources"
            AccessToken: $(accessToken)
            RunId: $(runId)
            conditionLevel: succeeded()
        - task: AzureCLI@2
          displayName: "Deploy Resources"
          inputs:
            azureSubscription: $(subscription)
            scriptType: "pscore"
            scriptLocation: "inlineScript"
            inlineScript: |
              $outputStatus = "SUCCESS"
              $outputStatusMessage = "Azure Resource Creation Succeeded"
              $resourceGroupName = "$env:environment-$env:appname-rg"
              $deploymentName = "deploy_$env:runId"
              if($(az group exists --name $resourceGroupName) -eq $false) {
                az group create --name $resourceGroupName --location $env:location
              }
              $output = $(az deployment group create --resource-group $resourceGroupName --template-file $env:deployFileName --parameters $env:deployParametersFileName --name $deploymentName 2>&1)
              if (!$?) {
                $outputStatus = "FAILURE"
                $outputStatusMessage = "Azure Resource Creation Failed"
                try {
                  $obj = $output.Exception.Message -replace '["()]', '\$&'
                  $output = $obj
                } catch {
                  $output = "Something went wrong"
                }
              } else {
                $output = $output -replace '["()]', '\$&'
              }
              $title = (Get-Culture).TextInfo.ToTitleCase($env:deployTitle)

              $resourceName = $(az deployment group show -g $resourceGroupName -n $deploymentName --query properties.outputs.resourceName.value -o tsv)
              Write-Host "##vso[task.setvariable variable=resourceName;]$resourceName"
              Write-Host "##vso[task.setvariable variable=deployMessage;]$output"
              Write-Host "##vso[task.setvariable variable=deployStatus;]$outputStatus"
              Write-Host "##vso[task.setvariable variable=deployStatusMessage;]$outputStatusMessage"
              Write-Host "##vso[task.setvariable variable=deployTitle;]$title"
          env:
            runId: $(runId)
            location: ${{ parameters.incomingport.payload.properties.location }}
            environment: ${{ coalesce(parameters.incomingport.payload.properties.environment, parameters.incomingport.payload.entity.properties.environment) }}
            appname: ${{ coalesce(parameters.incomingport.payload.properties.appname, parameters.incomingport.payload.entity.properties.appname) }}
            deployFileName: $(deployFileName)
            deployParametersFileName: $(deployParametersFileName)
            deployTitle: ${{ lower(replace(replace(parameters.incomingport.action,'create_', ''),'_',' ')) }}
        - script: |
            echo '$(resourceName)'
          displayName: 'Show Outputs'
        - script: |
            curl -X POST \
              -H 'Content-Type: application/json' \
              -H "Authorization: Bearer $(accessToken)" \
              -d '{
                    "identifier": "$(resourceName)",
                    "title": "$(deployTitle)",
                    "properties": {"environment": "${{ parameters.incomingport.payload.properties.environment }}","iac": "Bicep","appname": "${{ coalesce(parameters.incomingport.payload.properties.appname, parameters.incomingport.payload.properties.name) }}"},
                    "relations": {"cloud_host": "${{ parameters.incomingport.payload.properties.location }}"}
                  }' \
              "https://api.getport.io/v1/blueprints/${{ parameters.incomingport.context.blueprint }}/entities?upsert=true&run_id=$(runId)&create_missing_related_entities=true"
          displayName: 'Upsert entity'
        - template: templates/sendlogs.yml
          parameters:
            Message: $(deployMessage)
            AccessToken: $(accessToken)
            RunId: $(runId)
        - template: templates/sendlogs.yml
          parameters:
            Message: "Deployment Finished"
            AccessToken: $(accessToken)
            RunId: $(runId)
        - template: templates/sendStatus.yml
          parameters:
            Status: $(deployStatus)
            Message: $(deployStatusMessage)
            AccessToken: $(accessToken)
            RunId: $(runId)
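To make the Create Parameters File step concrete: a hypothetical run of a create_storage_account action (illustrative values only) would generate a deploy.bicepparam along these lines:

using 'deploy_storage_account.bicep'
param runId = 'r_ExampleRunId'
param location = 'uksouth'
param environment = 'dev'
param appname = 'demo'
param sku = 'Standard_LRS'
param accessTier = 'Hot'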

sendlogs.yml

parameters:
- name: Message
  type: object  
- name: RunId
  type: string
- name: AccessToken
  type: string
- name: conditionLevel
  type: object
  default: always()
  values:
   - always()
   - succeeded()
   - failed()
steps:
- bash: |
    curl -X POST \
      -H 'Content-Type: application/json' \
      -H "Authorization: Bearer ${{ parameters.AccessToken }}" \
      -d '{"message": "${{ parameters.Message }}"}' \
      "https://api.getport.io/v1/actions/runs/${{ parameters.RunId }}/logs"
  displayName: Send Logs  
  condition: and(${{ parameters.conditionLevel }}, ne('${{ parameters.Message }}', ''))

sendStatus.yml

parameters:
- name: Status
  type: string
  default: 'FAILURE'
- name: Message
  type: string
  default: "Azure Resource Creation Successful"
- name: RunId
  type: string
- name: AccessToken
  type: string
steps:
- bash: |
    curl -X PATCH \
      -H 'Content-Type: application/json' \
      -H "Authorization: Bearer ${{ parameters.AccessToken }}" \
      -d '{"status":"${{ parameters.Status }}", "message": {"run_status": "${{ parameters.Message }}"}}' \
      "https://api.getport.io/v1/actions/runs/${{ parameters.RunId }}"
  displayName: 'Send Status'
  condition: always()

There are also a couple of required variables from Port that are needed by the pipeline: the Port client ID (PORT_CLIENT_ID) and the Port client secret (PORT_CLIENT_SECRET).

These values can be found in Port by selecting the … icon and then Credentials.

Self-Service

With the pipeline created we can now use the Self-Service Hub in Port to create our new resource.

Add some details and execute.

Once execution has begun, a status appears on the right-hand side.

In Azure DevOps, the webhook triggers the pipeline run.

While running, the pipeline returns status information to Port, and on completion it updates the status.

And on the catalog screen there is now an entry for the storage account.

Additional Blueprints

The pipeline has been created to be generic and so should allow other types of resources to be created with accompanying Bicep configurations. The Create Azure Resource blueprint doesn’t seem to be the best place for resources that might need a Day-2 operation, so let’s add another blueprint for SQL Server with a Day-2 operation that adds a SQL Database to an existing SQL Server.

Following the earlier example of creating blueprints and actions, first create a “Create Azure SQL Server” blueprint and then two actions: “Create Azure SQL Server”, using the Create type with a user form of Environment, AppName and Location (as previously), and “Create Azure SQL Database”, using the Day-2 type with a single user form entry of Name.

This should then look something like this:

The Self-Service screen now includes the additional actions.

At this point, trying to run the Day-2 operation would not offer a SQL Server entry to select.

But once a SQL Server has been created, it can be selected for the Day-2 operation to deploy a database.

I’m still not entirely sure how to display the databases deployed by the Day-2 operation on the catalog screen, but the run history shows the Create Azure SQL Database action and its payload.

All the Bicep and Azure Pipelines code shown here can be found in GitHub.

Final Thoughts

Before writing this article I had no prior experience with Port, and there may be different ways to achieve the above, but after the initial “where does everything go” phase it became much easier to see where effort is required to build something functional. You might ask why use Bicep when you could import resources using the supported integrations; mainly because I use a lot of Bicep and Pulumi, and I wanted to see whether Port was still an option even without direct support for those technologies. I think it has merit, and as something that is still evolving and improving, it’s possible Bicep could be supported one day.

Exploring the Self-Service part of Port was the driving force for this article, but there is much more on offer to dive into and explore. Port’s free tier supports 15 registered users, so it is a great place to get started and try things out without having to think about costs.

I really like the direction that Platform Engineering is taking, and these types of tools are a game changer when it comes to taking the cognitive load of deployment, infrastructure and the like away from developers, allowing them to concentrate on the features they are delivering instead of how the code gets where it needs to go.

I hope this article has been interesting and encourages you to take a look at Port for your own IDP needs. I am interested to see how Port evolves in the coming months and years.