Eric Bukenya

Creating VM Images in Azure with Packer HCL, using Azure DevOps Pipelines.

In this blog post, I’ll show how to use a Packer template written in HCL to create an image in Azure. We’ll be using an Azure DevOps pipeline to build and deploy the image. The resulting image can then be used with Terraform to deploy VMs!

Write Packer templates using HCL

As of v1.5, Packer supports the HashiCorp Configuration Language (HCL), the same language used by Terraform, which is much easier to read and write than the JSON format used prior to v1.5. Packer is at version 1.7.3 at the time of writing.

It is recommended to migrate away from using .json files for your Packer templates. As of v1.6.2, you can automatically upgrade your old Packer JSON files to HCL using the hcl2_upgrade command.

packer hcl2_upgrade filename.json

I have another post describing how to use .json templates with an Azure DevOps pipeline here, if you don’t want to upgrade your templates to HCL first.

Packer HCL template

My Packer template below installs a plugin to perform Windows updates on the image and then generalises it, ready for VMs to use it as a source.
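As a rough sketch of what such a template can look like, assuming the azure-arm builder and the community windows-update plugin from github.com/rgl/windows-update (all names, versions and values here are illustrative, not the exact template):

```hcl
packer {
  required_plugins {
    windows-update = {
      # Community plugin that installs Windows updates during the build.
      version = "0.14.0"
      source  = "github.com/rgl/windows-update"
    }
  }
}

variable "subscription_id" { type = string }
variable "tenant_id"       { type = string }
variable "client_id"       { type = string }
variable "client_secret" {
  type      = string
  sensitive = true
}

source "azure-arm" "windows" {
  subscription_id = var.subscription_id
  tenant_id       = var.tenant_id
  client_id       = var.client_id
  client_secret   = var.client_secret

  # Where the finished managed image will land.
  managed_image_name                = "win2019-base"
  managed_image_resource_group_name = "rg-images"

  # Marketplace image to build from.
  os_type         = "Windows"
  image_publisher = "MicrosoftWindowsServer"
  image_offer     = "WindowsServer"
  image_sku       = "2019-Datacenter"

  location = "UK South"
  vm_size  = "Standard_D2s_v3"

  communicator   = "winrm"
  winrm_use_ssl  = true
  winrm_insecure = true
  winrm_timeout  = "5m"
  winrm_username = "packer"
}

build {
  sources = ["source.azure-arm.windows"]

  # Install all pending Windows updates on the temporary build VM.
  provisioner "windows-update" {}

  # Generalise with Sysprep so the image can be used as a VM source.
  provisioner "powershell" {
    inline = [
      "& $env:SystemRoot\\System32\\Sysprep\\Sysprep.exe /oobe /generalize /quiet /quit",
      "while ((Get-ItemProperty HKLM:\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Setup\\State).ImageState -ne 'IMAGE_STATE_GENERALIZE_RESEAL_TO_OOBE') { Start-Sleep -s 10 }"
    ]
  }
}
```

Running packer build against a file like this produces a managed image that Terraform can then reference as the source image for new VMs.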

#azure-devops #terraform #ci-cd-pipeline #hashicorp-packer #azure


Noah Rowe

Azure DevOps Pipelines: Multi-Stage Pipelines

The last couple of posts have been dealing with releases managed from the Releases area under Azure Pipelines. This week we are going to take what we were doing in that separate area of Azure DevOps and instead make it part of the YAML that currently builds our application. If you need some background on how the project got to this point, check out the following posts.

Getting Started with Azure DevOps

Pipeline Creation in Azure DevOps

Azure DevOps Publish Artifacts for ASP.NET Core

Azure DevOps Pipelines: Multiple Jobs in YAML

Azure DevOps Pipelines: Reusable YAML

Azure DevOps Pipelines: Use YAML Across Repos

Azure DevOps Pipelines: Conditionals in YAML

Azure DevOps Pipelines: Naming and Tagging

Azure DevOps Pipelines: Manual Tagging

Azure DevOps Pipelines: Depends On with Conditionals in YAML

Azure DevOps Pipelines: PowerShell Task

Azure DevOps Releases: Auto Create New Release After Pipeline Build

Azure DevOps Releases: Auto Create Release with Pull Requests

Recap

The current setup uses a YAML-based Azure Pipeline to build a couple of ASP.NET Core web applications. Then on the Release side, we have basically a dummy release that doesn’t actually do anything, but serves as a demo of how to configure a continuous deployment type release. The following is the current YAML for our Pipeline, for reference.

name: $(SourceBranchName)_$(date:yyyyMMdd)$(rev:.r)

resources:      
  repositories: 
  - repository: Shared
    name: Playground/Shared
    type: git 
    ref: master #branch name

trigger: none

variables:
  buildConfiguration: 'Release'

jobs:
- job: WebApp1
  displayName: 'Build WebApp1'
  pool:
    vmImage: 'ubuntu-latest'

  steps:
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: 'Get-ChildItem -Path Env:\'

  - template: buildCoreWebProject.yml@Shared
    parameters:
      buildConFiguration: $(buildConfiguration)
      project: WebApp1.csproj
      artifactName: WebApp1

- job: WebApp2
  displayName: 'Build WebApp2'
  condition: and(succeeded(), eq(variables['BuildWebApp2'], 'true'))
  pool:
    vmImage: 'ubuntu-latest'

  steps:
  - template: build.yml
    parameters:
      buildConFiguration: $(buildConfiguration)
      project: WebApp2.csproj
      artifactName: WebApp2

- job: DependentJob
  displayName: 'Build Dependent Job'
  pool:
    vmImage: 'ubuntu-latest'

  dependsOn:
  - WebApp1
  - WebApp2

  steps:
  - template: buildCoreWebProject.yml@Shared
    parameters:
      buildConFiguration: $(buildConfiguration)
      project: WebApp1.csproj
      artifactName: WebApp1Again

- job: TagSources
  displayName: 'Tag Sources'
  pool:
    vmImage: 'ubuntu-latest'

  dependsOn:
  - WebApp1
  - WebApp2
  - DependentJob
  condition: |
    and
    (
      eq(dependencies.WebApp1.result, 'Succeeded'),
      in(dependencies.WebApp2.result, 'Succeeded', 'Skipped'),
      in(dependencies.DependentJob.result, 'Succeeded', 'Skipped')
    )

  steps:
  - checkout: self
    persistCredentials: true
    clean: true
    fetchDepth: 1

  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: |
        $env:GIT_REDIRECT_STDERR = '2>&1'
        $tag = "manual_$(Build.BuildNumber)".replace(' ', '_')
        git tag $tag
        Write-Host "Successfully created tag $tag"

        git push --tags
        Write-Host "Successfully pushed tag $tag"

      failOnStderr: false
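Moving the release into this YAML means wrapping the jobs above in a stages: list and adding a deployment stage. A trimmed sketch of the shape (the stage, environment and deployment job names here are illustrative, not from the project):

```yaml
stages:
- stage: Build
  jobs:
  - job: WebApp1
    displayName: 'Build WebApp1'
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - template: buildCoreWebProject.yml@Shared
      parameters:
        buildConFiguration: $(buildConfiguration)
        project: WebApp1.csproj
        artifactName: WebApp1

- stage: Deploy
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: DeployWebApp1
    displayName: 'Deploy WebApp1'
    pool:
      vmImage: 'ubuntu-latest'
    environment: 'Playground'
    strategy:
      runOnce:
        deploy:
          steps:
          # Artifacts published by the Build stage are downloaded
          # automatically into $(Pipeline.Workspace) for deployment jobs.
          - script: echo Deploying $(Pipeline.Workspace)/WebApp1
```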

#azure-pipelines #azure #azure-devops #devops

Nella Brown

Use Azure Static Web Apps with Azure DevOps pipelines

In this post, we discuss how to use Azure Static Web Apps from a pipeline in Azure DevOps. Last year, Microsoft released Azure Static Web Apps, a great way to bundle your static app with a serverless Azure Functions backend. If you have a GitHub repository, Azure Static Web Apps has you covered. You create an instance in Azure, select a GitHub repository, and Azure creates a GitHub Actions CI/CD pipeline for you that’ll automatically trigger when you merge a pull request into your main branch. It’s still in preview, but a GA release isn’t too far off.

To borrow from a famous Henry Ford quote: you can have it from any repo you want, so long as it’s GitHub.

That has changed. Azure Static Web Apps now provides Azure DevOps support. If you have a repository in Azure DevOps, you can wire up an Azure Pipelines YAML file that builds and deploys your app to Azure Static Web Apps. While it isn’t as streamlined and elegant as the GitHub experience—you need to configure your deployment token manually, and you don’t get automatic staging environments—it sure beats the alternative for Azure DevOps customers (that is, no Azure Static Web Apps support at all).
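A minimal pipeline for this could look like the sketch below, using the AzureStaticWebApp@0 task. The folder paths and the deployment_token secret variable are illustrative; the token itself is copied manually from the Static Web App in the Azure Portal:

```yaml
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: AzureStaticWebApp@0
  inputs:
    app_location: '/'          # folder containing the static app source
    api_location: 'api'        # optional Azure Functions backend
    output_location: 'dist'    # build output folder
    # Secret pipeline variable holding the manually configured token.
    azure_static_web_apps_api_token: $(deployment_token)
```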

#devops #azure #azure devops #pipelines

Automating deployments to on-premises servers with Azure DevOps

As someone who has spent most of their (very short) career doing one-click cloud resource deployments, I was shocked when I jumped onto a legacy project and realised the complexity of the deployment process to staging and production. Using a traditional .NET Framework application stack, the deployment process consisted of the following steps:

  1. Set the configuration target in Visual Studio to release
  2. Build the project
  3. Copy the .dlls using a USB to a client laptop which was configured for VPN access
  4. Copy the .dlls via RDP to the target server
  5. Go into IIS Manager and point the file path to the new version of the application

As you can see and may have experienced, this is a long, slow and error-prone process which can often take over an hour given the likelihood of one of those steps not working correctly. For me it was also a real pain point having to use the client laptop, as it had 3 different passwords to get in, none of which I set or could remember. It also meant that if we needed to do a deployment I had to be in the office to physically use the laptop — no working from home that day.

My first step was to automate the build process. If we could get Azure Pipelines to at least build the project, I could download the files and copy them over manually. There are plenty of guides online on how to set this up, but the final result meant it gave me a .zip artifact of all the files required for the project. This also took away a common hotspot for errors, which was building locally on my machine. This also meant regardless of who wrote the code, the build process was always identical.
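For a traditional .NET Framework web application, that build stage might look roughly like the sketch below (the solution paths and artifact name are illustrative):

```yaml
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1

- task: NuGetCommand@2
  inputs:
    restoreSolution: '**/*.sln'

# DeployOnBuild + Package produces a zipped web deploy package
# in the artifact staging directory.
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:PackageLocation="$(Build.ArtifactStagingDirectory)"'
    platform: 'Any CPU'
    configuration: '$(buildConfiguration)'

- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'drop'
```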

The second step was to set up a release pipeline. Within Azure Pipelines, what we wanted to do was create a deployment group, and then register the server we want to deploy to as a target within that deployment group. This will allow us to deploy directly to an on-premises server. So, how do we do this?

Requirements:

  • PowerShell 3.0 or higher. On our Windows Server 2003 box, we needed to upgrade from PowerShell 2.0. This is a simple download, install and restart.
  • .NET Framework x64 4.5 or higher

Steps:

  1. Navigate to Deployment Groups under Pipelines in Azure DevOps:

Deployment groups menu item in Azure DevOps > Pipelines

2. Create a new deployment group. The idea is you can have several servers that are in the same group and deploy the code to all of them simultaneously (for example for load balancing reasons). In my case I only have one target in my deployment group, so the idea of a group is a bit redundant.

#azure #azure-pipelines #deployment-pipelines #windows-server #azure-devops #devops

How to Create an Azure API Management Instance using Bicep Lang via Azure DevOps

#Introduction

The more I use Bicep, the more I love it. This is what ARM Templates should have been. When it comes to IaC, I usually use Terraform. It’s the IaC tool we used at my last gig and I like that it has support for multiple clouds.

However, I’ve recently changed jobs and I’m finding that I’m using ARM templates more. With this in mind, I’ve been wanting to learn Bicep and use it in my own personal projects so when the day comes that I have to convert ARM templates to Bicep code, I’ll be prepared 😂

Coming back to this article, I’m working on a personal health application that has a bunch of APIs (built using Azure Functions ⚡) that interact with my data. Ideally, I’d like to integrate this within Azure API Management. I deploy these APIs using Azure DevOps, so to be consistent, I want to deploy APIM using IaC via Azure DevOps.

I’m going to show you how we can provision an Azure API Management instance using Bicep code and then deploy it using Azure DevOps.

One thing to note before we get started is that at the time of writing, there are no officially supported tasks for Bicep in Azure DevOps. For Terraform and ARM templates, we can use tasks in DevOps to deploy our infrastructure. For this article, I’ve used some AZ CLI tasks to build and deploy my Bicep templates.

So if you’re reading this in a future where we can use officially supported Bicep tasks in DevOps, just keep this caveat in mind 😊
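As a sketch, one of those AZ CLI tasks can build and deploy the template in a single inline script (the service connection, resource group and parameter names here are placeholders):

```yaml
steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'my-service-connection'  # placeholder name
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # az transpiles the Bicep file to ARM JSON before deploying.
      az bicep install
      az deployment group create \
        --resource-group 'rg-healthapp' \
        --template-file main.bicep \
        --parameters publisherEmail='$(publisherEmail)' publisherName='$(publisherName)'
```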

#What is Bicep Lang?

Bicep is a domain-specific language that uses declarative syntax to deploy Azure Resources. When we wrote ARM templates, we were essentially writing JSON to deploy resources to Azure. The syntax for this could get a little complex and for fancy stuff, we would need to write complicated expressions to get it working.

Bicep Lang reduces that complexity significantly. Bicep is a transparent abstraction over ARM templates and when we deploy Bicep templates, it comes with a CLI that transpiles the Bicep file into ARM template JSON.

As of v0.3, Bicep is supported by Microsoft and has 100% parity with ARM templates, meaning that you can start using it for production workloads!

If you want to learn more about Bicep, you can check out the documentation here: https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/

#What is API Management

API Management (APIM) allows us to create consistent API gateways for back-end services. Using APIM, we can publish APIs and make them available for external and internal developers to consume.

APIM is made up of:

  • The Gateway which is the endpoint that accepts API calls and routes them to the right backend, verifies API Keys, enforces usage quotas and rate limits etc.
  • The Azure Portal which allows us to administer our API program, define API schemas, package APIs into products, set up policies etc.
  • The Developer Portal which serves as the main web presence for developers, providing them with API documentation, allowing them to try out APIs via an interactive console and create an account that they can use to subscribe to APIs.

If you want to dive a bit deeper into APIM, check out the documentation: https://docs.microsoft.com/en-us/azure/api-management/

#Writing our Bicep Code for API Management

Let’s start writing our Bicep code! 💪 The best tool for writing Bicep code is Visual Studio Code. There’s also an awesome extension that you can download that will help validate your Bicep code and provide intellisense: https://github.com/Azure/bicep/blob/main/docs/installing.md#install-the-bicep-vs-code-extension

For this tutorial, I’m not going to focus too much on the complicated aspects of APIM. I just want to provision a simple configuration to get started with.

From what I can see from the docs, it looks like I’ll need the following properties:

  • Name (What the name of the APIM service will be)
  • Type (The Type of resource we’ll be provisioning)
  • ApiVersion (The version of the ARM API that we will be using)
  • Properties (Properties of the APIM that we want to configure, mainly the Publisher Email and Name)
  • Location (where we will provision our APIM instance)
  • SKU (The SKU properties of our APIM instance)
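Putting those properties together, a minimal Bicep file for the instance might look like this (the resource name, API version and Consumption SKU are my illustrative choices):

```bicep
param location string = resourceGroup().location
param publisherEmail string
param publisherName string

resource apim 'Microsoft.ApiManagement/service@2020-12-01' = {
  name: 'apim-healthapp-dev'   // illustrative name
  location: location
  sku: {
    // Consumption is a cheap, serverless tier for personal projects.
    name: 'Consumption'
    capacity: 0
  }
  properties: {
    publisherEmail: publisherEmail
    publisherName: publisherName
  }
}
```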

#azure #cloud #devops #programming #azure api #azure devops