When managing infrastructure with Terraform, I highly recommend surrounding your automation pipeline with a disciplined GitFlow process. This ensures your changes are trackable, maintainable, and consistent across environments. However, despite your best efforts to automate, there are times when you need to execute things locally — by “locally,” I mean outside of Azure, on your own laptop.

Running Terraform locally can be done using your own user credentials or an identity like a service principal (though not a managed identity, because we’re talking about running locally). Executing Terraform in this way provides finer-grained control over critical tasks like state management operations and allows you to quickly interrogate and diagnose problems using the console.

Setting Up a Runner Script for Local Execution

To streamline this process, it’s useful to have a runner script that will execute Terraform commands (any of the core workflow commands). More importantly, this script should set up the necessary execution context that a pipeline agent would typically handle for you. Depending on your setup, this can range from being relatively easy to quite challenging.

I try to design my Terraform solutions to be easier to execute locally, should the need arise. The common components of the execution context typically include the following:

  1. Backend Configuration: This includes the all-important backend key, which determines the state file location.
  2. Terraform Workspace (Optional): If you aren’t just using the backend key to differentiate state files, you might need to switch workspaces.
  3. Input Variables: Come in many shapes and sizes and, most importantly, from many sources — whether they be *.tfvars files or TF_VAR_ environment variables.

Input variables come in three distinct categories:

  1. Variables known at compile time and stored in code.
  2. Variables that come from another tool in your toolchain (in a pipeline, this is all handled automatically).
  3. Variables that need to be sourced from a secret management store, such as Azure Key Vault.
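For the second and third categories, the usual hand-off is a TF_VAR_ environment variable, which Terraform maps onto the input variable of the same name. A minimal sketch (the variable names below are hypothetical):

```shell
#!/bin/bash
# Terraform reads TF_VAR_<name> as the value of input variable "<name>".
# Category 1 lives in code (e.g. a committed *.tfvars file); categories 2 and 3
# are injected at runtime like this:
export TF_VAR_build_number="1234"        # e.g. handed over by a pipeline tool
export TF_VAR_environment_name="dev"     # e.g. chosen by the operator locally
```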

Accessing Secrets and Managing Identity

If your input variables are stored in a secret management store, you need to decide whether to access them using your credentials or a service principal. If you lack access, you might need to request temporary permissions.

When running locally, you may need to create a dedicated .debug.tfvars file that won’t be checked into version control. This file allows you to hard-code values for a known environment. For other cases, you might need to write a script to retrieve these values and set them as TF_VAR_ environment variables.
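A sketch of such a retrieval script, assuming the Azure CLI is installed and you have read access to the vault (the vault and secret names here are placeholders, not anything from a real setup):

```shell
#!/bin/bash
# Fetch a secret from Azure Key Vault and expose it as a Terraform input variable.
# The vault name and secret names below are hypothetical placeholders.
fetch_tf_var_from_keyvault() {
  local tf_var_name="$1"   # Terraform variable name, e.g. "db_password"
  local secret_name="$2"   # Key Vault secret name, e.g. "db-password"
  local value
  value=$(az keyvault secret show \
    --vault-name "my-keyvault" \
    --name "$secret_name" \
    --query value -o tsv) || return 1
  export "TF_VAR_${tf_var_name}=${value}"
}

# Example usage before running terraform:
# fetch_tf_var_from_keyvault "db_password" "db-password"
```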

A common issue arises when using a data source to access the current identity in Terraform. This can be a convenient way to set up your pipeline environment through role assignments. However, if you switch the identity to your personal credentials, you might inadvertently alter these assignments, causing issues in the environment. So, proceed with caution!

Interactive Login

You can either log in using az login or use device code login. The latter gives you more control over which browser you’re logging in from, but some organizations restrict it in certain situations.

If your organization supports it, or if you have multiple profiles, using the --use-device-code command-line option to log in is a simpler method. This approach provides a URL and code, allowing you to log in from any web browser or device instead of launching your default web browser.

Example 1: A Simple Runner Script for Local Execution

Here’s an example of how to create a simple script to streamline the local execution of Terraform:

#!/bin/bash

export ARM_SUBSCRIPTION_ID=00000000-0000-0000-0000-000000000000

terraform init

terraform $*

By adding this script as .debug.sh to the root folder of your Terraform root module, you ensure that the correct subscription is always used. This is now required when working with the latest major version of the Azure Terraform provider, azurerm 4.0. You can no longer fall back on the subscription currently selected with az account set; you have to explicitly set the Azure subscription ID.

Ideally, this is not done in a *.tf file, as that would tightly couple your Terraform codebase to a particular subscription, making your module less reusable and less flexible. That’s why I always recommend piping the execution context in dynamically and keeping it out of the configuration itself.
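If you’d rather not hard-code the subscription in the runner script either, one option is to set it explicitly from whatever subscription the Azure CLI currently has selected — still an explicit export, which is what azurerm 4.x requires. A small sketch, assuming you are already logged in with az:

```shell
#!/bin/bash
# Explicitly set ARM_SUBSCRIPTION_ID (required by azurerm 4.x) by reading
# the subscription currently selected in the Azure CLI.
set_subscription_from_cli() {
  export ARM_SUBSCRIPTION_ID=$(az account show --query id -o tsv)
}

# Usage in the runner script, before terraform init:
# set_subscription_from_cli
```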

Using terraform $* in a bash script adds significant flexibility when executing Terraform commands, allowing for a more dynamic and adaptable workflow. The $* is a special shell variable that expands to include all the arguments passed to the script. By appending $* after the Terraform command, you enable the script to accept any Terraform command, such as plan, apply, or destroy, along with additional flags or options. This allows you to pass in arguments on the fly without modifying the script itself.

For example, instead of writing separate scripts for different Terraform operations, such as terraform plan or terraform apply, you can use a single script that dynamically handles any Terraform command based on the input arguments. This is particularly useful when working locally or troubleshooting because it allows you to test various configurations, commands, and scenarios without needing to modify the script repeatedly. You can invoke the script to run any Terraform command; however, it works best if you scope the script to the Terraform core workflow commands, as they share a common set of command-line options — whether you are applying plans or destroying resources — and you can customize the execution by adding flags like -var-file or even -var if you so choose.
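The forwarding behaviour is easy to see with a plain shell function standing in for the terraform binary. (As a side note, "$@" preserves quoted arguments more faithfully than $*, which matters if any of your -var values contain spaces.)

```shell
#!/bin/bash
# A stand-in for the terraform binary, so we can see what the wrapper forwards.
fake_terraform() { echo "terraform $*"; }

# The wrapper pattern from the runner script: forward all script arguments.
run() { fake_terraform "$@"; }

run plan -var-file="env/dev.tfvars"
# prints: terraform plan -var-file=env/dev.tfvars
```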

Moreover, this approach ensures consistency in your workflow by incorporating any necessary environment variables or backend configurations before executing Terraform. This reduces errors and maintains a stable execution environment when testing locally. Let’s see how we can expand on this script to make it really useful.

Example 2: Advanced Runner Script with AzureRM Backend and Environment-Specific Variables

This next script example demonstrates how to configure the azurerm Terraform provider with authentication, remote state using Azure Storage, and which environment we want to manage. Since a single Terraform codebase can be used to manage a multitude of environments for a given application, we set the application_name and the environment_name input variables in a convenient spot to allow us to both specify and clearly see what environment we are targeting.

#!/bin/bash

# 1. Execution Context
export ARM_SUBSCRIPTION_ID=00000000-0000-0000-0000-000000000000
export ARM_TENANT_ID=your-tenant-id
export ARM_CLIENT_ID=your-client-id
export ARM_CLIENT_SECRET=your-client-secret

# 2. Workload Identity
export TF_VAR_application_name="marksblog"
export TF_VAR_environment_name="dev"

# 3. Backend Configuration
export BACKEND_RESOURCE_GROUP="fizz"
export BACKEND_STORAGE_ACCOUNT="buzz"
export BACKEND_STORAGE_CONTAINER="wizz"
export BACKEND_KEY="${TF_VAR_application_name}-${TF_VAR_environment_name}"

# 4. Initialize with Backend
terraform init \
    -backend-config="resource_group_name=${BACKEND_RESOURCE_GROUP}" \
    -backend-config="storage_account_name=${BACKEND_STORAGE_ACCOUNT}" \
    -backend-config="container_name=${BACKEND_STORAGE_CONTAINER}" \
    -backend-config="key=${BACKEND_KEY}"

# Execute Core Workflow
terraform $* \
    -var-file="env/${TF_VAR_environment_name}.tfvars"

  1. Execution Context: The Azure Subscription, Entra ID Tenant, and credentials to use when authenticating.
  2. Workload Identity: Unique Identifier for the workload — that is, the specific environment of a specific Application or Service.
  3. Backend Configuration: Where Terraform State is stored and how to locate the specific Terraform State file for this specific workload.
  4. Initialize with Backend: Initialize Terraform with the correct Backend Configuration.
  5. Execute Core Workflow: Run a Terraform core workflow command with the correct context. Using the $* technique allows us to execute this bash script with significant flexibility.

Setting the execution context is the most important step, as it specifies where in Azure you want to provision and which Entra ID identity will be doing the work.

The next most important thing is identifying exactly what workload you are trying to manage. You might be surprised by this, thinking the Backend configuration would be more important, but ultimately the Backend configuration is driven by what environment we are trying to deploy to, and that is driven by the logical constructs of “Application” and “Environment”. The Application or Service is the system or deployable unit we are targeting as a category, and the Environment is the long-lived instance of that system we are trying to deploy or manage.

The azurerm backend can be configured with dynamic variables for the Backend’s Storage Account. In order to locate the Terraform state file within this Azure Storage Account, you need to know which Azure Resource Group the storage account is in, which Blob Storage container the state file is in, and the unique key for the Terraform state file, known as the ‘Backend Key’.
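Put another way, the Backend Key in the script above is derived purely from the workload identifiers, so the same pattern locates the right state file for any application/environment pair:

```shell
#!/bin/bash
# Derive the Backend Key from the workload identity, as in the runner script.
# The application and environment names are the example values from above.
TF_VAR_application_name="marksblog"
TF_VAR_environment_name="dev"
BACKEND_KEY="${TF_VAR_application_name}-${TF_VAR_environment_name}"
echo "$BACKEND_KEY"
# prints: marksblog-dev
```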

Make Sure to Update Your Git Ignore

Last thing — and it’s kind of a big deal. Since the .debug.sh and .debug.tfvars files are intended to run only locally, it’s critical to ensure they are included in your standard Terraform .gitignore file. Here’s what mine typically looks like:

**/.DS_Store
**/.terraform*
**/terraform.tfstate*
**/.debug.sh
**/.debug.pkrvars.hcl
**/.debug.tfvars

This will prevent you from accidentally checking them into your version control system, which could expose sensitive information or cause inconsistencies in your infrastructure across environments. Always double-check your .gitignore file to include any debug or local configuration files to maintain a clean and secure codebase.
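You can sanity-check the patterns with git check-ignore before committing. This sketch builds a throwaway repository just for the demonstration, so nothing real is touched (it assumes git is on your PATH):

```shell
#!/bin/bash
# Verify the debug files are matched by the .gitignore patterns,
# using a throwaway repository created in a temp directory.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
printf '%s\n' '**/.debug.sh' '**/.debug.tfvars' > .gitignore
touch .debug.sh .debug.tfvars main.tf

git check-ignore -q .debug.sh && echo ".debug.sh is ignored"
git check-ignore -q .debug.tfvars && echo ".debug.tfvars is ignored"
git check-ignore -q main.tf || echo "main.tf is tracked"
```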

Conclusion

Executing Terraform locally gives you the control and flexibility needed to manage your infrastructure efficiently, especially in a “break glass” scenario when diagnosing and solving problems quickly. Setting up a proper local environment with a runner script ensures you can easily transition between local and automated workflows without disrupting your pipeline configurations. Whether you’re accessing secrets or handling complex state management, having a local setup helps maintain consistency and reduce errors when precision and accuracy are most necessary! With a little upfront effort, you can make your Terraform environment as powerful on your laptop as it is in your CI/CD pipeline.

Until next time — Happy Azure Terraforming!!!