Published September 11, 2023

One of the first things you do at any new software engineering job is download the code and run it locally. You probably need to install a bunch of stuff first. And then you need to give it environment variables containing all the secrets that let you connect to external services.

At 5 of the 6 full-time software engineering jobs I’ve held, I got these environment variables by Slacking my manager or another engineer (occasionally they would be shared through onetimesecret). Aside from the poor security of this process, it starts to unravel as soon as someone else on the team adds a new environment variable or needs to rotate a key. This has always frustrated me.

I’m currently working on a side project (maybe I’ll share more about this in a future note 👀), and even though it has a team size of 1, I decided to solve this problem upfront in a pretty simple way.

First, create the secrets in Google Cloud Secret Manager with Terraform. This ensures that the same secret names exist in every environment. To facilitate this, you can make a simple reusable module that replicates the secret resource across multiple projects.

variable "gcp_projects" {
	type = object({
    local   = string
    preview = string
    prod    = string
  })
}

variable "id" {
  type = string
}

variable "include_local" {
  type = bool
  default = false
}

resource "google_secret_manager_secret" "local" {
  count = var.include_local ? 1 : 0

  secret_id = var.id

  replication {
    automatic = true
  }

  project = var.gcp_projects.local
}

resource "google_secret_manager_secret" "preview" {
  secret_id = var.id

  replication {
    automatic = true
  }

  project = var.gcp_projects.preview
}

resource "google_secret_manager_secret" "prod" {
  secret_id = var.id

  replication {
    automatic = true
  }

  project = var.gcp_projects.prod
}
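Calling the module then looks something like this (the module path, secret name, and project IDs below are made up for illustration; substitute your own):

module "secret_stripe_api_key" {
  source = "./modules/gcp-secret"

  id            = "STRIPE_API_KEY"
  include_local = true # this one is a real secret even in local dev

  gcp_projects = {
    local   = "my-project-local"
    preview = "my-project-preview"
    prod    = "my-project-prod"
  }
}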

Note that this does not include the actual value of the secret; we’re just employing an infrastructure-as-code strategy for creating identical secret names across each environment.

However, some secrets might not actually be secret in the context of local development. For example, your database might be running in a Docker container on your laptop. That’s why the module has the include_local variable: secrets like these skip the local GCP project entirely. Instead, the root of my repository has a .env.local file, committed to Git, which contains all of my local “secrets that aren’t really secrets”.
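Something like this (the values here are illustrative, not from my actual project):

# Committed to the repo on purpose: nothing in here is sensitive.
DATABASE_URL=postgres://postgres:postgres@localhost:5432/app
REDIS_URL=redis://localhost:6379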

Next, I created a custom startup script (e.g. bin/start.js).

Step one is to read the .env.local file into memory using dotenv:

import fs from 'node:fs';
import path from 'node:path';
import dotenv from 'dotenv';

// Resolve the repository root whether the script is run from the repo
// root or from inside bin/
const rootDirectory = process.cwd().endsWith('bin')
  ? path.join(process.cwd(), '../')
  : process.cwd();

const localDefaults = dotenv.parse(
  fs.readFileSync(path.join(rootDirectory, '.env.local'))
);

Step two is to leverage the Google Cloud CLI to retrieve all of the secrets and their values. For launching these shell commands I am using execa.

import { $ } from 'execa';

// DEV_PROJECT_ID is the ID of the GCP project that local development
// pulls its secrets from.
await $`gcloud config set project ${DEV_PROJECT_ID}`;

let gcpLocalSecretList = [];

try {
  // List every secret in the project as JSON so we can fetch the values below.
  const { stdout: gcpSecretsListCommandJson } =
    await $`gcloud secrets list --format json --project ${DEV_PROJECT_ID} --page-size unlimited`;
  gcpLocalSecretList = JSON.parse(gcpSecretsListCommandJson.toString());
} catch (error) {
  console.error(
    'Failed to fetch secrets from GCP, double check you are logged in "gcloud auth list" and have the correct permissions to access the dev project in GCP.'
  );
  console.error(error);
  process.exit(1);
}

const gcpLocalSecrets = {};

for (const gcpSecret of gcpLocalSecretList) {
  // gcloud returns fully qualified names (projects/<project>/secrets/<name>),
  // so keep just the trailing secret name.
  const gcpSecretName = gcpSecret.name.split('/').pop();
  const { stdout: gcpSecretValue } =
    await $`gcloud secrets versions access latest --secret ${gcpSecretName} --project ${DEV_PROJECT_ID}`;
  gcpLocalSecrets[gcpSecretName] = gcpSecretValue.toString().trim();
}
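From there it’s just a matter of merging the two sources and handing them to the app. Roughly (the precedence and the npm run dev command here are assumptions; adjust for your setup):

// Assumption: values pulled from GCP win over the committed local defaults.
// Flip the spread order if you want .env.local to take precedence.
const env = { ...process.env, ...localDefaults, ...gcpLocalSecrets };

// Start the app with the combined environment (npm run dev is a placeholder
// for whatever actually boots your dev server).
await $({ env, stdio: 'inherit' })`npm run dev`;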

PS: I have no idea if ‘unlimited’ actually works as a --page-size value. It’s a valid argument for --limit according to the CLI docs, but I am currently too lazy to generate several hundred secrets and figure out the best practices for pagination in the gcloud secrets command.