How to run your lambda code locally as its role (for testing)

While AWS Lambda is fantastic at providing a serverless platform with few worries about maintaining servers, it is not the easiest thing to test in an automated fashion with rapid feedback.

You could write end-to-end tests, but that means a deployment after each change and then checking the logs to see what failed. Even if you use IaC (Terraform/Pulumi), the deployment will take seconds or a minute or two – not exactly rapid test feedback.

What I have been doing is to set up a hook that is called from the lambda handler and can also be called locally (sketched below). Within the test, I then assume the role that the lambda runs as and test the hook.

This mechanism allows me to easily test that the permissions are set up correctly and that the details are in place for the code to work.

For the full end-to-end test, I then have a simple smoke test or two.

The code samples are in golang (only because it happens to be my current language of choice), but the idea should be equally applicable in other languages.
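
To make the hook idea concrete, here is a minimal sketch – MyEvent and Run are hypothetical stand-ins, not names from the repo example:

	package main

	import (
		"context"

		"github.com/aws/aws-lambda-go/lambda"
		"github.com/aws/aws-sdk-go-v2/aws"
		"github.com/aws/aws-sdk-go-v2/config"
	)

	// MyEvent is a hypothetical event payload for the example.
	type MyEvent struct {
		Name string `json:"name"`
	}

	// HandleRequest is the lambda entry point: it only builds the AWS
	// config and delegates the real work to the Run hook.
	func HandleRequest(ctx context.Context, event MyEvent) (string, error) {
		cfg, err := config.LoadDefaultConfig(ctx)
		if err != nil {
			return "", err
		}
		return Run(ctx, cfg, event)
	}

	// Run contains the actual logic. A local test can call it directly,
	// passing in a config whose credentials come from the assumed lambda role.
	func Run(ctx context.Context, cfg aws.Config, event MyEvent) (string, error) {
		// ... real work using cfg goes here ...
		return "ok", nil
	}

	func main() {
		lambda.Start(HandleRequest)
	}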

Assuming The Role

Using the AWS SDK for Go v2, the imports needed are:

	import (
		"context"
		"log"
		"os"

		"github.com/aws/aws-sdk-go-v2/aws"
		"github.com/aws/aws-sdk-go-v2/config"
		"github.com/aws/aws-sdk-go-v2/credentials/stscreds"
		"github.com/aws/aws-sdk-go-v2/service/sts"
	)

and the code to assume the role looks like this:

	roleToAssume := os.Getenv("AUTH_LAMBDA_ROLE_ARN")

	ctx := context.TODO()
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal("error: ", err)
	}

	// Create a credentials provider that assumes the role referenced by
	// roleToAssume, authenticating with the caller's default credentials.
	creds := stscreds.NewAssumeRoleProvider(sts.NewFromConfig(cfg), roleToAssume)

	log.Printf("creds: %v", creds)

	// Swap the config's credentials so any client built from it acts as the role.
	cfg.Credentials = aws.NewCredentialsCache(creds)

The cfg is then passed into the NewFromConfig constructor for the service you are interested in, e.g.:

	ssmClient := ssm.NewFromConfig(cfg)
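
Within a test, the assumed-role client can then be used to verify the lambda's permissions directly. A minimal sketch – assumeLambdaRole and the parameter name are hypothetical:

	import (
		"context"
		"testing"

		"github.com/aws/aws-sdk-go-v2/aws"
		"github.com/aws/aws-sdk-go-v2/service/ssm"
	)

	func TestHookAsLambdaRole(t *testing.T) {
		ctx := context.Background()
		// assumeLambdaRole is a hypothetical helper wrapping the AssumeRole
		// setup shown above; it returns an aws.Config acting as the lambda role.
		cfg := assumeLambdaRole(ctx, t)

		// The client is built from the assumed-role config, so this call
		// exercises the lambda role's permissions, not the developer's.
		ssmClient := ssm.NewFromConfig(cfg)
		out, err := ssmClient.GetParameter(ctx, &ssm.GetParameterInput{
			Name: aws.String("/auth-lambda/client-id"), // hypothetical parameter name
		})
		if err != nil {
			t.Fatalf("lambda role cannot read parameter: %v", err)
		}
		t.Logf("value: %s", *out.Parameter.Value)
	}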

Working Example

You can find a full, working example test in my GitHub repo under post/2023/11/autolambdatest.

NOTE: It WILL automatically try to deploy a role and an SSM parameter, and it will delete them after the test.

The BeforeSuite will deploy the minimum configuration needed to run the test, and the AfterSuite will destroy the same stack.
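
BeforeSuite/AfterSuite suggests Ginkgo; in outline, the suite setup looks something like this sketch (the stack name and work directory are assumptions):

	import (
		"context"

		"github.com/onsi/ginkgo/v2"
		"github.com/pulumi/pulumi/sdk/v3/go/auto"
	)

	var stack auto.Stack

	var _ = ginkgo.BeforeSuite(func() {
		ctx := context.Background()
		var err error
		// Create or select the test stack from the IaC program on disk.
		stack, err = auto.UpsertStackLocalSource(ctx, "test", "./iac") // hypothetical stack name and dir
		if err != nil {
			ginkgo.Fail(err.Error())
		}
		// Deploy the minimal role + SSM parameter configuration.
		if _, err := stack.Up(ctx); err != nil {
			ginkgo.Fail(err.Error())
		}
	})

	var _ = ginkgo.AfterSuite(func() {
		// Destroy the same stack so nothing is left behind after the test.
		if _, err := stack.Destroy(context.Background()); err != nil {
			ginkgo.Fail(err.Error())
		}
	})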

You will likely need to log into Pulumi to get this test to work.

If you run into a permissions issue with AssumeRole, read on.

AssumeRole Permissions

For this to work, the user running the tests needs permission to call AssumeRole.

There are two steps to this. The first part is to allow “anyone” in the account to assume the relevant role, via the role’s trust policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<your-account-id>:root"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

This will allow “any user” to assume the role, as long as they have the permission to do so.

arn:aws:iam::[your-account]:root is a special principal that represents the account (and the non-IAM root user). Since IAM entities (users, roles etc.) exist under the “root” account, all calls are also authenticated by this account – i.e. all users, roles etc. in IAM are also this account. There is a post on Reddit discussing what exactly the root IAM principal is, if you want more information.

Finally, unless you have the AdministratorAccess policy attached to your user, you will also need to attach a policy to the relevant group (or your user) that grants permission to call sts:AssumeRole (or *):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "123",
            "Effect": "Allow",
            "Action": [
                "sts:AssumeRole"
            ],
            "Resource": [
                "arn:aws:iam::123456789012:role/desired-role"
            ]
        }
    ]
}

You can, of course, also use * for Resource above to allow the user/group to assume any role. In practice, you might want to automate this as part of the creation of the relevant roles (i.e. create the role, then give the relevant group permission to assume that role).
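
In Pulumi, that automation could look roughly like this – a sketch only: the account id, group name and resource names are placeholders, and a real lambda execution role would also need lambda.amazonaws.com as a trusted principal:

	package main

	import (
		"github.com/pulumi/pulumi-aws/sdk/v6/go/aws/iam"
		"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
	)

	func main() {
		pulumi.Run(func(ctx *pulumi.Context) error {
			// Create the role with a trust policy allowing principals in the
			// account to assume it ("123456789012" is a placeholder account id).
			role, err := iam.NewRole(ctx, "authLambdaRole", &iam.RoleArgs{
				AssumeRolePolicy: pulumi.String(`{
					"Version": "2012-10-17",
					"Statement": [{
						"Effect": "Allow",
						"Principal": {"AWS": "arn:aws:iam::123456789012:root"},
						"Action": "sts:AssumeRole"
					}]
				}`),
			})
			if err != nil {
				return err
			}

			// Grant the (hypothetical) developers group permission to assume
			// exactly this role, using the role's ARN output.
			_, err = iam.NewGroupPolicy(ctx, "assumeAuthLambdaRole", &iam.GroupPolicyArgs{
				Group: pulumi.String("developers"),
				Policy: pulumi.Sprintf(`{
					"Version": "2012-10-17",
					"Statement": [{
						"Effect": "Allow",
						"Action": "sts:AssumeRole",
						"Resource": "%s"
					}]
				}`, role.Arn),
			})
			return err
		})
	}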

Separating out integration tests for golang in Bazel

Why

There are many kinds of automated tests; two main kinds are integration tests and unit tests.

Unit tests are designed to run as fast as possible, so any slower dependencies like databases are mocked out. While super helpful and powerful in terms of providing confidence in the software, they should be only one part of the testing strategy.

Integration tests, as the name implies, test the different parts of the software integrated together. Technically speaking, you can still mock out the database and other slower layers to keep them running quickly. However, there is value in including a database or other slower services in the process, so that they are tested in an automated fashion too.

What this does mean, though, is that you want to be able to run only the unit tests, or to run the integration tests as well. You might also want to have smoke tests, which are run against your live production environment.

How

You could define a separate target in your BUILD file with just the unit tests, and let gazelle automatically build your default test target with all the tests. I found this frustrating to use, as I had to keep tweaking the dependencies manually whenever anything changed (which happened often).

Tagging

The easiest way to achieve this for golang and bazel is to tag your source code files. You can do this by adding the following to the top of your integration test files:
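
The full snippet is behind the link below, but the standard Go mechanism for this is a build constraint at the very top of the file, presumably along these lines:

	//go:build integration
	// +build integration

	package mypackage_test

Gazelle and rules_go understand build constraints, so these files drop out of the default test target; rules_go then has a tags build setting to switch them back on (something like --@io_bazel_rules_go//go/config:tags=integration – check the rules_go docs for the exact flag).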

Continue reading

Streaming Progress With Pulumi Automation API

When using the pulumi Automation API, you lose some of the niceties of the pulumi CLI: you have to set up command-line argument processing yourself, and the output is not as friendly or pretty as before. It also doesn’t stream the output – though this one is easier to fix.

This is lifted straight out of their golang example code, so if you’re working in another language, you should be able to find the relevant code in the same repo.

	// wire up our update to stream progress to stdout
	// (optup is github.com/pulumi/pulumi/sdk/v3/go/auto/optup)
	stdoutStreamer := optup.ProgressStreams(os.Stdout)

	// run the update to deploy our fargate web service
	res, err := stack.Up(ctx, stdoutStreamer)
	if err != nil {
		fmt.Printf("Failed to update stack: %v\n\n", err)
	}

How to get current Function URL (aws-lambda + golang)

I am deploying a lambda function that needs details of its own function URL: it is an OAuth callback, and needs to calculate the redirect URL. There are possible security issues doing it this way, so I will switch to an HTTP API gateway on launch. In the meantime, though, I ran into a bit of a chicken-and-egg problem.

In Pulumi, the function URL is created after the function itself – and even otherwise, I can’t pass the output of the lambda (or the lambdaFunctionUrl) back into that same lambda as an environment variable. Fortunately, there is an easy way to pick up the Function URL (or the function name for that matter) – if you know how 😉

	// request is the events.LambdaFunctionURLRequest passed into the handler
	domainName := request.RequestContext.DomainName
	// the function name is provided by the lambda runtime environment
	funcName := os.Getenv("AWS_LAMBDA_FUNCTION_NAME")
	return fmt.Sprintf("Domain: %s, funcName: %s", domainName, funcName), nil

There are other defined lambda runtime environment variables as well that you can use.
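
For example, inside the handler (these are all in the documented list of Lambda runtime environment variables):

	// All documented AWS Lambda runtime environment variables:
	log.Printf("region=%s memory=%s version=%s logGroup=%s",
		os.Getenv("AWS_REGION"),
		os.Getenv("AWS_LAMBDA_FUNCTION_MEMORY_SIZE"),
		os.Getenv("AWS_LAMBDA_FUNCTION_VERSION"),
		os.Getenv("AWS_LAMBDA_LOG_GROUP_NAME"))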

Bazel + Pulumi Automation API. Deploying an inline stack along with Pulumi.[stack].yaml

I believe in CI/CD/CD, as in continuous integration, delivery and deployment. As part of this, I am setting up a workflow where, on merge to develop (or main/trunk), the deployment is triggered automatically: Pulumi deploys the current state of code and infrastructure from GitHub Actions, authenticating via OpenID Connect (OIDC).

I used to configure Pulumi to be triggered directly from the build process, but bazel (as far as I know) does not support Pulumi. When I used Pants, there was a custom module developed by one of the community members which did support pulumi (you might have to ask in the Slack channel if you’re interested), but they stopped maintaining it as they moved to the Pulumi Automation API.

I am using the Automation API from the start, and configuring a “deployer” per product/project within the monorepo. The intention is for the deployer to be as smart as possible – eventually up-ing only the stacks that have changed since the last run – but that’s a way down the line.

Another benefit of the Automation API is picking up the stack outputs automatically when running integration/e2e tests, which makes the test configuration smoother.
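
Something along these lines (a sketch: the stack name, work directory and output key are all placeholders):

	import (
		"context"
		"log"

		"github.com/pulumi/pulumi/sdk/v3/go/auto"
	)

	func stackOutputs() {
		ctx := context.Background()
		// Select the already-deployed stack ("dev" and "./iac" are placeholders).
		stack, err := auto.SelectStackLocalSource(ctx, "dev", "./iac")
		if err != nil {
			log.Fatal(err)
		}
		outs, err := stack.Outputs(ctx)
		if err != nil {
			log.Fatal(err)
		}
		// Feed an output (hypothetical key) straight into the test configuration.
		roleArn := outs["authLambdaRoleArn"].Value.(string)
		log.Printf("testing against role %s", roleArn)
	}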

Continue reading

Including a built artifact in another target (Bazel, golang)

We use pulumi to do IaC, and we use a monorepo with Bazel as the build tool. We have our modules set out as follows.

One of the requirements we have is to build a lambda module and then deploy it. The lambda module is a target being built by Bazel (golang, but shouldn’t matter):

go_binary(
    name = "lambda_module",
    visibility = ["//visibility:public"],
)

We then have the iac module, which should get the built version of the above module so that it can then upload it to Lambda:

go_binary(
    name = "iac",
    args = [
        "-lambda_module",
        "$(location //products/productA/module/lambda_module)",
    ],
    data = ["//products/productA/module/lambda_module"],
    visibility = ["//visibility:public"],
)
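
On the Go side, the deployer can then read the path from the flag that Bazel passes in. A minimal sketch (the zip/upload step is elided):

	package main

	import (
		"flag"
		"log"
	)

	// The path to the built lambda binary arrives via the -lambda_module flag,
	// already expanded from $(location ...) by Bazel.
	var lambdaModule = flag.String("lambda_module", "", "path to the built lambda binary")

	func main() {
		flag.Parse()
		log.Printf("deploying lambda binary from %s", *lambdaModule)
		// ... zip the binary and hand it to the pulumi program (not shown) ...
	}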

Continue reading