Ran Isenberg

Start Your AWS Serverless Service With Two Clicks

Updated: Mar 4, 2023


Two clicks, go!

Building a Serverless SaaS service is not an easy task.

Teams face challenges such as:

How do you deploy to the cloud?

How do you handle observability in the Serverless domain?

What makes an AWS Lambda handler resilient, traceable, and easy to maintain?

How do you write such code?

How do you test your code?


What if you could have a more accessible gateway to the fantastic world of AWS Serverless and get a production-ready solution for these challenges?


In this blog, I will present my open-source template project that provides a fully deployable service (with AWS CDK), a CI/CD pipeline, 100% test coverage, and an AWS Lambda handler that follows all the best practices.

Start your Serverless journey with just TWO mouse clicks in GitHub TODAY.

This blog focuses on the AWS Lambda Cookbook GitHub Template project presented in the first six parts of the AWS Lambda Cookbook blog series.

In case you missed the previous posts:

  • Part 1 focused on Logging.

  • Part 2 focused on Observability: monitoring and tracing.

  • Part 3 focused on Business Domain Observability.

  • Part 4 focused on Environment Variables.

  • Part 5 focused on Input Validation.

  • Part 6 focused on Dynamic configuration and smart feature flags.

  • Part 8 focused on AWS CDK Best Practices.

 

Serverless Service Template - The Orders Service


The template project we will use is a simple order service.

It has an API GW that triggers an AWS Lambda function under the POST /api/orders path.

It stores all orders in an Amazon DynamoDB table.

It also deploys and stores dynamic configuration and feature flags in AWS AppConfig.

Read more about it here.
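To make the architecture concrete, here is a minimal CDK sketch of such a stack. It is an illustration only: the construct names, asset path, and handler module are assumptions, not the template's actual code, and the AppConfig part is omitted for brevity.

```python
# Illustrative sketch of the Orders service infrastructure (not the template's actual code).
from aws_cdk import Stack, aws_apigateway as apigw, aws_dynamodb as dynamodb
from aws_cdk import aws_lambda as _lambda
from constructs import Construct


class OrdersServiceStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # DynamoDB table that stores the orders
        table = dynamodb.Table(
            self, 'OrdersTable',
            partition_key=dynamodb.Attribute(name='order_id', type=dynamodb.AttributeType.STRING),
        )

        # Lambda function that handles POST /api/orders
        handler = _lambda.Function(
            self, 'CreateOrderHandler',
            runtime=_lambda.Runtime.PYTHON_3_9,
            handler='service.handlers.create_order.handler',  # hypothetical module path
            code=_lambda.Code.from_asset('.build/lambdas'),    # hypothetical asset path
            environment={'TABLE_NAME': table.table_name},
        )
        table.grant_write_data(handler)

        # API Gateway REST API that triggers the function under /api/orders
        api = apigw.RestApi(self, 'OrdersApi')
        orders = api.root.add_resource('api').add_resource('orders')
        orders.add_method('POST', apigw.LambdaIntegration(handler))
```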

You are also getting:

  1. CI/CD pipeline based on GitHub Actions with Python and security best practices (linters, code formatters, etc.).

  2. CDK 2 deployable project.

  3. AWS Lambda handler files with all the best practices described in my best practices blog series.

  4. Makefile utility that makes development straightforward.

  5. Unit tests, integration tests, infrastructure, security and E2E tests.

  6. 100% code coverage including security tests.

 

TL;DR Video

This blog post was presented at a conference and is available as a video.



 

Getting Started

Click the 'Use this template' button, next to the 'About' section.

Next, fill in your repository name for your new & shiny AWS Serverless application.

Select the 'Private' option; choose 'Public' only if you intend to open-source the project.

Click on 'Create repository from this template.'

For this example, I chose 'my-service' as the repository name.


 

A New AWS Serverless Application Is Born

Congratulations! You have created your very own Python 3.9 Serverless application.


Project Settings

Head over to the project's 'General' settings and enable the different scanners under 'Security'. These settings will keep you safe and up to date. Make sure to enable code scanning as well. You can read more about Dependabot integration here.

[Screenshots: the security settings before and after enabling the scanners]



 

Getting Started With Local Development Environment


Clone The Project


If you are unsure how to clone your project and set up your credentials properly, follow this guide.


Install Prerequisites

Go to the template's official documentation and follow the getting started guide.


Create Virtual Environment

Open the project via IDE. Start a terminal.

We use poetry to manage Python dependencies.

Run 'make dev'. This command installs all the required project dependencies and opens a new virtual environment shell.

Makefile

The project has a fully working Makefile covering all aspects of the pipeline: code formatting, linters, an import sorter, and tests.

You can read more about the different makefile commands here.

 

Deploy To AWS

Run 'make deploy'.

Once CDK finishes the deployment, you will find a new stack called 'cookbook' in AWS CloudFormation. Now you can run integration and E2E tests.

When you want to delete the stack, run 'make destroy'.

Run Tests

Run 'make pr'. This command runs all the required checks: pre-commit hooks, linters, code formatters, pylint, and tests, so you can be sure GitHub's pipeline will pass.

If there's an error in the pre-commit stage, it gets auto-fixed; however, you must run 'make pr' again so it continues to the next stages.

Be sure to commit all the changes that 'make pr' makes.

Unit Tests

Unit tests can be found under the tests/unit folder.

You can run the tests using the following command: 'make unit'.
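For a flavor of what a unit test looks like, here is a hedged sketch in the style of tests/unit. The 'CreateOrderRequest' schema is defined inline for the example and is not the template's actual code:

```python
# Illustrative unit test; the schema below is a stand-in, not the template's real schema.
import pytest
from pydantic import BaseModel, ValidationError, constr


class CreateOrderRequest(BaseModel):
    customer_name: constr(min_length=1, max_length=20)
    order_item_count: int


def test_valid_order_request():
    request = CreateOrderRequest(customer_name='RanTheBuilder', order_item_count=5)
    assert request.order_item_count == 5


def test_invalid_order_request_raises():
    # an empty customer name violates the schema and should raise a validation error
    with pytest.raises(ValidationError):
        CreateOrderRequest(customer_name='', order_item_count=5)
```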


Infrastructure & Security Tests

The tests can be found under the tests/infrastructure folder.

You can run the tests using the following command: 'make infra-tests'.

Read more about these tests in my CDK best practices guide.
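As an illustration, an infrastructure test typically asserts on the synthesized CloudFormation template using aws_cdk.assertions. The stack class and its import path below are hypothetical:

```python
# Illustrative infrastructure test in the style of tests/infrastructure.
from aws_cdk import App
from aws_cdk.assertions import Template

from cdk.orders_stack import OrdersServiceStack  # hypothetical import path


def test_stack_has_expected_resources():
    app = App()
    stack = OrdersServiceStack(app, 'cookbook-test')
    template = Template.from_stack(stack)

    # the service should synthesize a DynamoDB table, a REST API, and a Python 3.9 function
    template.resource_count_is('AWS::DynamoDB::Table', 1)
    template.resource_count_is('AWS::ApiGateway::RestApi', 1)
    template.has_resource_properties('AWS::Lambda::Function', {'Runtime': 'python3.9'})
```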


Integration Tests

Make sure you deploy the stack first: these tests invoke your Lambda handler LOCALLY, but the handler communicates with real AWS services.

These tests allow you to debug your AWS Lambda function in your IDE.

Integration tests can be found under the tests/integration folder.

You can run the tests using the following command: 'make integration'.
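Here is a hedged sketch in the style of tests/integration. The handler import path and the event shape are assumptions; the template's real tests build a fuller API Gateway event and a fake Lambda context:

```python
# Illustrative integration test: invokes the handler locally against the deployed stack's resources.
import json

from service.handlers.create_order import handler  # hypothetical import path


def test_create_order_locally():
    # minimal API Gateway proxy event; the real tests construct a complete event and context
    event = {
        'body': json.dumps({'customer_name': 'RanTheBuilder', 'order_item_count': 5}),
        'path': '/api/orders',
        'httpMethod': 'POST',
    }
    response = handler(event, None)
    assert response['statusCode'] == 200
```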


E2E Tests

Make sure you deploy the stack first.

E2E tests can be found under the tests/e2e folder.

These tests send a 'POST' message to the deployed API GW and trigger the Lambda function on AWS.

You can run the tests using the following command: 'make e2e'.
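As a rough sketch in the style of tests/e2e, an end-to-end test resolves the deployed API Gateway URL (here from an environment variable, which is an assumption, not the template's mechanism) and sends a real POST request:

```python
# Illustrative E2E test: calls the deployed API Gateway endpoint over the network.
import os

import requests


def test_post_order_end_to_end():
    api_url = os.environ['ORDERS_API_URL']  # e.g. taken from the stack outputs after 'make deploy'
    body = {'customer_name': 'RanTheBuilder', 'order_item_count': 5}
    response = requests.post(f'{api_url}/api/orders', json=body, timeout=10)
    assert response.status_code == 200
```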

 

Project Structure

The project is composed of two inner projects: an infrastructure-as-code (CDK) project and the service code (the AWS Lambda handler code with its tests).

Folders:

  1. CDK folder: under the 'cdk' folder. A CDK app deploys a stack that consists of one or more CDK constructs. CDK constructs contain AWS resources and their connections/relations. Read more about the CDK project details here and here.

  2. docs folder: GitHub pages documentation folder.

  3. service folder: the AWS Lambda project files: handlers, input/output schemas, and a utility folder. Each handler has its own schemas folder; the utils folder holds code shared across multiple handlers.

  4. tests folder: unit/integration/infra and E2E tests.



 

CI/CD Pipeline

You can read more about it here.

The pipeline uses Makefile commands and is based on GitHub Actions.

The actions can be found under the '.github/workflows' folder.

Adding New Code

When you add a new AWS Lambda handler or any AWS resource, you will add the IaC that deploys it in the 'cdk' folder, the handler code in the service folder, and the tests for it in the tests/integration and tests/e2e folders.
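As a hedged starting point, a new handler in the service folder might look like the skeleton below. It uses AWS Lambda Powertools for logging, tracing, and metrics as covered in the series; the names are illustrative, and the real template wires in more (environment variables parsing, input validation schemas, and dynamic configuration):

```python
# Illustrative skeleton for a new handler in the service folder (names are hypothetical).
from aws_lambda_powertools import Logger, Metrics, Tracer
from aws_lambda_powertools.metrics import MetricUnit
from aws_lambda_powertools.utilities.typing import LambdaContext

logger = Logger()
tracer = Tracer()
metrics = Metrics(namespace='my-service')


@logger.inject_lambda_context(log_event=True)  # structured logging with correlation data
@metrics.log_metrics                           # flush CloudWatch EMF metrics at the end
@tracer.capture_lambda_handler                 # X-Ray tracing of the handler
def handler(event: dict, context: LambdaContext) -> dict:
    logger.info('processing new request')
    metrics.add_metric(name='ValidCreateOrderEvents', unit=MetricUnit.Count, value=1)
    # business logic goes here; return an API Gateway proxy response
    return {'statusCode': 200, 'body': '{}'}
```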


Got Questions?

Start a new discussion over at the GitHub discussions board of the template.

Want to learn more?

Follow the official template documentation here.

Click here to read the AWS Lambda Cookbook blog series.


