This repo demonstrates how to deploy an AWS Lambda function from a Dockerized image. It uses the following tools:
- AWS CLI (Makefile interactions)
- An AWS account with sufficient permissions
- Terraform (IaC)
- Docker (containers)
The repo lets you choose between the Python base image provided by AWS (`public.ecr.aws/lambda/python:3.XX`) or a custom image of your choice (see `Dockerfile` and `Dockerfile.custom` for each respective option).
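For the AWS base-image route, a minimal Dockerfile could look like the sketch below. The Python version, `requirements.txt`, and the `app.handler` module/function names are illustrative assumptions, not taken from this repo:

```dockerfile
# Sketch only: AWS-provided Lambda Python base image (pin the version you need)
FROM public.ecr.aws/lambda/python:3.12

# Install dependencies into the image
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the function code (an app.py exposing a `handler` function is assumed)
COPY app.py ${LAMBDA_TASK_ROOT}

# Entry point in "<module>.<function>" form
CMD ["app.handler"]
```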
Set your environment variables in the `.set-env.sh` file in the `tf/` folder.
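The exact variables depend on your setup; a hypothetical `.set-env.sh` could look like this (all names and values below are illustrative, not taken from the repo):

```shell
# Illustrative only: variable names and values are assumptions
export AWS_REGION="eu-west-1"
export AWS_PROFILE="default"
export REPOSITORY_NAME="my-lambda-image"
```

Source it before running the make targets, e.g. `. tf/.set-env.sh`.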
The whole infrastructure can be deployed with the following make commands:

```shell
make create-registry
make deploy-lambda
```

In `tf/ecr` you will find the Terraform configuration to deploy the ECR registry.
The config creates an output `registry_url` that can be reused when deploying the Lambda function.
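Assuming standard Terraform CLI usage, the output value can be read back at any time after the apply:

```shell
# Run from tf/ecr, after terraform apply
terraform output -raw registry_url
```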
⏩ refer to the official documentation for more information
First, log in to the created ECR registry (`docker login --password-stdin` expects the password on stdin, which is the standard AWS CLI flow):

```shell
aws ecr get-login-password | docker login --username AWS --password-stdin $REPO_URL
```

Then build and push the image:
```shell
docker build --no-cache --platform linux/amd64 -t $REPOSITORY_NAME:latest .
docker tag $REPOSITORY_NAME:latest $REPO_URL:latest
docker push $REPO_URL:latest
```

Where:
- `$REPOSITORY_NAME` is the name (tag) of your Docker image
- `$REPO_URL` is the full URL to access your image (`<$AWS_ECR_URL/$REPOSITORY_NAME>`)
In the `tf/lambda` folder you will find the configuration to deploy your Lambda function.
💁 Terraform will prompt you for `registry_url`; you can fetch the value from step 2.
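To avoid the interactive prompt, the value can also be passed on the command line. This is a sketch that assumes the ECR config lives in `tf/ecr` relative to `tf/lambda`:

```shell
# Pass registry_url explicitly instead of answering the prompt
terraform apply -var "registry_url=$(terraform -chdir=../ecr output -raw registry_url)"
```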
- Use Poetry in the Dockerfile; Poetry files have been added if you want to use them in the Dockerfile
- The Lambda function occasionally gets a KMS access-denied error when reading from the registry; the deployment is not idempotent with respect to that error