Creating a Lambda Function with a Container-Based Runtime


In this article we’ll host a Python application on Amazon Elastic Container Registry (Amazon ECR) and then create a Lambda function using this hosted image as the container runtime.

Published Dec 28, 2023

Amazon Elastic Container Registry (ECR) is a managed AWS service for hosting your Docker images. It is reliable, supports secure deployment, and lets you create both public and private repositories so that you can control access to your images. AWS Lambda is a serverless platform that allows you to upload your code as a zip file or a container image without the hassle of managing infrastructure.
In this article, you’ll first create an ECR repository. Then, you’ll create a Cloud9 environment and configure a simple Python application. You’ll use the integrated terminal in Cloud9 to containerize this application and then push the Docker image to the ECR repository. You’ll then create a Lambda function using this container image as the runtime and then test this function.
By the end of this article, you’ll have hands-on experience in creating and managing Docker containers and Lambda functions, essential skills for any AWS architect.
The architecture diagram will look like the following:
Lambda Function as Container Runtime

You need an AWS account with administrative access to complete this article. If you don’t have an AWS account, you can create a free-tier account on the AWS website.
I am going to use my own AWS account to carry out all the steps.

Let’s start by creating a repository for the container image in Amazon ECR. Later, you’ll build a container image in the Cloud9 environment, and that image will serve as the runtime for the Lambda function.
  1. Log in to the AWS Console. Select Paris (eu-west-3) as the region.
  2. Go to the Amazon Elastic Container Registry (ECR) console, click on the Repositories menu on the left, and then click on the Create repository button.
    ECR1
  3. On the next screen, select Private for the visibility setting and enter samplerepo_piyush as the repository name. Keep the rest of the configuration at the defaults and click on the Create repository button.
    ECR2
    samplerepo_piyush
  4. The repository is created in no time. Select the newly created repository and click on the View push commands button.
    ECR3
    repo_push_commands
  5. In the popup window, you can see the commands used to push an image to the repository from the development environment. You will use these commands later in the Cloud9 IDE.
    ECR4
    ECR_push_commands
  6. The repository is ready. Next, you create and configure a Cloud9 environment for Docker.
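If you prefer the command line, the same repository can be created with the AWS CLI. This is a sketch, assuming your credentials are already configured and you are working in the Paris region:

```shell
# Create the private ECR repository from the terminal
# (equivalent to the console steps above).
aws ecr create-repository \
    --repository-name samplerepo_piyush \
    --region eu-west-3
```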

You now launch an AWS Cloud9 environment, which serves as the development environment for building the container image.
  1. Go to the AWS Cloud9 console and click on the Create environment button.
    Cloud9
    piyushdemoenv
  2. On the next screen, enter piyushdemoenv as the name and click on the Next step button.
    piyushdemoenv_cloud9
  3. On the next screen, select Create a new instance for environment (EC2) as the Environment type. Select t2.micro (1 GiB RAM + 1 vCPU) as the Instance type. Select Ubuntu Server 22.04 LTS as the Platform, so the development environment runs Ubuntu. Keep the rest of the fields at their default values and click on the Next step button.
    Cloud9config
  4. By default, AWS Systems Manager (SSM) access is enabled for the Cloud9 environment's EC2 instance, and the necessary IAM resources are created automatically.
  5. On the next screen, click on the Create environment button.
  6. It will take a couple of minutes to create the environment. Wait for it to be ready. Once it is, you can see a terminal window in the bottom part of the screen, which provides console-based access to the development machine.
  7. Docker is already installed and configured on this Cloud9 environment machine. Run the following command to check the Docker version.
    docker --version
    Cloud9_docker_version
  8. With the environment ready, it is time to create the container image.

You now create a container image and upload it to the ECR repository. This image is used as the runtime for the Lambda function in the next step.
In the AWS Cloud9 IDE, create a new file app.py and save it with the following code. This is the handler code for the Lambda function; it simply returns a success message.
import json

def handler(event, context):
    body = {
        "message": "You reached to the container runtime"
    }

    response = {
        "statusCode": 200,
        "body": json.dumps(body)
    }
    return response
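Before containerizing, you can sanity-check the handler from the Cloud9 terminal. The snippet below recreates app.py and invokes the handler with an empty event (python3 is assumed to be available on the machine):

```shell
# Recreate the handler file and invoke it locally to verify
# the response before building the container image.
cat > app.py <<'EOF'
import json

def handler(event, context):
    body = {
        "message": "You reached to the container runtime"
    }

    response = {
        "statusCode": 200,
        "body": json.dumps(body)
    }
    return response
EOF
python3 -c "from app import handler; print(handler({}, None))"
```

You should see a dictionary with statusCode 200 and the success message in the body.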
  1. pythonappfile
    In the AWS Cloud9 IDE, create another file named Dockerfile, which holds the Docker instructions for the image.
FROM public.ecr.aws/lambda/python:3.8
ADD app.py /var/task/
CMD [ "app.handler" ]
  1. Dockerfile
    The Dockerfile above is very simple. You use public.ecr.aws/lambda/python:3.8 as the base image, add the handler file app.py to the /var/task/ folder, and set app.handler as the command the Lambda runtime invokes. The files are ready. You will now use the push commands from the ECR repository to build and upload the container image. Run the following command in the terminal to authenticate to the ECR registry. Replace <Region-Code> with the code of the region you are using and <Account-Number> with the account number of the AWS account you are using.
aws ecr get-login-password --region <Region-Code> | docker login --username AWS --password-stdin <Account-Number>.dkr.ecr.<Region-Code>.amazonaws.com
    ECR_Login
    Next, run the following command in the terminal to build the container image. There is a dot at the end of the command, so be sure to copy the complete command.
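The build command shown in the push-commands popup typically looks like the following; the trailing dot sets the build context to the current directory. The optional docker run and curl lines use the Runtime Interface Emulator, which is bundled in the AWS Lambda base images, to invoke the handler locally before pushing:

```shell
# Build the image from the Dockerfile in the current directory
# (note the trailing dot).
docker build -t samplerepo_piyush .

# Optional local test: the AWS Lambda base images include the
# Runtime Interface Emulator, which listens on port 8080.
docker run -d -p 9000:8080 samplerepo_piyush
curl "http://localhost:9000/2015-03-31/functions/function/invocations" -d '{}'
```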
    ECR_Image_Build
    Next, run the following commands to tag the image and push it to the repository. Replace <Region-Code> with the code of the region you are using and <Account-Number> with the account number of the AWS account you are using.
    docker_image_tagged
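The tag and push commands from the popup follow this shape; <Account-Number> and <Region-Code> are placeholders for your own values:

```shell
# Tag the local image with the full ECR repository URI...
docker tag samplerepo_piyush:latest <Account-Number>.dkr.ecr.<Region-Code>.amazonaws.com/samplerepo_piyush:latest

# ...and push it to the repository.
docker push <Account-Number>.dkr.ecr.<Region-Code>.amazonaws.com/samplerepo_piyush:latest
```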
    The Docker image has been pushed to the ECR repository. You can verify it by opening the samplerepo_piyush repository in the Amazon ECR console.
    ECR_Image
    The container image is ready. Time to configure Lambda Function using this container image as the runtime.

The container image is ready. You now configure a Lambda function that uses the container image as its runtime.
  1. Go to the AWS Lambda console and click on the Create function button.
    LambdaFunction1
  2. On the next screen, select the Container image option. Type lambdacontainerfunctiondemo as the function name. Use the Browse images button to select the image you uploaded to ECR in the previous step. Select the Create a new role with basic Lambda permissions option for the execution role. Keep the rest of the configuration at the defaults and click on the Create function button.
    lambdafunction_container
  3. The function is created in no time. You can now test the function. Click on the Test button.
  4. On the next popup, type in lambdacontainer_by_piyush for the event name and click on the Create button.
    lambdatestevent
  5. Click on the Test button. The Lambda Function runs successfully. You can see the execution result where the Lambda Function handler returns the message.
    lambdacontainerexecution
  6. You saw how to configure a Lambda function that uses a container image as its runtime. This finishes the practical hands-on part of the article. Go to the next step to clean up the resources so that you don’t incur any cost.
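As an aside, the function can also be invoked from the terminal with the AWS CLI; a sketch, assuming your credentials and region are configured:

```shell
# Invoke the function with an empty event and print the response.
aws lambda invoke \
    --function-name lambdacontainerfunctiondemo \
    --cli-binary-format raw-in-base64-out \
    --payload '{}' \
    response.json
cat response.json
```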

Delete the resources created in this article to avoid any further cost.
  1. Delete lambdacontainerfunctiondemo Lambda Function.
  2. Delete piyushdemoenv Cloud9 Environment.
  3. Delete samplerepo_piyush ECR Repository.
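The cleanup steps above can also be scripted with the AWS CLI; a sketch, where <Environment-Id> is a placeholder you can look up with aws cloud9 list-environments:

```shell
# Delete the Lambda function.
aws lambda delete-function --function-name lambdacontainerfunctiondemo

# Delete the ECR repository along with the images it contains.
aws ecr delete-repository --repository-name samplerepo_piyush --force

# Delete the Cloud9 environment (replace <Environment-Id> with your ID).
aws cloud9 delete-environment --environment-id <Environment-Id>
```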
Hope you enjoyed it.

This concludes the present article. Wishing you all a wonderful day! Stay tuned for more updates. Remember to show your support by liking and sharing this post on your social networks. I'll continue to share insights within this fantastic AWS community platform. Please make sure to follow for future content!
Let's build a community that thrives on the exchange of knowledge, where collaboration fuels innovation. Join me in this endeavor, and together, let's shape the future of technology.