Q-Bits: Setting Up Lambda Functions in CloudFormation with Amazon Q Developer

Discover how Amazon Q Developer can help with generating Lambda function resources in CloudFormation for rapid development and deployment

Harish Vaswani
Amazon Employee
Published Jan 28, 2025
Last Modified Feb 28, 2025
Welcome to another installment of Q-Bits, our regular series showcasing cool ways Amazon employees are leveraging Amazon Q Developer. Today, we're diving into how Q Developer can assist with generating Lambda function resources in AWS CloudFormation.
Writing Infrastructure as Code (IaC) with AWS CloudFormation is something I always do when creating serverless applications. However, writing templates by hand can be tedious and error-prone. I often find myself spending more time fixing things like indentation issues or missing dependent resources. I now use Amazon Q Developer to help me generate a template that I can quickly review and deploy to create the infrastructure I need.
This article focuses on how you can use Amazon Q Developer to generate AWS Lambda function resources in CloudFormation. I will show how Q can be used to:
  • Generate the template
  • Explain the resources
  • Assist with deployment

Generating the template

My application is simple: I need a Lambda function that is triggered by a scheduled event, reads objects from an Amazon S3 bucket, and writes items to an Amazon DynamoDB table.
For example, I can prompt Q by simply saying:
Create a CloudFormation template for a Lambda function. The function needs to be triggered from a scheduled EventBridge event and needs to read objects from an S3 bucket. It also needs permissions to write to a DynamoDB table.
Based on the prompt above, Q generated this template:
AWSTemplateFormatVersion: '2010-09-09'
Description: 'CloudFormation template for Lambda function with EventBridge trigger, S3 read, and DynamoDB write access'

Parameters:
  S3BucketName:
    Type: String
    Description: 'Name of the S3 bucket to read from'

  DynamoDBTableName:
    Type: String
    Description: 'Name of the DynamoDB table to write to'

  ScheduleExpression:
    Type: String
    Description: 'Schedule expression for EventBridge rule (e.g., rate(1 hour) or cron(0 12 * * ? *))'
    Default: 'rate(1 hour)'

Resources:
  LambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
            Action:
              - sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
      Policies:
        - PolicyName: S3ReadAccess
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - s3:GetObject
                  - s3:ListBucket
                Resource:
                  - !Sub 'arn:aws:s3:::${S3BucketName}/*'
                  - !Sub 'arn:aws:s3:::${S3BucketName}'

        - PolicyName: DynamoDBWriteAccess
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - dynamodb:PutItem
                  - dynamodb:UpdateItem
                  - dynamodb:BatchWriteItem
                Resource: !Sub 'arn:aws:dynamodb:${AWS::Region}:${AWS::AccountId}:table/${DynamoDBTableName}'

  ProcessingFunction:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: !Sub '${AWS::StackName}-processor'
      Runtime: python3.11
      Handler: index.handler
      Role: !GetAtt LambdaExecutionRole.Arn
      Code:
        ZipFile: |
          import json
          import boto3
          import os
          from datetime import datetime

          s3_client = boto3.client('s3')
          dynamodb = boto3.resource('dynamodb')

          def handler(event, context):
              try:
                  # Get environment variables
                  bucket_name = os.environ['S3_BUCKET_NAME']
                  table_name = os.environ['DYNAMODB_TABLE_NAME']
                  table = dynamodb.Table(table_name)

                  # List objects in S3 bucket
                  response = s3_client.list_objects_v2(
                      Bucket=bucket_name,
                      MaxKeys=10
                  )

                  processed_files = []

                  # Process each object
                  for obj in response.get('Contents', []):
                      # Get object content
                      file_response = s3_client.get_object(
                          Bucket=bucket_name,
                          Key=obj['Key']
                      )
                      content = file_response['Body'].read().decode('utf-8')

                      # Write to DynamoDB
                      table.put_item(
                          Item={
                              'id': obj['Key'],
                              'content': content,
                              'processed_at': datetime.utcnow().isoformat(),
                              'size': obj['Size']
                          }
                      )

                      processed_files.append(obj['Key'])

                  return {
                      'statusCode': 200,
                      'body': json.dumps({
                          'message': 'Success',
                          'processed_files': processed_files
                      })
                  }

              except Exception as e:
                  print(f"Error: {str(e)}")
                  return {
                      'statusCode': 500,
                      'body': json.dumps({
                          'message': 'Error',
                          'error': str(e)
                      })
                  }
      Description: 'Lambda function to process S3 objects and write to DynamoDB'
      Timeout: 300
      MemorySize: 256
      Environment:
        Variables:
          S3_BUCKET_NAME: !Ref S3BucketName
          DYNAMODB_TABLE_NAME: !Ref DynamoDBTableName
      Tags:
        - Key: Purpose
          Value: DataProcessor

  EventBridgeScheduleRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - scheduler.amazonaws.com
            Action:
              - sts:AssumeRole
      Policies:
        - PolicyName: InvokeLambda
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - lambda:InvokeFunction
                Resource: !GetAtt ProcessingFunction.Arn

  ScheduledRule:
    Type: AWS::Scheduler::Schedule
    Properties:
      Description: "Schedule for invoking the processing Lambda function"
      FlexibleTimeWindow:
        Mode: "OFF"
      ScheduleExpression: !Ref ScheduleExpression
      Target:
        Arn: !GetAtt ProcessingFunction.Arn
        RoleArn: !GetAtt EventBridgeScheduleRole.Arn

Outputs:
  LambdaFunctionArn:
    Description: 'ARN of the Lambda function'
    Value: !GetAtt ProcessingFunction.Arn

  LambdaFunctionName:
    Description: 'Name of the Lambda function'
    Value: !Ref ProcessingFunction

  ScheduleRuleArn:
    Description: 'ARN of the EventBridge Schedule'
    Value: !Ref ScheduledRule
As seen above, Q generated all the resources I need as a starting point. It also generated inline code for the Lambda function to read objects from my S3 bucket and write data to the DynamoDB table. Of course, I may want to customize this and move the code to a separate Python file (see the sketch below), but it still saves me a ton of time compared to writing the template manually.
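For instance, moving the handler code into its own index.py and shipping it as a zipped deployment package would change the Code property to an S3 reference. Here is a minimal sketch of that change; the bucket and key names are placeholders I made up, and everything else on the function stays as generated:

  ProcessingFunction:
    Type: AWS::Lambda::Function
    Properties:
      # ...other properties unchanged from the generated template...
      Handler: index.handler            # handler() now lives in index.py inside the zip
      Code:
        S3Bucket: my-artifacts-bucket   # placeholder artifacts bucket
        S3Key: processor/function.zip   # placeholder key for the zipped code

The aws cloudformation package command can also upload a local file for you and rewrite a local Code path into this S3 reference automatically.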

Explaining the resources and recommendations

As part of its output, Q also explained the generated resources and suggested improvements I could make:
Key components of this template:
  1. Parameters:
    • S3 bucket name
    • DynamoDB table name
    • Schedule expression for EventBridge
  2. IAM Roles:
    • Lambda execution role with permissions for:
      • CloudWatch Logs
      • S3 read access
      • DynamoDB write access
    • EventBridge schedule role for invoking Lambda
  3. Lambda Function:
    • Python 3.11 runtime
    • Reads from S3
    • Writes to DynamoDB
    • Includes error handling
    • Environment variables for configuration
  4. EventBridge Schedule:
    • Configurable schedule expression
    • Triggers Lambda function
    • Uses IAM role for permissions
The Lambda function:
  1. Lists objects in the S3 bucket
  2. Reads each object's content
  3. Writes the content and metadata to DynamoDB
  4. Handles errors and provides detailed logging
Important notes:
  1. Prerequisites (example setup commands follow this list):
    • S3 bucket must exist
    • DynamoDB table must exist
    • Table should have 'id' as the partition key
  2. Schedule Expression Examples:
    • rate(1 hour)
    • rate(5 minutes)
    • cron(0 12 * * ? *) (daily at noon UTC)
  3. Security Considerations:
    • Follows least privilege principle
    • Uses specific resource ARNs
    • Separates roles for Lambda and EventBridge
  4. Monitoring and Debugging:
    • CloudWatch Logs enabled
    • Error handling with detailed messages
    • Output values for resource ARNs
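Since the bucket and table are prerequisites rather than resources in the stack, they have to exist before deployment. A minimal sketch with the AWS CLI, reusing the placeholder names from the deploy command below and an 'id' string partition key to match the Lambda code:

# Create the prerequisite bucket (placeholder name)
aws s3 mb s3://your-bucket-name

# Create the prerequisite table with 'id' as the partition key
aws dynamodb create-table \
  --table-name your-table-name \
  --attribute-definitions AttributeName=id,AttributeType=S \
  --key-schema AttributeName=id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST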
To enhance this template, you could:
  1. Add CloudWatch Alarms
  2. Include DynamoDB table creation
  3. Add S3 bucket creation
  4. Implement dead-letter queues (sketched after this list)
  5. Add custom metrics
  6. Include backup strategies
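As an example of item 4, a dead-letter queue can be wired up by adding an SQS queue and pointing the function's DeadLetterConfig at it. A minimal sketch, with a queue name I chose; note that the execution role would also need sqs:SendMessage on the queue:

  ProcessingDLQ:                      # hypothetical queue name
    Type: AWS::SQS::Queue

  ProcessingFunction:
    Type: AWS::Lambda::Function
    Properties:
      # ...existing properties...
      DeadLetterConfig:               # failed asynchronous invocations land here
        TargetArn: !GetAtt ProcessingDLQ.Arn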
Remember to:
  1. Adjust the timeout and memory based on your needs
  2. Modify the DynamoDB schema as required
  3. Test the schedule expression before deployment
  4. Monitor the function's execution time
  5. Review CloudWatch Logs for errors

Deploying the stack

Finally, Q also generated an AWS CLI command to deploy this stack:
aws cloudformation deploy \
  --template-file template.yaml \
  --stack-name scheduled-processor \
  --parameter-overrides \
    S3BucketName=your-bucket-name \
    DynamoDBTableName=your-table-name \
    ScheduleExpression="rate(1 hour)" \
  --capabilities CAPABILITY_IAM
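After the command completes, I can confirm the stack deployed cleanly and exercise the function once without waiting for the schedule. The function name below follows the template's ${AWS::StackName}-processor naming, so for this stack it is scheduled-processor-processor:

# Confirm the stack reached a COMPLETE state
aws cloudformation describe-stacks \
  --stack-name scheduled-processor \
  --query 'Stacks[0].StackStatus'

# Invoke the function once and inspect its response
aws lambda invoke \
  --function-name scheduled-processor-processor \
  response.json && cat response.json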
Having Amazon Q Developer assist me in generating CloudFormation templates like these for serverless applications has made my day-to-day builder experience much more efficient.

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
