Q-Bits: Setting Up Lambda Functions in CloudFormation with Amazon Q Developer
Discover how Amazon Q Developer can help with generating Lambda function resources in CloudFormation for rapid development and deployment
Welcome to another installment of Q-Bits, our regular series showcasing cool ways Amazon employees are leveraging Amazon Q Developer. Today, we're diving into how Q Developer can assist with generating Lambda function resources in AWS CloudFormation. In this example, we'll use Q Developer to:
- Generate the template
- Explain the resources
- Assist with deployment
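A single natural-language prompt is enough to get started. The exact wording from the original session isn't reproduced here, but a prompt along these lines (illustrative only) works well:

"Create a CloudFormation template for a Python Lambda function that runs on an EventBridge schedule, reads objects from an S3 bucket, and writes items to a DynamoDB table. Include the necessary IAM roles and take the bucket name, table name, and schedule expression as parameters."

Q Developer responds with a template similar to the following: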
AWSTemplateFormatVersion: '2010-09-09'
Description: 'CloudFormation template for Lambda function with EventBridge trigger, S3 read, and DynamoDB write access'

Parameters:
  S3BucketName:
    Type: String
    Description: 'Name of the S3 bucket to read from'
  DynamoDBTableName:
    Type: String
    Description: 'Name of the DynamoDB table to write to'
  ScheduleExpression:
    Type: String
    Description: 'Schedule expression for EventBridge rule (e.g., rate(1 hour) or cron(0 12 * * ? *))'
    Default: 'rate(1 hour)'

Resources:
  LambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
            Action:
              - sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
      Policies:
        - PolicyName: S3ReadAccess
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - s3:GetObject
                  - s3:ListBucket
                Resource:
                  - !Sub 'arn:aws:s3:::${S3BucketName}/*'
                  - !Sub 'arn:aws:s3:::${S3BucketName}'
        - PolicyName: DynamoDBWriteAccess
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - dynamodb:PutItem
                  - dynamodb:UpdateItem
                  - dynamodb:BatchWriteItem
                Resource: !Sub 'arn:aws:dynamodb:${AWS::Region}:${AWS::AccountId}:table/${DynamoDBTableName}'

  ProcessingFunction:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: !Sub '${AWS::StackName}-processor'
      Runtime: python3.11
      Handler: index.handler
      Role: !GetAtt LambdaExecutionRole.Arn
      Code:
        ZipFile: |
          import json
          import boto3
          import os
          from datetime import datetime

          s3_client = boto3.client('s3')
          dynamodb = boto3.resource('dynamodb')

          def handler(event, context):
              try:
                  # Get environment variables
                  bucket_name = os.environ['S3_BUCKET_NAME']
                  table_name = os.environ['DYNAMODB_TABLE_NAME']
                  table = dynamodb.Table(table_name)

                  # List objects in S3 bucket
                  response = s3_client.list_objects_v2(
                      Bucket=bucket_name,
                      MaxKeys=10
                  )

                  processed_files = []

                  # Process each object
                  for obj in response.get('Contents', []):
                      # Get object content
                      file_response = s3_client.get_object(
                          Bucket=bucket_name,
                          Key=obj['Key']
                      )
                      content = file_response['Body'].read().decode('utf-8')

                      # Write to DynamoDB
                      table.put_item(
                          Item={
                              'id': obj['Key'],
                              'content': content,
                              'processed_at': datetime.utcnow().isoformat(),
                              'size': obj['Size']
                          }
                      )
                      processed_files.append(obj['Key'])

                  return {
                      'statusCode': 200,
                      'body': json.dumps({
                          'message': 'Success',
                          'processed_files': processed_files
                      })
                  }
              except Exception as e:
                  print(f"Error: {str(e)}")
                  return {
                      'statusCode': 500,
                      'body': json.dumps({
                          'message': 'Error',
                          'error': str(e)
                      })
                  }
      Description: 'Lambda function to process S3 objects and write to DynamoDB'
      Timeout: 300
      MemorySize: 256
      Environment:
        Variables:
          S3_BUCKET_NAME: !Ref S3BucketName
          DYNAMODB_TABLE_NAME: !Ref DynamoDBTableName
      Tags:
        - Key: Purpose
          Value: DataProcessor

  EventBridgeScheduleRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - scheduler.amazonaws.com
            Action:
              - sts:AssumeRole
      Policies:
        - PolicyName: InvokeLambda
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - lambda:InvokeFunction
                Resource: !GetAtt ProcessingFunction.Arn

  ScheduledRule:
    Type: AWS::Scheduler::Schedule
    Properties:
      Description: "Schedule for invoking the processing Lambda function"
      FlexibleTimeWindow:
        Mode: "OFF"
      ScheduleExpression: !Ref ScheduleExpression
      Target:
        Arn: !GetAtt ProcessingFunction.Arn
        RoleArn: !GetAtt EventBridgeScheduleRole.Arn

Outputs:
  LambdaFunctionArn:
    Description: 'ARN of the Lambda function'
    Value: !GetAtt ProcessingFunction.Arn
  LambdaFunctionName:
    Description: 'Name of the Lambda function'
    Value: !Ref ProcessingFunction
  ScheduleRuleArn:
    Description: 'ARN of the EventBridge Schedule'
    Value: !Ref ScheduledRule
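Before going any further, the generated template can be syntax-checked locally. This is an optional sanity check (not part of the original walkthrough) and assumes the template is saved as template.yaml:

aws cloudformation validate-template \
  --template-body file://template.yaml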
Q Developer can also explain the resources it generated. Here's a summary of what the template creates:

- Parameters:
  - S3 bucket name
  - DynamoDB table name
  - Schedule expression for EventBridge
- IAM Roles:
  - Lambda execution role with permissions for:
    - CloudWatch Logs
    - S3 read access
    - DynamoDB write access
  - EventBridge schedule role for invoking Lambda
- Lambda Function:
  - Python 3.11 runtime
  - Reads from S3
  - Writes to DynamoDB
  - Includes error handling
  - Environment variables for configuration
- EventBridge Schedule:
  - Configurable schedule expression
  - Triggers Lambda function
  - Uses IAM role for permissions

When triggered, the function:
- Lists objects in the S3 bucket
- Reads each object's content
- Writes the content and metadata to DynamoDB
- Handles errors and provides detailed logging
- Prerequisites (see the example commands below):
  - S3 bucket must exist
  - DynamoDB table must exist
  - Table should have 'id' as the partition key
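If these prerequisites don't exist yet, they can be created with the AWS CLI. This is a minimal sketch; the bucket and table names are placeholders and must match the parameters you pass to the stack:

# Placeholder names -- replace with your own
aws s3 mb s3://your-bucket-name

aws dynamodb create-table \
  --table-name your-table-name \
  --attribute-definitions AttributeName=id,AttributeType=S \
  --key-schema AttributeName=id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST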
- Schedule Expression Examples:
  - rate(1 hour)
  - rate(5 minutes)
  - cron(0 12 * * ? *) (daily at noon UTC)
- Security Considerations:
  - Follows least privilege principle
  - Uses specific resource ARNs
  - Separates roles for Lambda and EventBridge
- Monitoring and Debugging (see the log-tailing example below):
  - CloudWatch Logs enabled
  - Error handling with detailed messages
  - Output values for resource ARNs
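For example, once the stack is deployed you can follow the function's logs from the AWS CLI. The names here assume the stack name used in the deployment command below (scheduled-processor); the template names the function <stack-name>-processor:

aws logs tail /aws/lambda/scheduled-processor-processor --follow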
- Possible Enhancements:
  - Add CloudWatch Alarms
  - Include DynamoDB table creation
  - Add S3 bucket creation
  - Implement dead-letter queues
  - Add custom metrics
  - Include backup strategies
- Recommendations:
  - Adjust the timeout and memory based on your needs
  - Modify the DynamoDB schema as required
  - Test the schedule expression before deployment
  - Monitor the function's execution time
  - Review CloudWatch Logs for errors

Finally, Q Developer can assist with deployment. Deploy the template with the AWS CLI:
aws cloudformation deploy \
--template-file template.yaml \
--stack-name scheduled-processor \
--parameter-overrides \
S3BucketName=your-bucket-name \
DynamoDBTableName=your-table-name \
ScheduleExpression="rate(1 hour)" \
--capabilities CAPABILITY_IAM
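After the stack is created, you can confirm the outputs and run the function once manually rather than waiting for the schedule. The stack and function names below assume the deploy command above:

aws cloudformation describe-stacks \
  --stack-name scheduled-processor \
  --query "Stacks[0].Outputs"

aws lambda invoke \
  --function-name scheduled-processor-processor \
  --cli-binary-format raw-in-base64-out \
  --payload '{}' \
  response.json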
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.