
Stop Using Default Arguments in AWS Lambda Functions
Discover why your AWS Lambda costs might be spiralling out of control due to a common Python programming practice.
Stuart Clark
Amazon Employee
Published Feb 18, 2025
Picture this: your AWS bill just doubled, your Lambda functions are timing out, and your data is inexplicably stale. The culprit? A single line of Python code that seemed harmless at first. Default arguments in AWS Lambda functions can turn into an expensive nightmare if not handled correctly. In this blog post, we'll explore the potential pitfalls of using default arguments and how to architect your Lambda functions the right way.
Default arguments in Python can be a double-edged sword, especially in the context of AWS Lambda functions. While they provide a convenient way to set default values, they can also lead to unexpected behavior and performance issues.
Let's take a look at an example:

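A sketch of the problematic pattern follows. The helper names `fetch_user_data` and `process_data` are hypothetical stand-ins for a real database lookup and real processing logic:

```python
# Anti-pattern: mutable default arguments. The dictionaries below are
# created ONCE, when the function is defined, and reused on every call.
def fetch_user_data(user_id):
    # Hypothetical stand-in for a real database lookup.
    return {"id": user_id, "name": f"User {user_id}"}

def process_data(raw, settings):
    # Hypothetical stand-in for real processing; tags the record with a TTL.
    return {**raw, "ttl": settings["ttl"]}

def get_user_data(user_id, user_cache={}, settings={"ttl": 3600, "max_items": 1000}):
    if user_id in user_cache:
        return user_cache[user_id]      # cache hit
    result = process_data(fetch_user_data(user_id), settings)
    user_cache[user_id] = result        # mutates the SHARED default dict
    return result
```

Calling `get_user_data("user1")` looks like a harmless cache miss, but the "cache" actually lives in `get_user_data.__defaults__`, attached to the function object itself, not to any single call.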
In this code, we define `user_cache` as an empty dictionary and `settings` with default TTL and max-items values. The function checks whether the user ID exists in the cache; if not, it fetches and processes the data, stores it in the cache, and returns the result.

While the intention behind this implementation is sound (caching frequently accessed data to reduce database calls and processing time), it has a critical flaw due to Python's treatment of mutable default arguments:
- In Python, default arguments are evaluated once, when the function is defined, not each time it's called.
- Every invocation of the function shares the same dictionary objects for `user_cache` and `settings`.
- This leads to unexpected behavior and memory leaks in the Lambda environment.
The problem with using mutable default arguments in Lambda functions deepens when we consider memory persistence and concurrency.
Let's simulate the Lambda lifecycle with another example:

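A minimal simulation of that lifecycle, in plain Python with no AWS dependencies (the handler signature and event shape are illustrative):

```python
# Simulates a warm Lambda container: the module stays loaded between
# invocations, so a mutable default argument is never reset.
def lambda_handler(event, context=None, user_cache={}):
    user_id = event["user_id"]
    if user_id not in user_cache:
        user_cache[user_id] = {"data": f"payload for {user_id}"}
    # The cache only grows; entries from earlier invocations linger.
    return {"user_id": user_id, "cache_size": len(user_cache)}

# Three invocations landing on the same warm container:
sizes = [lambda_handler({"user_id": uid})["cache_size"]
         for uid in ("user1", "user2", "user3")]
print(sizes)  # [1, 2, 3] — the cache grows and is never released
```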
In this simulation, we see how memory usage grows over time in a Lambda container. During the first invocation, the function creates a cache entry for 'user1'. However, Lambda keeps containers warm for subsequent invocations, and each invocation adds to the same cache dictionary instead of creating a fresh one. The memory footprint grows linearly with each new user, but this memory isn't released between invocations, leading to potential memory leaks.
Additionally, when your function receives multiple concurrent requests, AWS spins up separate containers to handle them. Each container gets its own Python interpreter and, consequently, its own version of the default arguments. This means that different containers might have different data in their caches, leading to inconsistent user experiences.
- Memory usage grows linearly with each new user, but memory isn't released between invocations.
- Concurrent invocations can lead to inconsistent user experiences due to different cache states.
To fix the issues caused by mutable default arguments, we need to ensure that each function invocation gets its own fresh copies of the cache and settings dictionaries. Here's the correct implementation:

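One way to write the fix, using the `None`-sentinel idiom (the fetch-and-process step is inlined as a placeholder for brevity):

```python
def get_user_data(user_id, user_cache=None, settings=None):
    # None is immutable, so it is safe as a default; the real mutable
    # objects are created fresh on every invocation.
    if user_cache is None:
        user_cache = {}
    if settings is None:
        settings = {"ttl": 3600, "max_items": 1000}
    if user_id in user_cache:
        return user_cache[user_id]
    result = {"id": user_id, "ttl": settings["ttl"]}  # stand-in for fetch + process
    user_cache[user_id] = result
    return result

# Each call that omits user_cache now starts with an empty cache,
# and the function's defaults are never mutated:
get_user_data("user1")
assert get_user_data.__defaults__ == (None, None)
```

Note that a caller can still pass an explicit dictionary as `user_cache` when sharing a cache across calls is genuinely intended; the point is that sharing now has to be deliberate rather than accidental.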
In this implementation, we use a common Python idiom to initialize the default arguments properly. Instead of using empty dictionaries as default values, we use `None`. When the function runs, we check whether the arguments are `None` and create new dictionaries if needed. This ensures that each invocation gets its own fresh copies of the cache and settings dictionaries, eliminating the risk of shared state, memory leaks, and inconsistent behavior across concurrent executions.
- Use `None` as the default value for mutable arguments.
- Initialize new dictionaries within the function if the arguments are `None`.
- This ensures each invocation gets its own fresh copies of the cache and settings.
To ensure optimal performance and cost-effectiveness of your Lambda functions, it's crucial to implement proper monitoring. One tool that simplifies this process is Powertools for AWS Lambda (formerly AWS Lambda Powertools), a utility library that helps you implement AWS Lambda best practices.
Here's an example of how to use AWS Lambda Powertools to monitor memory usage:

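A sketch using Powertools for AWS Lambda (Python). This assumes the `aws-lambda-powertools` package is installed and the function runs in a Lambda environment (the Tracer requires X-Ray); the service name, namespace, and metric name are illustrative:

```python
from aws_lambda_powertools import Logger, Metrics, Tracer
from aws_lambda_powertools.metrics import MetricUnit

logger = Logger(service="user-service")
tracer = Tracer(service="user-service")
metrics = Metrics(namespace="UserCacheDemo", service="user-service")

@metrics.log_metrics            # flushes custom metrics to CloudWatch on return
@tracer.capture_lambda_handler  # records an X-Ray trace segment for the handler
def lambda_handler(event, context):
    user_id = event.get("user_id", "unknown")
    metrics.add_metric(name="CacheLookups", unit=MetricUnit.Count, value=1)
    logger.info("Processed request", extra={"user_id": user_id})
    return {"statusCode": 200}
```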
In this code, we use Python decorators to automatically log metrics and trace execution. The `@metrics.log_metrics` decorator ensures that all our custom metrics are sent to CloudWatch, while the `@tracer.capture_lambda_handler` decorator adds distributed tracing capability, helping us understand the end-to-end execution flow of our Lambda function.
- Use Powertools for AWS Lambda to simplify the implementation of AWS Lambda best practices.
- Log metrics and trace execution for better monitoring and debugging.
Default arguments in AWS Lambda functions can be a powerful tool when used correctly, but they can also lead to costly pitfalls if not handled properly. By understanding the potential issues caused by mutable default arguments, such as memory leaks and inconsistent behavior with concurrent executions, you can architect your Lambda functions in a way that ensures optimal performance and cost-effectiveness.
Take action now to review your existing Lambda functions for these patterns and implement proper initialization and monitoring. Embrace AWS Lambda Powertools to simplify the process of implementing best practices and stay on top of your Lambda function's performance and costs.
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.