Implementing Envelope Encryption with Amazon MSK and AWS KMS
Learn how to use AWS KMS to encrypt and decrypt Apache Kafka messages using a concrete example with Java Spring Boot producers and consumers
Camille
Amazon Employee
Published Jun 10, 2025
As a solutions architect working with sensitive data in the cloud, I'm always looking for ways to enhance security while maintaining performance. Amazon Managed Streaming for Apache Kafka (MSK) is a fantastic service for building real-time data streaming applications, but sometimes the default encryption options aren't enough for highly sensitive workloads.
While MSK already provides encryption in-transit and at-rest, certain compliance requirements or security best practices might demand application-level encryption. This is where envelope encryption with AWS Key Management Service (KMS) comes into play.
In this post, I'll walk you through a practical implementation of envelope encryption for Kafka messages using Spring Boot, AWS KMS, and Amazon MSK. I'll share the code I've developed and explain how you can deploy and test this solution yourself.
I purposely don't use the AWS Encryption SDK, which I do recommend for production systems, because it hides some of the inner workings of encryption and decryption and would therefore defeat the purpose of this post!
Envelope encryption is a practice where you generate a data key from a master key, and then encrypt data with that data key. This approach offers several benefits:
- Performance - You don't need to call KMS for every message encryption/decryption
- Security - Your master key never leaves the KMS service
- Key rotation - You can rotate the master key without re-encrypting all your data (I will dive deep into key rotation later in this post)
- Message size - KMS can encrypt at most 4KB of data. Envelope encryption allows you to encrypt arbitrarily large payloads.

Here's the flow I have implemented:
- Producer requests a data key from KMS
- KMS returns both plaintext and encrypted versions of the data key
- Producer encrypts the message with the plaintext data key
- Producer sends the encrypted message and encrypted data key to Kafka
- Consumer receives the message and encrypted data key
- Consumer asks KMS to decrypt the data key
- Consumer decrypts the message with the plaintext data key
You can find the full solution on GitHub, but for now let's have a look at the key components of my implementation.
The project is organized as a monorepo with the following structure:
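The layout looks roughly like this; the directory names here are illustrative, so check the GitHub repo for the exact structure:

```
.
├── infrastructure/   # AWS CDK app: KMS key, MSK cluster, ECS Fargate services, IAM roles
├── producer/         # Spring Boot producer: REST API, KMS envelope encryption, Kafka producer
└── consumer/         # Spring Boot consumer: Kafka listener, KMS decryption, REST API
```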
This structure separates the infrastructure code from the application code, making it easier to manage and deploy each component independently.
Let's take a look at the application code. I will highlight the most important pieces.
The producer's `KmsEncryptionService.java` handles generating data keys from KMS and encrypting messages. Notice how I've implemented data key caching to improve performance. This is a crucial optimization that reduces the number of KMS API calls while maintaining security. Importantly, this caching does not interfere with key rotation, as you'll see later.
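The full class lives in the repo; here is a simplified sketch of the idea, assuming the AWS SDK for Java v2 and AES-GCM for the payload encryption. The class layout, the one-hour cache policy, and the configuration names are illustrative, not the exact repo code:

```java
import software.amazon.awssdk.services.kms.KmsClient;
import software.amazon.awssdk.services.kms.model.DataKeySpec;
import software.amazon.awssdk.services.kms.model.GenerateDataKeyRequest;
import software.amazon.awssdk.services.kms.model.GenerateDataKeyResponse;

import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.security.SecureRandom;
import java.time.Duration;
import java.time.Instant;

// Simplified sketch of a producer-side encryption service (illustrative, not the exact repo code)
public class KmsEncryptionService {

    private final KmsClient kmsClient = KmsClient.create();
    private final String keyId = System.getenv("KMS_KEY_ID"); // illustrative configuration

    // Cached data key: plaintext for local encryption, ciphertext to ship with each message
    private volatile GenerateDataKeyResponse cachedDataKey;
    private volatile Instant cachedAt = Instant.MIN;

    private GenerateDataKeyResponse dataKey() {
        // Refresh the cached data key at most once per hour to limit KMS calls (simplified)
        if (cachedDataKey == null || Duration.between(cachedAt, Instant.now()).toHours() >= 1) {
            cachedDataKey = kmsClient.generateDataKey(GenerateDataKeyRequest.builder()
                    .keyId(keyId)
                    .keySpec(DataKeySpec.AES_256)
                    .build());
            cachedAt = Instant.now();
        }
        return cachedDataKey;
    }

    public EncryptedMessage encrypt(byte[] plaintext) throws Exception {
        GenerateDataKeyResponse key = dataKey();

        // Encrypt locally with AES-GCM using the plaintext data key
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE,
                new SecretKeySpec(key.plaintext().asByteArray(), "AES"),
                new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);

        // The encrypted data key travels with the message; the plaintext key never leaves the JVM
        return new EncryptedMessage(ciphertext, key.ciphertextBlob().asByteArray(), iv);
    }

    public record EncryptedMessage(byte[] ciphertext, byte[] encryptedDataKey, byte[] iv) {}
}
```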
The producer service wraps the encrypted message in an envelope before sending it to Kafka:
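As a sketch rather than the exact repo code (the envelope field names, topic name, and JSON serialization are assumptions on my part), this could look like:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

import java.util.Base64;
import java.util.Map;

// Sketch of a producer service that wraps the ciphertext and encrypted data key in an envelope
@Service
public class MessageProducerService {

    private final KmsEncryptionService encryptionService;
    private final KafkaTemplate<String, String> kafkaTemplate;
    private final ObjectMapper objectMapper = new ObjectMapper();

    public MessageProducerService(KmsEncryptionService encryptionService,
                                  KafkaTemplate<String, String> kafkaTemplate) {
        this.encryptionService = encryptionService;
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(String message) throws Exception {
        var encrypted = encryptionService.encrypt(message.getBytes());

        // The envelope carries everything the consumer needs except the plaintext key
        String envelope = objectMapper.writeValueAsString(Map.of(
                "ciphertext", Base64.getEncoder().encodeToString(encrypted.ciphertext()),
                "encryptedDataKey", Base64.getEncoder().encodeToString(encrypted.encryptedDataKey()),
                "iv", Base64.getEncoder().encodeToString(encrypted.iv())));

        kafkaTemplate.send("encrypted-messages", envelope); // topic name is illustrative
    }
}
```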
Here's a visualization of the flow from the producer perspective:

[Diagram: the producer requests a data key from KMS, encrypts the message locally with the plaintext key, and publishes the envelope (ciphertext plus encrypted data key) to MSK]
On the consumer side, the `KmsEncryptionService.java` handles decrypting the data key and then the message. Again, I've implemented caching on the consumer side to improve performance.
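A simplified sketch of that consumer-side service, assuming the same AWS SDK for Java v2 and AES-GCM conventions as before (the cache keyed by the encrypted data key blob is illustrative):

```java
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.services.kms.KmsClient;
import software.amazon.awssdk.services.kms.model.DecryptRequest;

import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.util.Base64;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Simplified sketch of the consumer-side decryption service (illustrative, not the exact repo code)
public class KmsEncryptionService {

    private final KmsClient kmsClient = KmsClient.create();

    // Cache plaintext data keys by their encrypted form so repeated envelopes skip the KMS call
    private final Map<String, byte[]> dataKeyCache = new ConcurrentHashMap<>();

    public byte[] decrypt(byte[] ciphertext, byte[] encryptedDataKey, byte[] iv) throws Exception {
        String cacheKey = Base64.getEncoder().encodeToString(encryptedDataKey);

        // Ask KMS to decrypt the data key only if we haven't seen this encrypted key before
        byte[] plaintextKey = dataKeyCache.computeIfAbsent(cacheKey, k ->
                kmsClient.decrypt(DecryptRequest.builder()
                                .ciphertextBlob(SdkBytes.fromByteArray(encryptedDataKey))
                                .build())
                        .plaintext()
                        .asByteArray());

        // Decrypt the message locally with AES-GCM
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE,
                new SecretKeySpec(plaintextKey, "AES"),
                new GCMParameterSpec(128, iv));
        return cipher.doFinal(ciphertext);
    }
}
```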
And here's the flow from the consumer perspective:

[Diagram: the consumer reads the envelope from MSK, asks KMS to decrypt the data key, and decrypts the message locally with the plaintext key]
Before I show you how to deploy and test the sample code, I want to bring your attention to the implementation of key rotation.
One of the major benefits of envelope encryption is how it handles key rotation. Let's explore what happens during key rotation and how our implementation makes this process transparent to the application.
When you enable automatic key rotation for a KMS key (as I did in my CDK code with `enableKeyRotation: true`), AWS KMS automatically creates a new backing key every year. This is important to understand:
- The KMS key ID and ARN remain the same
- Only the backing key material changes
- AWS KMS maintains all previous backing key versions
- Each data key is encrypted with a specific backing key version
This means that even after rotation, KMS can still decrypt any data key that was encrypted with a previous backing key version. This is crucial for our envelope encryption pattern.
Since I cache data keys for performance (for up to an hour in my implementation), it's possible that key rotation could occur while keys are cached. This is perfectly fine:
- Existing cached data keys continue to work for encryption/decryption with the old backing key versions
- New data keys will be encrypted with the new backing key version
- Consumers can decrypt data keys encrypted with any backing key version; if a cached plaintext key no longer matches an incoming message, the consumer catches the resulting exception and asks KMS to decrypt that message's data key
This design ensures that key rotation is completely transparent to both producers and consumers, with no downtime or special handling required.
Here's the important code snippet for the consumer, extracted from the `KmsEncryptionService.java` I covered earlier.
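The repo has the exact implementation; the essence, sketched here as a method added to the consumer class above, is a retry path: if a cached plaintext key can no longer authenticate a payload, evict it and let KMS decrypt the data key again. The exception type and method names are illustrative:

```java
// Sketch of the rotation-safe fallback in the consumer (illustrative, builds on the class above)
public byte[] decryptWithFallback(byte[] ciphertext, byte[] encryptedDataKey, byte[] iv) throws Exception {
    try {
        // Happy path: a cached plaintext data key (or a fresh KMS decrypt) handles the message
        return decrypt(ciphertext, encryptedDataKey, iv);
    } catch (javax.crypto.AEADBadTagException e) {
        // A stale cached key can no longer authenticate this payload, e.g. around key rotation.
        // Evict it, let KMS decrypt the data key again, and retry once.
        dataKeyCache.remove(Base64.getEncoder().encodeToString(encryptedDataKey));
        return decrypt(ciphertext, encryptedDataKey, iv);
    }
}
```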
To make it easier to test this sample code, I used AWS CDK to provision all the necessary resources. The key parts of the infrastructure are:
- A KMS key with automatic rotation enabled
- An MSK cluster with IAM authentication
- ECS Fargate services for the producer and consumer applications
- Appropriate IAM permissions for KMS and MSK access
To deploy this solution, first deploy the CDK infrastructure:
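The exact commands are in the repo; roughly, it boils down to something like the following (the directory name is an assumption):

```bash
# From the infrastructure folder of the monorepo (directory name is illustrative)
cd infrastructure
npm install

# Bootstrap the target account/region once, then deploy the stacks
npx cdk bootstrap
npx cdk deploy --all
```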
Once deployed, you can test the solution using the Swagger UI:
- Access the producer service's Swagger UI at http://<producer-service-url>/swagger-ui.html
- Use the /api/messages POST endpoint to send an encrypted message
- Access the consumer service's Swagger UI at http://<consumer-service-url>/swagger-ui.html
- Use the /api/messages GET endpoint to view received and decrypted messages
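If you prefer the command line over Swagger UI, the same endpoints can be exercised with curl. The request body shape shown here is an assumption, so check the Swagger UI for the actual schema:

```bash
# Send a message through the producer (body shape is an assumption; check the Swagger UI)
curl -X POST "http://<producer-service-url>/api/messages" \
     -H "Content-Type: application/json" \
     -d '{"content": "hello, envelope encryption"}'

# Read back the decrypted messages from the consumer
curl "http://<consumer-service-url>/api/messages"
```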
In this post, I've shown you how to implement envelope encryption for Kafka messages using AWS KMS and Spring Boot. This approach provides an additional layer of security for sensitive data while maintaining good performance through data key caching.
The solution is fully automated with AWS CDK, making it easy to deploy and manage. It leverages AWS managed services like MSK and KMS, reducing the operational overhead while providing enterprise-grade security.
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.