Cross account S3 data transfer using EventBridge and Lambda
Transferring data from S3 bucket in one account to another using Lambda with EventBridge scheduled trigger.
Published Jan 18, 2025
Organisations are increasingly adopting multi-account AWS strategies to enhance security, maintain compliance, and improve operational efficiency. One of the most common challenges enterprises face is managing secure data sharing between these accounts, particularly when it comes to Amazon S3 storage. Let's explore some real-world scenarios where cross-account S3 access becomes not just beneficial, but essential for business operations. Imagine a large financial institution where the development team needs to safely access production data for testing new features, while maintaining strict compliance requirements. Or consider a healthcare organization where different departments – from research to patient care – need to access shared resources while keeping sensitive data protected.
In this article, we'll dive deep into implementing secure cross-account S3 access patterns, exploring best practices, and examining how organizations can leverage AWS's robust security features to maintain control while enabling necessary data sharing. We'll look at practical implementations using IAM roles, bucket policies, EventBridge and Lambda.
You will need two AWS accounts already set up to work with (in the rest of the blog, these will be referred to as dev and prod). If you only have a single account, you can set up another one for the purposes of this blog by following my other post: Managing multiple AWS accounts with AWS Organisations.
Switch to the dev account using the role switch option from the top right. Once inside the dev account, navigate to the S3 console and do the following:
- Create an Amazon S3 bucket `source-bucket-demo-ca` with a few dummy txt files to be copied over.
- In the source bucket, add a bucket policy that trusts the cross-account role created in the prod account, allowing list and get actions on the bucket contents (a sample policy is sketched below).
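A minimal sketch of such a bucket policy is shown below. The prod account ID (`222233334444`) and role name (`cross-account-s3-copy-role`) are placeholders; substitute the role you will create in the prod account in the next section.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowProdRoleReadAccess",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::222233334444:role/cross-account-s3-copy-role"
      },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::source-bucket-demo-ca",
        "arn:aws:s3:::source-bucket-demo-ca/*"
      ]
    }
  ]
}
```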

Switch to the prod account. This is the account that will require cross-account access to the dev account to get the objects from the S3 bucket in dev and copy them into an existing bucket.
- In the S3 console, create an empty bucket for the files to be copied into. We will name it `dest-bucket-demo-ca`.
- In the IAM console, create a new role for Lambda to assume, attaching the `AWSLambdaBasicExecutionRole` managed policy and an inline policy for S3 access.

The inline policy will allow the following actions on the S3 buckets in the dev and prod accounts (a sample policy document is sketched after this list):
- s3:GetObject on all objects in the `source-bucket-demo-ca` bucket ARN in the dev account. The ARN will need a `/*` suffix after the bucket name so it applies to all objects in the bucket.
- s3:ListBucket on the `source-bucket-demo-ca` bucket ARN.
- s3:PutObject on the objects in the destination bucket in the prod account. Again, the bucket ARN will need to be followed by `/*`.
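Here is a minimal sketch of that inline policy, using the bucket names from this walkthrough:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadFromDevSourceBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::source-bucket-demo-ca",
        "arn:aws:s3:::source-bucket-demo-ca/*"
      ]
    },
    {
      "Sid": "WriteToProdDestinationBucket",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::dest-bucket-demo-ca/*"
    }
  ]
}
```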
Navigate to the Trust relationships tab and paste the following JSON policy. This will grant the Lambda service principal permission to assume your role.
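The standard trust policy for the Lambda service principal looks like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```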
Lambda
In the Lambda console, we will now create a function to read data from the S3 bucket in the dev account and write the contents to the prod account bucket. Click Create function and enter the following:
- Function name
- Runtime: Python 3.12
- Switch to the Use an existing role option and select the role just created
The remaining options can be left as default. Click Create. Once the function has been created, go to the function configuration and increase the timeout slightly (the default is 3 seconds). Increase this to 15 seconds.

Now, under the code tab, paste the following code snippet, which does the following:
- Creates an S3 resource using the Boto3 session
- Lists all the objects in the non-prod (dev) bucket
- Parses the response to generate a list of object keys
- Iterates through the list of keys and creates a source dictionary named `copy_source` with the source bucket name and the object key which needs to be copied to the other bucket
- Creates a Boto3 resource that represents the target S3 bucket using the `s3.Bucket()` function
- Uses the Boto3 resource `copy()` function to copy the source object to the target
- Returns a `200` code in the response if successful, with a message in the body
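A minimal sketch of such a handler is shown below, assuming the bucket names are supplied through the `SOURCE_BUCKET` and `DEST_BUCKET` environment variables configured in the next step:

```python
import os
import boto3

def lambda_handler(event, context):
    # Bucket names come from the function's environment variables
    source_bucket = os.environ['SOURCE_BUCKET']
    dest_bucket = os.environ['DEST_BUCKET']

    # S3 client and resource created from the default Boto3 session
    session = boto3.Session()
    s3_client = session.client('s3')
    s3_resource = session.resource('s3')

    # List all the objects in the source (dev) bucket
    response = s3_client.list_objects_v2(Bucket=source_bucket)

    # Parse the response to build a list of object keys
    keys = [obj['Key'] for obj in response.get('Contents', [])]

    # Resource representing the target bucket in the prod account
    target_bucket = s3_resource.Bucket(dest_bucket)

    # Copy each object from the source bucket into the target bucket
    for key in keys:
        copy_source = {'Bucket': source_bucket, 'Key': key}
        target_bucket.copy(copy_source, key)

    return {
        'statusCode': 200,
        'body': f'data copy complete: {len(keys)} objects copied'
    }
```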

Once the code is pasted, click Deploy to register the new changes. Now we will need to configure a couple of environment variables for the source and destination bucket names, as we are fetching them from these environment variables in the code. Switch to the Configuration tab and, under Environment variables, add the following:
- SOURCE_BUCKET: `source-bucket-demo-ca`
- DEST_BUCKET: `dest-bucket-demo-ca`
Now navigate to the Test tab. Create a test event and trigger the Lambda function. If successful, you should see a `200` response with a data copy message in the body.
Navigate to the bucket that was created previously in the S3 console. The objects should now be visible in the bucket in the prod account.

The data will be copied on the schedule set up in the EventBridge rule for a daily trigger (see screenshots below), or it can be triggered manually.
In the EventBridge console, a rule was created of type schedule, with a fixed rate of 1 day, to trigger the target Lambda function.
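For reference, the same schedule and target can also be created programmatically with Boto3. The rule name, function name, and ARN below are placeholders for this walkthrough:

```python
import boto3

events = boto3.client('events')
lambda_client = boto3.client('lambda')

# Placeholder name and ARN for the Lambda function created above
function_name = 'cross-account-s3-copy'
function_arn = 'arn:aws:lambda:us-east-1:222233334444:function:cross-account-s3-copy'

# Schedule rule that fires once a day
rule = events.put_rule(
    Name='daily-s3-copy',
    ScheduleExpression='rate(1 day)',
    State='ENABLED',
)

# Point the rule at the Lambda function
events.put_targets(
    Rule='daily-s3-copy',
    Targets=[{'Id': 'copy-lambda', 'Arn': function_arn}],
)

# Allow EventBridge to invoke the function
lambda_client.add_permission(
    FunctionName=function_name,
    StatementId='eventbridge-daily-s3-copy',
    Action='lambda:InvokeFunction',
    Principal='events.amazonaws.com',
    SourceArn=rule['RuleArn'],
)
```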

