Cross-account CloudFormation via CodePipeline

This article shows you how to deploy CloudFormation templates to multiple accounts from a central CodeCommit repo using CodePipeline and cross-account roles.

Published Jun 24, 2024


Generally, it is good practice to have separate accounts for various environments, such as dev, test, and prod. You also want your infrastructure consistent across those environments, with some parameter changes, such as larger instances or multi-AZ for prod. One method to achieve this is a single Git repository with multiple branches and a separate pipeline for each environment.
This article provides a step-by-step guide for implementing this strategy using specific AWS native tools, such as AWS CodeCommit, AWS CodeBuild, and AWS CodePipeline.
I have used this strategy with various customers. It provides confidence that what we deploy and test in non-production environments will be the same in production, and automating the deployment reduces the chance of human error.


The example used in this article has three AWS accounts: a central Shared Services account hosting the CodeCommit Repo and CodePipeline pipelines, a Development account, and a Production account. The various accounts will also have IAM roles and policies.
Pipeline high-level diagram
I have CloudFormation templates in my GitHub to deploy all the resources mentioned below. Please check out my Pipeline-Workshop repository for the details. The sample code assumes all accounts are part of the same AWS organisation. Manual deployment is needed to set up the infrastructure and permissions for our automation.

Initial Configuration

As a start, you'll need a CodeCommit repo. CloudFormation doesn't allow you to define branches, so you must manually create dev and prod branches after the repo is available. I also prefer to set dev as the default branch and delete the main or master branch.
Once you have the repo and branches, you can upload the sample code from my Pipeline-Workshop. I will assume you know how to use git commands. You can also upload the files using the GUI.
The GUI won't allow you to create a new directory. To do that, you must select the option to create a new file.
Select "Create file" from the GUI
Then, enter the path to the new file, including the directory name. You will need to enter the code manually, but in this example, it is just a small JSON file with the configuration info.
Type the file name with the directory name to create
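For reference, the configuration file that CodePipeline's CloudFormation deploy action reads is a small JSON document with a top-level "Parameters" key. A minimal sketch of what such a file could look like (the parameter names and values here are illustrative, not the exact contents of the workshop repo):

```json
{
  "Parameters": {
    "Environment": "dev",
    "InstanceType": "t3.micro"
  }
}
```

The same file format also supports optional "Tags" and "StackPolicy" keys, which is handy if prod needs stricter stack protection than dev.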
For now, only commit the code to the dev branch. We'll leave prod for later.

Central Resources

Now that we have the repo and some code, we can start with the infrastructure to deploy it. The resources we need are:
  • An S3 bucket encrypted with a customer-managed KMS key (CMK).
  • An S3 bucket policy to allow access to other accounts (via the OrgID).
  • An IAM policy to provide CodePipeline with the permissions it needs.
  • An IAM role that CodePipeline will assume.
These are all deployed via a single CloudFormation template. We need a CMK to allow other accounts access to the encryption key. You cannot share Amazon-managed keys outside their account.
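To sketch how the org-wide sharing works, both the bucket policy and the CMK's key policy can grant access to any principal in the organisation using the aws:PrincipalOrgID condition key. The resource names and the OrgId parameter below are illustrative, assumed for this example rather than taken verbatim from the workshop templates:

```yaml
# Bucket policy allowing any account in the organisation to read pipeline
# artifacts. The CMK's key policy carries a similar statement (with
# kms:Decrypt and kms:DescribeKey) so remote accounts can use the key.
ArtifactBucketPolicy:
  Type: AWS::S3::BucketPolicy
  Properties:
    Bucket: !Ref ArtifactBucket
    PolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Sid: AllowOrgAccess
          Effect: Allow
          Principal: "*"
          Action:
            - s3:GetObject
            - s3:GetBucketLocation
            - s3:ListBucket
          Resource:
            - !GetAtt ArtifactBucket.Arn
            - !Sub "${ArtifactBucket.Arn}/*"
          Condition:
            StringEquals:
              aws:PrincipalOrgID: !Ref OrgId
```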

Remote Resources

With the central resources created, the next step is to deploy the remote resources. Every account we want to use, e.g., dev and prod, needs these resources. The CloudFormation template that creates the resources needs information, exported as stack outputs, from the Central Resources stack.
The resources from the Remote IAM template are:
  • An IAM policy to allow cross-account access to the S3 artifact bucket and the permission to run CloudFormation.
  • An IAM policy to enable the use of the KMS CMK.
  • A cross-account role with those two policies attached.
  • An IAM policy to allow CloudFormation to deploy resources.
  • A role that CloudFormation will assume.
The two roles created here will not be used directly by any resources in these accounts. The pipelines in the central account will use these roles.
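The trust policies are the interesting part of these two roles: the cross-account role trusts the central Shared Services account, while the deployment role trusts the CloudFormation service itself. A sketch, with illustrative role and policy names:

```yaml
# Assumed by CodePipeline actions running in the Shared Services account.
CrossAccountRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            AWS: !Sub "arn:aws:iam::${SharedServicesAccountId}:root"
          Action: sts:AssumeRole
    ManagedPolicyArns:
      - !Ref ArtifactAccessPolicy   # S3 artifact bucket + CloudFormation access
      - !Ref KmsUsagePolicy         # use of the central CMK

# Assumed by the CloudFormation service when it creates the stack's resources.
CloudFormationDeployRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: cloudformation.amazonaws.com
          Action: sts:AssumeRole
    ManagedPolicyArns:
      - !Ref DeployPermissionsPolicy
```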


Pipelines

A single pipeline template ensures the various environments are consistent in their basic architecture. Configuration files allow for differences between the environments, e.g., prod may have larger instances or multi-AZ; these get passed as parameters to the CloudFormation templates. You can also use condition statements in the pipeline template to allow for dev, test, and prod variations. The template in my example is simple, but it gives some ideas on what you can do. The Environment parameter selects which branch of the repo to use.
Parameters for the dev pipeline
Parameters for the prod pipeline
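The branch selection can be sketched like this: the Environment parameter is reused as the branch name in the pipeline's CodeCommit source action. The RepoName parameter here is an assumption for illustration:

```yaml
Parameters:
  Environment:
    Type: String
    AllowedValues: [dev, prod]

# Later, in the pipeline's Source stage, the same parameter picks the branch:
  - Name: Source
    ActionTypeId:
      Category: Source
      Owner: AWS
      Provider: CodeCommit
      Version: "1"
    Configuration:
      RepositoryName: !Ref RepoName
      BranchName: !Ref Environment   # dev pipeline tracks dev, prod tracks prod
    OutputArtifacts:
      - Name: SourceOutput
```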
When the dev pipeline is deployed, it runs and deploys the CloudFormation stack in the dev account. That run should be successful. The prod pipeline's first run should fail if you have not merged any files into the prod branch; the failure is intentional.
After you've deployed the prod pipeline and the execution fails, it is time to test the automation. Go into the CodeCommit repo and create a Pull Request from dev to prod. If there are no issues, merge the code. After a few seconds, the merge should trigger the pipeline to execute. All going well, the prod pipeline should now have a successful execution.
CodePipeline screen


Conclusion

This blog demonstrates a straightforward process for creating an automated method of deploying code to remote environments. While the process is relatively simple, the key lies in the cross-account roles and how they are called within the pipeline code. Pay attention to the 'RoleArn' lines in the CodePipeline template.
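To make the point concrete, here is a sketch of a CloudFormation deploy action showing both RoleArn lines. Note that the two serve different purposes: the action-level one is the cross-account role CodePipeline assumes, while the one inside Configuration is the service role CloudFormation assumes. Role names, stack names, and paths below are illustrative assumptions:

```yaml
- Name: Deploy
  ActionTypeId:
    Category: Deploy
    Owner: AWS
    Provider: CloudFormation
    Version: "1"
  # Action-level RoleArn: the cross-account role in the target account
  # that CodePipeline assumes to perform this action.
  RoleArn: !Sub "arn:aws:iam::${TargetAccountId}:role/PipelineCrossAccountRole"
  Configuration:
    ActionMode: CREATE_UPDATE
    StackName: !Sub "workload-${Environment}"
    Capabilities: CAPABILITY_NAMED_IAM
    TemplatePath: SourceOutput::template.yaml
    TemplateConfiguration: !Sub "SourceOutput::config/${Environment}.json"
    # Configuration-level RoleArn: the service role CloudFormation itself
    # assumes in the target account to create the stack's resources.
    RoleArn: !Sub "arn:aws:iam::${TargetAccountId}:role/CloudFormationDeployRole"
  InputArtifacts:
    - Name: SourceOutput
```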
The final architecture will look like this:
Final architecture diagram

Next Steps

Now that you've reviewed this post and checked out the templates in my repo, you have the tools to implement this yourself. So, go out and have a play.
Let me know if you'd like me to cover any other cloud operations topics by dropping a comment.