
Zero to Hero: Automate installation of WordPress CMS with Terraform and Amazon Q
In this blog, we will see how to use prompts in Visual Studio Code to build a Terraform project with Amazon Q. We will also explore how to run the terraform init, terraform plan, terraform apply, and terraform destroy commands. Additionally, we'll demonstrate how to install the WordPress content management system on an EC2 instance with Amazon RDS as the database.
Shashank C
Amazon Employee
Published Mar 23, 2025
Last Modified Apr 8, 2025
Welcome to another installment of Q-Bits, our regular series showcasing cool ways Amazon employees are leveraging Amazon Q Developer. Let's create a WordPress content management system on AWS using Terraform Infrastructure as Code (IaC) and Amazon Q. This guide assumes you have Visual Studio Code installed and have configured Amazon Q through your free AWS Builder ID. To install the Amazon Q Developer extension for your IDE, visit https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/q-in-IDE-setup.html
Open Amazon Q in your preferred IDE to ask your software development questions.

Ask Amazon Q to help you build a three-tier WordPress CMS application using Terraform. Amazon Q will generate the necessary Terraform configuration files and provide step-by-step guidance.
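For example, a prompt along these lines works well (the exact wording is illustrative and can be adapted):

```
Help me build a three-tier WordPress CMS application on AWS using Terraform.
Create a VPC with public and private subnets, an EC2 instance that installs
WordPress via a userdata.sh script, and an Amazon RDS database instance.
Generate providers.tf, variables.tf, main.tf, outputs.tf, and terraform.tfvars.
```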

The following files were generated through additional prompts to Amazon Q, including a customized userdata.sh file. Store all files in a directory named 'deploy-three-tier-webapp-on-aws-using-terraform'; the resulting layout is sketched after the file list.
- providers.tf - Configures AWS as the cloud provider and sets up basic connectivity
- variables.tf - Defines input variables used across the project
- main.tf - Contains the core infrastructure code (VPC, subnets, etc.)
- outputs.tf - Specifies what values should be shown after deployment
- terraform.tfvars - Stores the actual values for variables (should be git-ignored if contains sensitive data)
- .terraform.lock.hcl - Locks provider versions for consistent deployments
- backend.tf - (Optional) Configures where Terraform stores its state file
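With the optional backend.tf and the userdata.sh bootstrap script included, the project layout looks roughly like this:

```
deploy-three-tier-webapp-on-aws-using-terraform/
├── providers.tf
├── variables.tf
├── main.tf
├── outputs.tf
├── terraform.tfvars
├── backend.tf          # optional
├── .terraform.lock.hcl
└── userdata.sh
```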
Think of this configuration file as establishing the connection between Terraform and AWS - similar to how you configure your phone's WiFi settings with a network name and password. When you set up a new Terraform project, this provider configuration is typically one of your first files because it creates the essential link that lets Terraform communicate with AWS.
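A minimal providers.tf along the lines described below might look like this (the profile value shown here is an assumption - use whichever AWS CLI profile you have configured):

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 4.0"
    }
  }
}

provider "aws" {
  region  = var.aws_region # region comes from a variable rather than a hardcoded string
  profile = "default"      # assumed AWS CLI profile name
}
```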

Key points in this configuration:
- The `provider` block specifies which AWS region to use
- The `profile` parameter indicates which AWS credentials to use
- The `required_providers` block ensures you use a compatible AWS provider version
Let's break it down into two main parts:
- The `terraform` block: Its `required_providers` section defines the essential tools for your Terraform configuration - similar to a shopping list for a specific project. Here's what each part specifies:
  - The AWS provider is your primary tool
  - The source `hashicorp/aws` indicates where Terraform downloads the provider
  - The version constraint `~> 4.0` lets you use any 4.x version (for example, 4.1 or 4.2, but not 5.0)
- The `provider` block: This block configures your AWS connection settings - similar to selecting a specific AWS region for your resources. It uses a variable (`var.aws_region`) for the region value instead of a hardcoded string, which provides two key benefits:
  - Flexibility: You can change the region by updating a single variable
  - Reusability: You can use the same configuration across different environments
The variables.tf file defines Terraform variables that can be used throughout the configuration. Variables make the code more flexible and reusable by allowing users to specify values dynamically rather than hardcoding them. The key variables are:
- The AWS Region variable defines where resources will be deployed; if no value is supplied, Terraform automatically uses the default of us-west-2.
- The VPC CIDR variable defines the IP address range for the Virtual Private Cloud (VPC), with a default CIDR block of 10.0.0.0/16, which provides up to 65,536 IP addresses.
- The database instance class variable specifies the Amazon RDS instance class, with a default of db.t3.micro, which is a cost-effective instance type.
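Here is a minimal sketch of what these variable definitions can look like. Only aws_region is referenced elsewhere in this walkthrough; the names vpc_cidr and db_instance_class (and the descriptions) are illustrative assumptions:

```hcl
variable "aws_region" {
  description = "AWS Region to deploy resources into"
  type        = string
  default     = "us-west-2"
}

variable "vpc_cidr" {
  description = "CIDR block for the VPC"
  type        = string
  default     = "10.0.0.0/16"
}

variable "db_instance_class" {
  description = "RDS instance class for the WordPress database"
  type        = string
  default     = "db.t3.micro"
}
```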

Here are the key commands and their explanations. We are going to run `terraform init`, followed by `terraform plan` and `terraform apply`, in sequence.
`terraform init`
- This is always the first command you run in a new or existing Terraform working directory
- Initializes the working directory
- Downloads required providers and modules
- Sets up the backend for storing state files
`terraform plan`
- Shows what changes Terraform will make to your infrastructure
- Creates an execution plan without actually applying changes
- Good practice to review changes before applying them
- Optional: Use `terraform plan -out=tfplan` to save the plan to a file (use a name like tfplan rather than a .tf extension, since Terraform would otherwise try to parse the saved plan file as configuration)
`terraform apply`
- Applies the changes to reach the desired state
- Creates, updates, or deletes infrastructure resources
- Will show a plan and ask for confirmation before making changes
- To automatically approve: `terraform apply -auto-approve`
- To apply a saved plan: `terraform apply tfplan`
`terraform destroy`
- Destroys all resources managed by your Terraform configuration
- Use with caution as this will delete resources
- Will ask for confirmation before destroying. To automatically approve: `terraform destroy -auto-approve`
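Putting it together, a typical run from the project directory looks like this (saving the plan with -out is optional):

```bash
cd deploy-three-tier-webapp-on-aws-using-terraform

terraform init               # download the AWS provider and initialize the backend
terraform plan -out=tfplan   # preview the changes and save the execution plan
terraform apply tfplan       # create the VPC, EC2 instance, and RDS database
```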
Watch the video demonstration of executing Terraform commands and installing WordPress on AWS.
Thank you for completing this tutorial. After finishing the demonstration, you can remove all created resources to avoid unnecessary costs. Important: this command permanently deletes the resources defined in your Terraform configuration, so make sure you're in the correct working directory and environment before proceeding. To decommission the resources, run:
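```bash
# prompts for confirmation; add -auto-approve to skip the prompt
terraform destroy
```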
Before we conclude, you can access all the files on GitHub here. Please ensure you run these Terraform files in your development or staging environment, follow good coding practices, and perform a security review. Happy building your WordPress CMS on AWS with Terraform and Amazon Q! If this blog helped you leverage Amazon Q, please hit like and feel free to let us know in the comments. Thank you!
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.