Amazon Route 53: Backup Zone Data

Amazon Route 53 users should regularly back up their zone data to protect against accidental deletion. This article explains how to automate daily backups of your Route 53 zones using native AWS services.

Tracy Honeycutt
Amazon Employee
Published Oct 30, 2024

Introduction

AWS customers using Amazon Route 53 should back up their zones and resource records to protect against accidental deletion. This allows you to restore data if needed. This article explains how to create an automated backup of your Route 53 data to an S3 bucket.

Prerequisites

You need familiarity with Amazon Route 53, AWS Identity and Access Management (IAM), AWS Lambda, Amazon EventBridge, and Amazon S3. You need a secure S3 bucket to store backups. Consider adding a bucket lifecycle policy to automatically manage backup retention.
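For example, the following boto3 sketch applies a lifecycle rule that expires backup objects after 90 days. The bucket name and retention period are placeholders; adjust them to your own retention requirements.

import boto3

s3 = boto3.client('s3')

# Expire every object in the backup bucket after 90 days (adjust to your retention needs)
s3.put_bucket_lifecycle_configuration(
    Bucket='BUCKET_NAME',  # replace with your backup bucket name
    LifecycleConfiguration={
        'Rules': [
            {
                'ID': 'ExpireRoute53Backups',
                'Filter': {'Prefix': ''},   # apply the rule to all objects in the bucket
                'Status': 'Enabled',
                'Expiration': {'Days': 90}
            }
        ]
    }
)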

Overview

The backup is performed by an AWS Lambda function that runs a Python script. You create an IAM role that the function assumes to read from Route 53 and write to the S3 bucket, and you schedule the function with Amazon EventBridge. The function exports your Route 53 data as CSV and JSON files and writes the output to your S3 bucket.

Implementation Steps

  1. Create an IAM service role and policy for the Lambda function with permissions to read from Route 53 and write to the specified S3 bucket.
  2. Create a Lambda function to read Route 53 configurations and write both CSV and JSON files to the S3 bucket.
  3. Create an EventBridge schedule to run the Lambda function daily.

Create the IAM Role and Policy

The Lambda function needs permission to read from Route 53 and write to the specified S3 bucket, so create a custom IAM role with those permissions.
  1. Create an IAM policy named Route53_Backup_Policy using the following JSON:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Route53ReadAccess",
      "Effect": "Allow",
      "Action": [
        "route53:Get*",
        "route53:List*"
      ],
      "Resource": "*"
    },
    {
      "Sid": "S3FullAccess",
      "Effect": "Allow",
      "Action": [
        "s3:*"
      ],
      "Resource": [
        "arn:aws:s3:::BUCKET_NAME",
        "arn:aws:s3:::BUCKET_NAME/*"
      ]
    }
  ]
}
  • Replace BUCKET_NAME with your S3 bucket name.
2. Create an IAM role named Route53_Backup_Role with a trust policy that allows Lambda to assume the role, and attach the Route53_Backup_Policy to it. A scripted alternative is sketched below.
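If you prefer to script this step, the following boto3 sketch creates the policy and role and attaches them. It assumes the policy JSON above is saved locally as route53_backup_policy.json (a hypothetical file name) and that your credentials can create IAM resources. It also attaches the AWSLambdaBasicExecutionRole managed policy so the function can write logs to CloudWatch Logs.

import json
import boto3

iam = boto3.client('iam')

# Trust policy that lets the Lambda service assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

# Create the customer-managed policy from the JSON shown above
with open('route53_backup_policy.json') as f:  # hypothetical local copy of the policy JSON
    policy_document = f.read()

policy = iam.create_policy(
    PolicyName='Route53_Backup_Policy',
    PolicyDocument=policy_document
)

# Create the execution role and attach both policies
iam.create_role(
    RoleName='Route53_Backup_Role',
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)
iam.attach_role_policy(
    RoleName='Route53_Backup_Role',
    PolicyArn=policy['Policy']['Arn']
)
iam.attach_role_policy(
    RoleName='Route53_Backup_Role',
    PolicyArn='arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole'
)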

Create the Lambda Function

Create a Lambda function to read Route 53 configurations and write both CSV and JSON files to the S3 bucket.
  1. Open the Lambda console and choose "Create function".
  2. Select "Author from scratch" and enter these details:
    • Function name: Route53_Backup
    • Runtime: Python 3.12
    • Architecture: x86_64
    • Execution role: Expand "Change default execution role", choose "Use an existing role", and select Route53_Backup_Role
  3. Replace the default code with the following Python script:
import os
import csv
import json
import time
from datetime import datetime
import boto3
from botocore.exceptions import ClientError

# Configuration constants
BUCKET_NAME = 'BUCKET_NAME'  # Replace with your actual S3 bucket name
BUCKET_REGION = 'us-east-1'  # AWS region where the S3 bucket is located

# Initialize AWS service clients
s3_client = boto3.client('s3', region_name=BUCKET_REGION)
r53_client = boto3.client('route53')


def upload_file(dir_path, file_path, bkt_name, obj_key):
    """
    Upload a file to a specific folder in an Amazon S3 bucket.

    Args:
        dir_path (str): The directory path in the S3 bucket
        file_path (str): The local path of the file to upload
        bkt_name (str): The name of the S3 bucket
        obj_key (str): The object key (file name) in S3
    """
    full_key = f"{dir_path}/{obj_key}"
    s3_client.upload_file(file_path, bkt_name, full_key)


def get_hosted_zones(next_marker=None):
    """
    Retrieve all hosted zones from Amazon Route 53, handling pagination.

    Args:
        next_marker (tuple): Marker for pagination (DNSName, HostedZoneId)

    Returns:
        list: All hosted zones in Route 53
    """
    params = {'DNSName': next_marker[0], 'HostedZoneId': next_marker[1]} if next_marker else {}
    response = r53_client.list_hosted_zones_by_name(**params)
    zones = response['HostedZones']
    if response['IsTruncated']:
        # Recursively get the next batch of zones if the list is truncated
        zones.extend(get_hosted_zones((response['NextDNSName'], response['NextHostedZoneId'])))
    return zones


def get_zone_info(zone_id):
    """
    Retrieve information about a specific hosted zone from Amazon Route 53.

    Args:
        zone_id (str): The ID of the hosted zone

    Returns:
        dict: Information about the hosted zone
    """
    info = r53_client.get_hosted_zone(Id=zone_id)
    return info


def get_zone_records(zone_id, next_record=None):
    """
    Retrieve all DNS records for a specific hosted zone, handling pagination.

    Args:
        zone_id (str): The ID of the hosted zone
        next_record (tuple): Marker for pagination (RecordName, RecordType)

    Returns:
        list: All DNS records in the specified hosted zone
    """
    params = {'HostedZoneId': zone_id}
    if next_record:
        params.update({'StartRecordName': next_record[0], 'StartRecordType': next_record[1]})
    response = r53_client.list_resource_record_sets(**params)
    records = response['ResourceRecordSets']
    if response['IsTruncated']:
        # Recursively get the next batch of records if the list is truncated
        records.extend(get_zone_records(zone_id, (response['NextRecordName'], response['NextRecordType'])))
    return records


def get_record_value(record):
    """
    Extract the value(s) from a DNS record, handling both standard and alias records.

    Args:
        record (dict): A DNS record from Route 53

    Returns:
        list: The value(s) of the DNS record
    """
    if 'AliasTarget' in record:
        # Handle alias records
        return [f"ALIAS:{record['AliasTarget']['HostedZoneId']}:{record['AliasTarget']['DNSName']}"]
    # Handle standard records
    return [v['Value'] for v in record.get('ResourceRecords', [])]


def safe_get(dict_obj, key, default=''):
    """
    Safely retrieve a value from a dictionary, providing a default if the key doesn't exist.

    Args:
        dict_obj (dict): The dictionary to search
        key (str): The key to look for
        default (any): The default value to return if the key is not found (default is '')

    Returns:
        any: The value associated with the key, or the default value
    """
    if isinstance(dict_obj, dict):
        return dict_obj.get(key, default)
    return default


def write_zone_info(zone, zone_info):
    """
    Write zone information to a temporary file.

    Args:
        zone (dict): The hosted zone the information belongs to
        zone_info (dict): The zone information to write

    Returns:
        str: The path of the written file
    """
    file_path = os.path.join('/tmp', f"{zone['Name']}info.json")
    with open(file_path, 'w') as f:
        json.dump(zone_info, f, indent=2)
    return file_path


def write_zone_to_csv(zone, records):
    """
    Write the DNS records of a hosted zone to a CSV file.

    Args:
        zone (dict): The hosted zone information
        records (list): The DNS records of the zone

    Returns:
        str: The path of the created CSV file
    """
    file_name = f"/tmp/{zone['Name']}csv"
    with open(file_name, 'w', newline='') as csv_file:
        writer = csv.writer(csv_file)
        writer.writerow(['NAME', 'TYPE', 'VALUE', 'TTL', 'REGION', 'WEIGHT', 'SETID', 'FAILOVER', 'EVALUATE_HEALTH'])
        for record in records:
            base_row = [
                record['Name'],
                record['Type'],
                '',
                safe_get(record, 'TTL'),
                safe_get(record, 'Region'),
                safe_get(record, 'Weight'),
                safe_get(record, 'SetIdentifier'),
                safe_get(record, 'Failover'),
                safe_get(safe_get(record, 'AliasTarget'), 'EvaluateTargetHealth')
            ]
            # Write one row per record value, filling in the empty VALUE column
            for value in get_record_value(record):
                writer.writerow(base_row[:2] + [value] + base_row[3:])
    return file_name


def write_zone_to_json(zone, records):
    """
    Write the DNS records of a hosted zone to a JSON file.

    Args:
        zone (dict): The hosted zone information
        records (list): The DNS records of the zone

    Returns:
        str: The path of the created JSON file
    """
    file_name = f"/tmp/{zone['Name']}json"
    with open(file_name, 'w') as json_file:
        json.dump(records, json_file, indent=4)
    return file_name


def lambda_handler(event, context):
    """
    Main handler function for the AWS Lambda.
    This function orchestrates the backup process for all Route 53 hosted zones.

    Args:
        event (dict): The event dict that triggers the Lambda function
        context (object): Provides methods and properties about the invocation, function, and execution environment

    Returns:
        bool: True if the backup process completed successfully
    """
    timestamp = datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ")

    for zone in get_hosted_zones():
        # Extract zone ID from the full ID string (removing '/hostedzone/')
        zone_id = zone['Id'].split('/')[-1]
        # Create a unique folder name for each zone backup
        zone_folder = f"{timestamp}/{zone['Name'][:-1]}_{zone_id}"
        zone_records = get_zone_records(zone['Id'])
        zone_info = get_zone_info(zone['Id'])

        # Export the zone's records as CSV and JSON and upload both files
        for file_type in ['csv', 'json']:
            # Choose the appropriate function based on the file type
            file_func = write_zone_to_csv if file_type == 'csv' else write_zone_to_json
            file_path = file_func(zone, zone_records)
            obj_key = f"{zone['Name']}{file_type}"
            upload_file(zone_folder, file_path, BUCKET_NAME, obj_key)

        # Export the zone configuration itself and upload it
        file_path = write_zone_info(zone, zone_info)
        obj_key = f"zone_info_{zone['Name']}json"
        upload_file(zone_folder, file_path, BUCKET_NAME, obj_key)

    return True


if __name__ == "__main__":
    lambda_handler(None, None)
  • Replace BUCKET_NAME and BUCKET_REGION with your S3 bucket details.
4. Deploy the function.
5. Set the function timeout to 60 seconds in the Configuration tab. You can then verify the function with a one-off invocation, as sketched below.
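Before creating the schedule, it can help to confirm the function works end to end. A minimal boto3 sketch, assuming the function name above and credentials in the same account (the bucket name is a placeholder):

import boto3

lambda_client = boto3.client('lambda')

# Invoke the backup function synchronously and print its return value
response = lambda_client.invoke(
    FunctionName='Route53_Backup',
    InvocationType='RequestResponse'
)
print(response['Payload'].read().decode())  # expected output: "true"

# List the bucket to confirm the backup objects were written
s3 = boto3.client('s3')
objects = s3.list_objects_v2(Bucket='BUCKET_NAME')  # replace with your bucket name
for obj in objects.get('Contents', []):
    print(obj['Key'])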

Create the EventBridge Schedule

Create an EventBridge schedule to run the Lambda function daily. You can configure it in the console as follows, or script it as sketched after these steps.
  1. Open the EventBridge Scheduler console.
  2. Choose "Create schedule" and enter these details:
    • Schedule name: Route53_Backup_Lambda
    • Schedule type: Rate-based schedule
    • Rate expression: 1 days
    • Flexible time window: Off
  3. Select "AWS Lambda Invoke" as the target and choose the Route53_Backup function.
  4. Enable the schedule and create a new execution role.
  5. Review and create the schedule.
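As an alternative to the console, the schedule can be created with the EventBridge Scheduler API. This sketch assumes you already have a scheduler execution role; the function ARN, account ID, and role ARN are placeholders.

import boto3

scheduler = boto3.client('scheduler')

# Create a daily schedule that invokes the backup function.
# The RoleArn must be a role that EventBridge Scheduler can assume and that has
# lambda:InvokeFunction permission on the target function.
scheduler.create_schedule(
    Name='Route53_Backup_Lambda',
    ScheduleExpression='rate(1 day)',
    FlexibleTimeWindow={'Mode': 'OFF'},
    Target={
        'Arn': 'arn:aws:lambda:us-east-1:111122223333:function:Route53_Backup',  # replace with your function ARN
        'RoleArn': 'arn:aws:iam::111122223333:role/SCHEDULER_ROLE_NAME'  # replace with the scheduler execution role ARN
    }
)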

Output

The Lambda function writes backup files to your S3 bucket in this format:
[S3 BUCKET NAME]/[TIMESTAMP]/[ZONE NAME]_[ZONE ID]/
├── zone_info_[ZONE NAME].json
├── [ZONE NAME].json
└── [ZONE NAME].csv
Output files:
  • zone_info_[ZONE NAME].json: Zone configuration information
  • [ZONE NAME].json: All resource records for the zone, in the format returned by list_resource_record_sets (usable for a restore, as sketched below)
  • [ZONE NAME].csv: Resource records in a readable CSV format
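Because the per-zone JSON file contains the records exactly as list_resource_record_sets returned them, it can also serve as input for a restore. The sketch below is one possible approach, not part of the backup itself: it replays a downloaded JSON backup into a hosted zone with UPSERT changes, skipping the apex NS and SOA records that Route 53 manages for the zone. The zone ID, zone name, file path, and batch size are placeholders; alias records that point to resources that no longer exist will be rejected.

import json
import boto3

r53 = boto3.client('route53')

HOSTED_ZONE_ID = 'Z0123456789EXAMPLE'   # replace with the target hosted zone ID
ZONE_NAME = 'example.com.'              # the zone apex, with trailing dot, as it appears in the backup
BACKUP_FILE = 'example.com.json'        # a JSON file downloaded from the backup bucket

with open(BACKUP_FILE) as f:
    records = json.load(f)

# Skip the apex NS and SOA records, which the target hosted zone already has
changes = [
    {'Action': 'UPSERT', 'ResourceRecordSet': record}
    for record in records
    if not (record['Type'] in ('NS', 'SOA') and record['Name'] == ZONE_NAME)
]

# Route 53 limits the size of a single change batch, so submit the changes in smaller batches
for i in range(0, len(changes), 500):
    r53.change_resource_record_sets(
        HostedZoneId=HOSTED_ZONE_ID,
        ChangeBatch={'Changes': changes[i:i + 500]}
    )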

Conclusion

With this setup, you'll have daily backups of your Route 53 zones to protect against accidental data loss.
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
