
Migrate SQL Server to Aurora PostgreSQL with Amazon Q

Discover how Amazon Q Developer CLI simplifies SQL Server to Aurora PostgreSQL migration through conversation, reducing complexity and technical effort.

Gonzalo Vásquez
Amazon Employee
Published May 29, 2025

Introduction

Database migration is a critical undertaking for organizations looking to modernize their data infrastructure, reduce costs, or leverage cloud-native capabilities. In this post, I'll share my experience using Amazon Q Developer CLI to streamline the migration of a SQL Server database to Amazon Aurora PostgreSQL.

The Migration Scenario

My scenario involved migrating a retail database from Microsoft SQL Server to Amazon Aurora PostgreSQL. The database consisted of 20 tables with complex relationships, approximately 15,000 rows of data, and 20 stored procedures that performed various business operations and analytics.

Leveraging Amazon Q Developer CLI for Database Migration

Amazon Q Developer CLI's chat feature served as my primary interface for orchestrating the entire migration process. With just a series of conversational prompts, I was able to implement a comprehensive migration strategy that would typically require multiple specialized tools and significant manual effort.

My Instructions and Amazon Q's Interpretation

I provided a series of high-level instructions through chat, which Amazon Q interpreted and executed:
1. Initial Request: "Create RDS SQL Server Express instance and an Aurora PostgreSQL cluster with Babelfish support"
Amazon Q helped me:
• Create an Amazon RDS SQL Server Express instance with appropriate configuration
• Create an Amazon Aurora PostgreSQL cluster with Babelfish capability enabled
• Configure security groups to allow access
• Store credentials securely in AWS Secrets Manager
2. Database Schema Creation: "Create 20 tables on SQL Server instance, consider primary keys, foreign keys between tables and indexes, populate tables with 500-1000 rows each"
Amazon Q assisted with:
• Designing a comprehensive retail database schema with proper relationships
• Creating 20 tables with appropriate primary and foreign keys
• Implementing suitable indexing strategy
• Generating realistic test data (500-1000 rows per table)
3. Stored Procedure Development: "Create 20 stored procedures in SQL Server DB that interact with the previously created tables"
Amazon Q helped me develop:
• Stored procedures for various business operations
• Procedures for data analysis, reporting, and data manipulation
• Proper error handling and transaction management
• Best practice implementations
4. Testing Request: "Test exhaustively the SPs created, to validate they are working properly"
Amazon Q created:
• Comprehensive test cases for each stored procedure
• Validation for different input parameters
• Tests for error handling scenarios
• Data integrity verification
5. Migration Request: "Use AWS schema conversion tool to migrate tables, indexes SPs from SQL Server to the PostgreSQL instance"
Amazon Q guided me through:
• Preparing AWS SCT configuration for the migration
• Converting SQL Server schema to PostgreSQL format
• Transforming stored procedures to PostgreSQL functions
• Handling naming conventions and data type differences
6. Data Migration and Validation: "Proceed with the migration, execute it, but not the data part (DMS) yet" followed by "Proceed with next steps 1 & 2, do not consider cutover"
Amazon Q helped me:
• Execute schema migration but defer data migration
• Set up AWS DMS for data migration
• Perform data migration
• Validate the migrated data and functionality
• Exclude the final cutover step

How Amazon Q Developer CLI Streamlined the Migration Process

Phase 1: Infrastructure Setup

With Amazon Q Developer CLI, I was able to:
• Create an Amazon RDS SQL Server Express instance (db.t3.small)
• Create an Amazon Aurora PostgreSQL cluster with Babelfish support (db.t4g.medium)
• Configure security groups to allow access from my IP address
• Store credentials securely in AWS Secrets Manager
• Get connection details for both instances
This eliminated the need to navigate the AWS Management Console or write complex AWS CLI commands.
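To make this concrete, here is a minimal boto3 sketch of the kind of provisioning this step involves. The identifiers, engine versions, and parameter group family below are placeholders I've chosen for illustration; they are not the exact commands Amazon Q executed on my behalf.

```python
# Illustrative sketch only: provision the source RDS SQL Server Express instance
# and an Aurora PostgreSQL target with Babelfish enabled. Names, versions, and
# the parameter group family are assumptions, not Amazon Q's actual output.
import boto3

rds = boto3.client("rds")
secrets = boto3.client("secretsmanager")

# Source: RDS SQL Server Express (db.t3.small).
rds.create_db_instance(
    DBInstanceIdentifier="retail-sqlserver",          # assumed identifier
    DBInstanceClass="db.t3.small",
    Engine="sqlserver-ex",
    AllocatedStorage=20,
    MasterUsername="admin",
    MasterUserPassword="********",                     # keep real secrets in Secrets Manager
)

# Target: Babelfish is enabled through a cluster parameter group, not a cluster flag.
rds.create_db_cluster_parameter_group(
    DBClusterParameterGroupName="babelfish-pg15",      # assumed name
    DBParameterGroupFamily="aurora-postgresql15",      # assumed family; must match EngineVersion
    Description="Aurora PostgreSQL cluster parameters with Babelfish enabled",
)
rds.modify_db_cluster_parameter_group(
    DBClusterParameterGroupName="babelfish-pg15",
    Parameters=[{
        "ParameterName": "rds.babelfish_status",
        "ParameterValue": "on",
        "ApplyMethod": "pending-reboot",
    }],
)
rds.create_db_cluster(
    DBClusterIdentifier="retail-aurora-pg",            # assumed identifier
    Engine="aurora-postgresql",
    EngineVersion="15.4",                              # assumed; must match the parameter group family
    MasterUsername="postgres",
    MasterUserPassword="********",
    DBClusterParameterGroupName="babelfish-pg15",
)
rds.create_db_instance(
    DBInstanceIdentifier="retail-aurora-pg-writer",
    DBClusterIdentifier="retail-aurora-pg",
    DBInstanceClass="db.t4g.medium",
    Engine="aurora-postgresql",
)

# Keep credentials out of scripts: store them in AWS Secrets Manager.
secrets.create_secret(
    Name="retail-migration/aurora-pg",                 # assumed secret name
    SecretString='{"username": "postgres", "password": "********"}',
)
```

The detail worth remembering here is that Babelfish is switched on through the cluster parameter group (rds.babelfish_status = on) rather than a setting on the cluster itself.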

Phase 2: Database Schema Design and Creation

Amazon Q helped me:
• Design a comprehensive retail database schema with 20 tables
• Create appropriate primary keys, foreign keys, and indexes
• Generate SQL scripts for table creation
• Execute the scripts on the SQL Server instance
• Populate tables with realistic test data
The entire database structure was created through a single conversation, without requiring me to write any SQL code.
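To give a flavor of what that looked like, here is a simplified sketch of two of the tables and their relationship, executed against the source instance with pyodbc. The table and column names, endpoint, and driver string are illustrative assumptions on my part, not the exact schema Amazon Q generated.

```python
# Illustrative sketch of the style of DDL generated and executed on the source
# SQL Server instance: a parent table, a child table with a foreign key, a
# supporting index, and simple synthetic data.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=retail-sqlserver.xxxxxxxx.us-east-1.rds.amazonaws.com,1433;"  # placeholder endpoint
    "DATABASE=RetailDB;UID=admin;PWD=********"
)
cur = conn.cursor()

cur.execute("""
CREATE TABLE dbo.Customers (
    CustomerID   INT IDENTITY(1,1) PRIMARY KEY,
    CustomerName NVARCHAR(100) NOT NULL,
    Email        NVARCHAR(255) UNIQUE,
    CreatedAt    DATETIME2 DEFAULT SYSUTCDATETIME()
);
""")
cur.execute("""
CREATE TABLE dbo.Orders (
    OrderID     INT IDENTITY(1,1) PRIMARY KEY,
    CustomerID  INT NOT NULL REFERENCES dbo.Customers(CustomerID),
    OrderDate   DATETIME2 NOT NULL,
    TotalAmount DECIMAL(12,2) NOT NULL
);
""")
cur.execute("CREATE INDEX IX_Orders_CustomerID ON dbo.Orders(CustomerID);")

# Populate with synthetic rows (Amazon Q generated more realistic data).
for i in range(750):
    cur.execute(
        "INSERT INTO dbo.Customers (CustomerName, Email) VALUES (?, ?);",
        f"Customer {i}", f"customer{i}@example.com",
    )
conn.commit()
```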

Phase 3: Stored Procedure Development

Through conversational prompts, Amazon Q helped me:
• Develop 20 stored procedures covering various business functions
• Implement proper error handling and transaction management
• Create procedures for customer analysis, inventory management, sales reporting, and order processing
• Generate and execute SQL scripts to create all procedures
This eliminated the need for manual T-SQL coding and testing.
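Here is a representative example, in the same style, of a procedure with transaction management and TRY/CATCH error handling. The procedure name and body are my own simplified illustration rather than one of the twenty procedures Amazon Q actually wrote.

```python
# Illustrative sketch: one order-creation procedure with a transaction and
# TRY/CATCH error handling, created on the source SQL Server instance.
import pyodbc

conn = pyodbc.connect("DSN=retail-sqlserver;UID=admin;PWD=********")  # placeholder DSN
conn.autocommit = True  # run the CREATE PROCEDURE batch on its own

conn.execute("""
CREATE OR ALTER PROCEDURE dbo.usp_CreateOrder
    @CustomerID INT,
    @TotalAmount DECIMAL(12,2)
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        IF NOT EXISTS (SELECT 1 FROM dbo.Customers WHERE CustomerID = @CustomerID)
            THROW 50001, 'Unknown customer', 1;

        INSERT INTO dbo.Orders (CustomerID, OrderDate, TotalAmount)
        VALUES (@CustomerID, SYSUTCDATETIME(), @TotalAmount);

        COMMIT TRANSACTION;
        SELECT SCOPE_IDENTITY() AS NewOrderID;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;  -- surface the original error to the caller
    END CATCH
END;
""")
```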

Phase 4: Comprehensive Testing

Amazon Q Developer CLI helped me:
• Create a testing framework with a test log table
• Develop test cases for each stored procedure
• Execute tests with various parameters
• Validate results and data integrity
• Generate a detailed test report
The testing phase, which would typically require significant manual effort, was automated through simple conversation.
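The pattern looked roughly like the sketch below: a test log table plus a small driver that calls each procedure with several inputs and records whether the outcome matched expectations. The test cases, table, and names are illustrative assumptions, not the generated test suite.

```python
# Illustrative sketch of the test harness pattern: log each stored procedure
# call and whether it behaved as expected (success for valid input, an error
# for invalid input).
import pyodbc

conn = pyodbc.connect("DSN=retail-sqlserver;UID=admin;PWD=********")  # placeholder DSN
cur = conn.cursor()

cur.execute("""
IF OBJECT_ID('dbo.TestLog') IS NULL
CREATE TABLE dbo.TestLog (
    TestID   INT IDENTITY(1,1) PRIMARY KEY,
    TestName NVARCHAR(200),
    Passed   BIT,
    Message  NVARCHAR(4000),
    RunAt    DATETIME2 DEFAULT SYSUTCDATETIME()
);
""")
conn.commit()

test_cases = [
    ("usp_CreateOrder: valid order",
     "EXEC dbo.usp_CreateOrder @CustomerID = 1, @TotalAmount = 99.90", True),
    ("usp_CreateOrder: unknown customer",
     "EXEC dbo.usp_CreateOrder @CustomerID = -1, @TotalAmount = 10.00", False),
]

for name, sql, should_succeed in test_cases:
    try:
        cur.execute(sql)
        passed, message = should_succeed, "completed"
    except pyodbc.Error as exc:
        passed, message = (not should_succeed), str(exc)[:4000]
    cur.execute(
        "INSERT INTO dbo.TestLog (TestName, Passed, Message) VALUES (?, ?, ?);",
        name, passed, message,
    )
    conn.commit()
```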

Phase 5: Migration Planning and Execution

Amazon Q streamlined the migration by helping me:
• Create AWS SCT project configuration files
• Generate PostgreSQL schema scripts
• Convert stored procedures to PostgreSQL functions
• Handle naming conventions and data type differences (see the sketch after this list)
• Create a comprehensive migration guide
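To illustrate the naming and data type point, here is a tiny, simplified sketch of the kind of mapping the conversion step has to apply. AWS SCT automates this; the dictionary below is my own illustrative subset, not SCT's actual rule set.

```python
# Simplified, illustrative subset of SQL Server -> PostgreSQL adjustments.
TYPE_MAP = {
    "NVARCHAR": "VARCHAR",          # PostgreSQL text types are already Unicode-capable
    "DATETIME": "TIMESTAMP",
    "DATETIME2": "TIMESTAMP",
    "BIT": "BOOLEAN",
    "MONEY": "NUMERIC(19,4)",
    "UNIQUEIDENTIFIER": "UUID",
    "TINYINT": "SMALLINT",
}

def convert_column(sql_server_type: str) -> str:
    """Map a SQL Server column type to a PostgreSQL equivalent, keeping any length/precision."""
    base, paren, suffix = sql_server_type.upper().partition("(")
    return TYPE_MAP.get(base, base) + (paren + suffix if suffix else "")

def convert_identifier(name: str) -> str:
    """SQL Server identifiers are case-insensitive; unquoted PostgreSQL identifiers
    fold to lower case, so normalize proactively."""
    return name.replace("dbo.", "").lower()

print(convert_column("NVARCHAR(100)"))   # VARCHAR(100)
print(convert_column("DATETIME2"))       # TIMESTAMP
print(convert_identifier("dbo.Orders"))  # orders
```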

Phase 6: Data Migration and Validation

With minimal prompting, Amazon Q helped me:
• Set up AWS DMS configuration
• Create source and target endpoints
• Configure and execute the migration task
• Validate row counts and data integrity (see the validation sketch after this list)
• Test PostgreSQL functions with migrated data
• Generate a detailed migration report
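The row count validation followed a pattern like the sketch below: count rows per table on the SQL Server source (via pyodbc) and on the Aurora PostgreSQL target (via psycopg2) and flag any mismatch. Connection details, schema names, and the table list are placeholders.

```python
# Illustrative post-migration check: compare per-table row counts between the
# source SQL Server database and the Aurora PostgreSQL target.
import pyodbc
import psycopg2

TABLES = ["customers", "orders", "products"]  # in practice, all 20 tables

src = pyodbc.connect("DSN=retail-sqlserver;UID=admin;PWD=********")            # placeholder
tgt = psycopg2.connect(
    host="retail-aurora-pg.cluster-xxxx.us-east-1.rds.amazonaws.com",          # placeholder
    dbname="retaildb", user="postgres", password="********",
)

for table in TABLES:
    src_count = src.cursor().execute(f"SELECT COUNT(*) FROM dbo.{table};").fetchone()[0]
    with tgt.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM public.{table};")
        tgt_count = cur.fetchone()[0]
    status = "OK" if src_count == tgt_count else "MISMATCH"
    print(f"{table:<12} source={src_count:>6} target={tgt_count:>6} {status}")
```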

Alternative Approach: Using Amazon Aurora PostgreSQL with Babelfish

Amazon Q also provided an alternative migration approach using Amazon Aurora PostgreSQL with Babelfish, explaining:
• How Babelfish reduces schema conversion requirements
• The benefits for stored procedure compatibility
• How it minimizes application changes (see the connection sketch after this list)
• The potential use as an intermediate step in a longer migration journey
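What makes this approach appealing is that Babelfish speaks the TDS wire protocol, so existing SQL Server client code can connect to the Aurora cluster on port 1433 and keep sending T-SQL. A minimal sketch, with a placeholder endpoint, database name, and credentials:

```python
# Illustrative sketch: the same Aurora PostgreSQL cluster accepts T-SQL over
# TDS (port 1433 by default) when Babelfish is enabled, so SQL Server client
# code needs little or no change.
import pyodbc

babelfish = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=retail-aurora-pg.cluster-xxxx.us-east-1.rds.amazonaws.com,1433;"  # placeholder endpoint
    "DATABASE=retaildb;UID=postgres;PWD=********"
)

# T-SQL syntax, executed unmodified against Aurora PostgreSQL via Babelfish.
row = babelfish.execute("SELECT TOP 5 CustomerID, CustomerName FROM dbo.Customers;").fetchone()
print(row)
```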

The Power of Amazon Q Developer CLI in Database Migration

The entire migration process—from infrastructure setup to schema conversion, data migration, and validation—was orchestrated through simple conversational prompts with Amazon Q Developer CLI. This demonstrates the transformative potential of AI-assisted database migration:
1. Reduced Complexity: Complex migration tasks were simplified to conversational instructions.
2. Accelerated Timeline: What would typically take days or weeks of work was accomplished in a single conversation session.
3. Comprehensive Approach: Amazon Q handled all aspects of the migration, from infrastructure setup to validation.
4. Reduced Expertise Requirements: I didn't need deep expertise in AWS services, SQL Server, or PostgreSQL to execute a successful migration.
5. Documentation Generation: Amazon Q automatically generated comprehensive documentation of the migration process.

Best Practices for Using Amazon Q Developer CLI for Database Migration

Based on my experience, I recommend the following best practices:
1. Start with Clear Objectives: Clearly articulate your migration goals to Amazon Q.
2. Provide Context: Give Amazon Q information about your database size, complexity, and specific requirements.
3. Review Generated Scripts: While Amazon Q is highly capable, always review generated scripts before execution.
4. Iterative Approach: Break complex migrations into smaller steps, reviewing results at each stage.
5. Leverage Alternative Suggestions: Consider alternative approaches suggested by Amazon Q, such as using Babelfish.

Conclusion

Amazon Q Developer CLI's chat feature represents a paradigm shift in how database migrations can be executed. By transforming complex technical tasks into simple conversations, it democratizes database migration, making it accessible to a broader range of users while reducing time, cost, and risk.
Whether you're migrating directly to Amazon Aurora PostgreSQL or leveraging Amazon Aurora PostgreSQL with Babelfish as an intermediate step, Amazon Q Developer CLI provides an unprecedented level of assistance throughout the journey.
It handles everything from infrastructure setup to schema conversion, data migration, and validation—all through simple conversational prompts.
This approach not only streamlines the migration process but also ensures comprehensive documentation and testing, setting the stage for a successful transition to Amazon Aurora PostgreSQL's powerful, cost-effective database platform.

A Note on This Post

In the spirit of sharing my experience, I should mention that Amazon Q also helped me draft this post. After completing the migration, I asked Amazon Q to help me document the process, and it assisted in organizing my thoughts and experiences into this structured format. This further demonstrates how Amazon Q can help not only with technical implementations but also with knowledge sharing.
Have you used Amazon Q Developer CLI for database migrations or other AWS tasks? I'd love to hear about your experiences in the comments!
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
