AI-Powered Database Intelligence: Remote MCP Server for Amazon Q CLI


Authors: Ramesh Kumar Venkatraman and Godwin Sahayaraj Vincent | AWS Solutions Architects

Published May 6, 2025
In today's data-driven world, database performance can make or break application user experience. The recent addition of Model Context Protocol (MCP) support to Amazon Q Developer CLI opens exciting new possibilities for database administrators and developers. While the initial announcement showcased local MCP servers using stdio transport, this blog post explores the next frontier: deploying remote MCP servers with Server-Sent Events (SSE) transport to provide teams with centralized, secure access to database performance tools.
This post explores how deploying remote PostgreSQL performance analyzer MCP servers enables teams to:
  • Centralize database optimization tools across your organization
  • Leverage natural language interactions for complex performance tuning
  • Secure sensitive database access through proper authentication
The Power of Remote MCP Servers for Database Optimization
The Model Context Protocol (MCP) standardizes how applications can integrate with large language models (LLMs), providing a common way to share context and access data sources. Amazon Q Developer CLI now supports this protocol, allowing you to extend its capabilities with custom tools.
While local MCP servers (using stdio transport) are great for individual development, remote MCP servers (in this case, using SSE transport) unlock team-wide benefits for database performance optimization:
  • Centralized Performance Monitoring: A single, well-maintained MCP server can analyze database performance for your entire organization
  • Consistent Analysis Tools: Everyone accesses the same version with identical capabilities
  • AI-Powered Recommendations: Leverage Amazon Q's intelligence to suggest optimizations
Architecture Diagram for Remote MCP Server with Amazon Q Developer CLI
High-level architecture diagram

Data Flow:

  1. User issues a natural language query in Amazon Q Developer CLI
  2. CLI identifies the need for database tools and routes the request to the remote MCP server
  3. Request goes through the Application Load Balancer to an available MCP server instance
  4. MCP server retrieves database credentials from AWS Secrets Manager
  5. MCP server connects to the PostgreSQL database and executes the necessary queries
  6. MCP server extracts structured data (metrics, execution plans, schema information)
  7. The structured data is processed by the MCP server's code to identify common patterns
  8. The MCP server formats this structured data in a way that leverages the LLM's knowledge
  9. Response is sent back through the same path to the Amazon Q Developer CLI
  10. The LLM in Amazon Q analyzes the structured data using its knowledge of database optimization
  11. The LLM generates natural language recommendations and explanations based on the data
  12. CLI presents the final analysis and recommendations to the user

Important Security Note

Throughout this blog post, we use HTTP connections for demonstration purposes to simplify the setup. This is not recommended for production environments. When deploying your remote MCP server in a production environment:
  • Always use HTTPS: Configure your Application Load Balancer with a valid SSL/TLS certificate
  • Remove the --allow-http flag: Update your mcp.json configuration to use HTTPS and remove this flag
  • Implement proper authentication: Add an authentication layer using Amazon Cognito, API Gateway authorizers, or similar solutions
  • Network isolation: Deploy your MCP server in a private subnet with restricted access
The examples in this blog demonstrate the art of the possible, but proper security controls must be implemented before using this solution with sensitive database information.

The Role of mcp-remote in Remote MCP Server Architecture

In our architecture, the mcp-remote npm package plays a crucial bridging role between Amazon Q Developer CLI and our remote MCP server.

What is mcp-remote?

mcp-remote is an npm package that serves as a protocol adapter, enabling Amazon Q Developer CLI to communicate with remote MCP servers over HTTP/SSE transport. While Amazon Q natively supports local MCP servers using stdio transport, mcp-remote extends this capability to work with remote servers.

How mcp-remote Works

When you configure Amazon Q Developer CLI to use a remote MCP server, the following process occurs:
  • Amazon Q CLI launches npx mcp-remote as specified in your mcp.json configuration
  • mcp-remote establishes an SSE (Server-Sent Events) connection to your remote MCP server
  • It translates between the stdio transport used locally by Amazon Q CLI and the SSE/HTTP transport exposed by the remote server
  • This bidirectional translation happens transparently, allowing seamless communication

Configuration in mcp.json

The mcp.json configuration specifies how to connect to your remote MCP server:
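As a sketch, a minimal mcp.json entry using the parameters described below might look like this (the server name postgres-analyzer is an illustrative placeholder; substitute your own ALB DNS name):

```json
{
  "mcpServers": {
    "postgres-analyzer": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://<Your-ALB-DNS-Name>/sse",
        "--allow-http",
        "--transport",
        "sse"
      ]
    }
  }
}
```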
Key parameters:
  • http://<Your-ALB-DNS-Name>/sse: The endpoint URL of your remote MCP server's SSE interface
  • --allow-http: Permits non-HTTPS connections (for development; use HTTPS in production)
  • --transport sse: Specifies the Server-Sent Events transport protocol
The mcp-remote package is automatically installed when needed through the npx command specified in the mcp.json configuration. There's no need for manual installation, making it easy for team members to connect to your remote MCP server.
Use Case: Database Performance Optimization
Imagine you're responsible for a critical PostgreSQL database that powers your organization's applications. As usage grows, certain queries begin to slow down, affecting user experience. Your team needs to quickly identify problematic queries, understand execution bottlenecks, and implement optimizations.
This is where a remote MCP server connected to Amazon Q Developer CLI becomes invaluable. By providing Amazon Q with direct access to your database's performance metrics, you enable:
  • Natural language queries to identify slow-running SQL
  • Automated analysis of query execution plans
  • Intelligent index recommendations
  • Query rewriting suggestions
Let's build this solution together.

Setting Up Your Environment

Before we begin, ensure you have:
  • Amazon Q Developer CLI installed (version 1.9.0 or later)
  • Authentication set up with your AWS Builder ID
  • Access to AWS ECS and Aurora PostgreSQL
  • AWS Secrets Manager configured with your database credentials
To verify your Amazon Q CLI version:

```shell
q --version
```
Building the PostgreSQL Performance Analyzer MCP Server
Our MCP server will provide several capabilities essential for database performance optimization:
  • Identifying slow queries
  • Analyzing query execution plans
  • Recommending indexes
  • Suggesting query rewrites
  • Analyzing database structure
Hybrid Approach: Combining Code Structure with Model Intelligence
The key to our implementation is the hybrid approach we take for all database analysis functions. Rather than trying to code every possible optimization rule (which would be impractical) or relying entirely on the model's knowledge (which might lack structure), we combine the best of both worlds:
  • We use structured code to extract metrics, execution plans, and database statistics
  • We leverage the model's extensive knowledge of SQL optimization for analysis and recommendations
This approach gives us reliable data extraction with the flexibility and depth of AI-powered recommendations.
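As a minimal sketch of this hybrid pattern (the function name, row shape, and sample values here are illustrative assumptions, not the actual server implementation), the server code extracts raw metrics, say from pg_stat_statements, and formats them into a structured summary that the model can reason over:

```python
# Hypothetical sketch: code handles deterministic extraction and
# structuring; the LLM (via Amazon Q) supplies the optimization analysis.
# Row keys mirror pg_stat_statements columns, but are illustrative.

def format_slow_queries(rows):
    """Format raw slow-query metrics into structured text for the model.

    Each row is a dict with keys: query, calls, mean_exec_time (ms),
    and rows (average rows returned per call).
    """
    lines = ["Top slow queries (from pg_stat_statements):"]
    ranked = sorted(rows, key=lambda r: r["mean_exec_time"], reverse=True)
    for i, row in enumerate(ranked, 1):
        lines.append(
            f"{i}. mean={row['mean_exec_time']:.1f} ms, "
            f"calls={row['calls']}, rows={row['rows']}\n"
            f"   SQL: {row['query']}"
        )
    return "\n".join(lines)

# Sample data standing in for a live pg_stat_statements query
sample = [
    {"query": "SELECT * FROM orders WHERE status = 'open'",
     "calls": 1200, "mean_exec_time": 85.4, "rows": 300},
    {"query": "SELECT id FROM users WHERE email = $1",
     "calls": 50000, "mean_exec_time": 0.3, "rows": 1},
]
print(format_slow_queries(sample))
```

The structured text is what gets returned to Amazon Q, which then applies its own knowledge of PostgreSQL tuning to produce recommendations.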

Deploying PostgreSQL MCP Server in Amazon ECS for High Availability

Remote MCP servers should be deployed in a way that ensures high availability, security, and scalability. Amazon ECS provides an ideal platform for this purpose, offering automated container orchestration, built-in load balancing, and seamless scaling capabilities.

Prerequisites

  • Create an Amazon ECR repository for Docker images
  • Build the Docker image from the cloned repository
  • Push the Docker image to the ECR repository
  • Configure the ECS task execution role and task role (with Secrets Manager and ECR access)

Step 1: Clone the Repository
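The repository URL below is a placeholder; substitute the repository referenced for this post:

```shell
# Clone the MCP server source (placeholder URL)
git clone https://github.com/<your-org>/postgres-mcp-server.git
cd postgres-mcp-server
```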

Step 2: Build and Push Docker Image
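A sketch of the standard ECR build-and-push flow (the region and repository name are assumptions; adjust for your environment):

```shell
# Variables -- adjust region and repository name for your environment
AWS_REGION=us-east-1
ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
ECR_REPO=postgres-mcp-server
ECR_URI=${ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com/${ECR_REPO}

# Authenticate Docker to your ECR registry
aws ecr get-login-password --region ${AWS_REGION} | \
  docker login --username AWS --password-stdin \
  ${ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com

# Build, tag, and push the image
docker build -t ${ECR_REPO}:latest .
docker tag ${ECR_REPO}:latest ${ECR_URI}:latest
docker push ${ECR_URI}:latest
```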

Step 3: Create ECS Task Definition

Create a task definition that specifies:
  • Container image from ECR
  • Port mappings (exposing port 8000)
  • CPU and memory allocations
  • Logging configuration
  • IAM roles for task execution and runtime
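A sketch of such a task definition for Fargate (account ID, region, role names, and CPU/memory sizes are placeholders, not values from this post):

```json
{
  "family": "postgres-mcp-server",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "512",
  "memory": "1024",
  "executionRoleArn": "arn:aws:iam::<account-id>:role/<task-execution-role>",
  "taskRoleArn": "arn:aws:iam::<account-id>:role/<task-role>",
  "containerDefinitions": [
    {
      "name": "postgres-mcp-server",
      "image": "<account-id>.dkr.ecr.<region>.amazonaws.com/postgres-mcp-server:latest",
      "portMappings": [{ "containerPort": 8000, "protocol": "tcp" }],
      "logConfiguration": {
        "logDriver": "awslogs",
        "options": {
          "awslogs-group": "/ecs/postgres-mcp-server",
          "awslogs-region": "<region>",
          "awslogs-stream-prefix": "mcp"
        }
      }
    }
  ]
}
```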

Step 4: Set Up ECS Service with Load Balancing

  • Create an Application Load Balancer with:
    • Listener on port 80 (HTTP)
    • Target group with health check path /health on port 8000
    • Security groups allowing inbound traffic on port 80
  • Create an ECS service that:
    • Uses the task definition created earlier
    • Runs in a private subnet with appropriate security groups
    • Integrates with the Application Load Balancer
    • Configures desired task count for high availability (minimum 2)
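The steps above can be sketched with the AWS CLI (subnet, security group, VPC, cluster, and ARN values are placeholders):

```shell
# Application Load Balancer with an HTTP listener on port 80
aws elbv2 create-load-balancer --name mcp-alb \
  --subnets <subnet-1> <subnet-2> --security-groups <alb-sg>

# Target group health-checking /health on port 8000
aws elbv2 create-target-group --name mcp-targets \
  --protocol HTTP --port 8000 --vpc-id <vpc-id> \
  --target-type ip --health-check-path /health

aws elbv2 create-listener --load-balancer-arn <alb-arn> \
  --protocol HTTP --port 80 \
  --default-actions Type=forward,TargetGroupArn=<tg-arn>

# ECS service with two tasks in private subnets, behind the ALB
aws ecs create-service --cluster <cluster> \
  --service-name postgres-mcp-server \
  --task-definition postgres-mcp-server \
  --desired-count 2 --launch-type FARGATE \
  --network-configuration "awsvpcConfiguration={subnets=[<subnet-1>,<subnet-2>],securityGroups=[<service-sg>]}" \
  --load-balancers "targetGroupArn=<tg-arn>,containerName=postgres-mcp-server,containerPort=8000"
```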

Step 5: Configure Amazon Q Developer CLI to Use the Remote MCP Server

Once your ECS service is running, configure Amazon Q Developer CLI to use your remote MCP server:
  • Open the MCP configuration file:
  • Update the file with your remote MCP server information:
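A sketch of both steps, assuming the global configuration file at ~/.aws/amazonq/mcp.json (a workspace-level .amazonq/mcp.json also works; the server name postgres-analyzer is an illustrative placeholder):

```shell
# Create (or overwrite) the global MCP configuration file
mkdir -p ~/.aws/amazonq
cat > ~/.aws/amazonq/mcp.json <<'EOF'
{
  "mcpServers": {
    "postgres-analyzer": {
      "command": "npx",
      "args": ["mcp-remote", "http://<Your-ALB-DNS-Name>/sse", "--allow-http", "--transport", "sse"]
    }
  }
}
EOF
```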
Start using Amazon Q CLI with your remote MCP server:
Enter the following command in your terminal:

```shell
q chat
```

Amazon Q Developer CLI

Within the chat session, enter /tools to list all the tools from the PostgreSQL MCP server.

Amazon Q Developer CLI with remote MCP server tools

Now we can interact with our database performance tools directly from the Amazon Q CLI:
Real-World Database Optimization Scenarios
Let's explore how our remote MCP server enhances database optimization workflows:

Database Structure Analysis

Before diving into specific query optimizations, it's often helpful to understand the overall database structure:
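An example prompt (illustrative; exact phrasing and tool names depend on your server implementation):

```
> Analyze the structure of my database and summarize the largest tables,
  their indexes, and any obvious schema issues.
```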

Identifying and Analyzing Slow Queries

When application performance degrades, the first step is often identifying slow database queries:
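An example prompt (again illustrative, not from the actual session):

```
> Show me the top 10 slowest queries in my database and explain why
  they are slow.
```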

Structured Data Extraction

The code handles the technical aspects of:
  • Executing database queries to gather metrics and statistics
  • Parsing execution plans and query structures
  • Extracting database schema information
  • Detecting common patterns and anti-patterns

AI-Powered Analysis

Once we have the structured data, we leverage the model's extensive knowledge of:
  • SQL optimization techniques
  • Database performance patterns
  • Index selection strategies
  • Schema design principles
  • Query rewriting approaches

Database Structure Context

A key enhancement in our approach is the addition of database structure analysis. Before recommending any optimizations, we:
  1. Analyze the overall database schema
  2. Examine table statistics and relationships
  3. Review existing indexes and their usage patterns
  4. Consider data volumes and access patterns

Benefits of This Approach

  • Reliability: We get consistent extraction of metrics and database information
  • Flexibility: The model can apply its broad knowledge to generate recommendations
  • Maintainability: We only need to code the data extraction, not all possible optimization rules
  • Quality: Recommendations are more nuanced and context-aware than purely code-based ones
  • Contextual: Optimizations consider the entire database structure, not just isolated queries

Security Considerations for Remote MCP Servers

When deploying a remote MCP server, security is paramount. Here are key considerations:

Network Security

  • VPC Configuration: Deploy your MCP server in a private subnet with controlled access
  • Security Groups: Restrict inbound traffic to only necessary ports and sources
  • WAF Integration: Consider adding AWS WAF to protect against common web exploits

Authentication and Authorization

  • API Gateway Integration: Use Amazon API Gateway with custom authorizers
  • IAM Roles: Implement fine-grained access control with IAM roles
  • Token-based Authentication: Require authentication tokens for MCP server access

Encryption and Data Protection

  • TLS Encryption: Ensure all communications use TLS 1.2 or later
  • Secrets Manager: Store all credentials in AWS Secrets Manager
  • Data Classification: Implement controls based on data sensitivity

Troubleshooting Remote MCP Server Deployment

Common Issues and Solutions

  • Connection Refused Errors
    • Verify security group settings allow traffic on port 8000
    • Check that the ECS service is running and healthy
    • Ensure your mcp.json configuration has the correct URL
  • Authentication Failures
    • Verify AWS credentials are properly configured
    • Check IAM permissions for accessing Secrets Manager
  • Slow Response Times
    • Consider scaling up the ECS task CPU/memory allocation
    • Optimize database connection pooling parameters
    • Add caching for frequently accessed database metadata
  • Tool Execution Failures
    • Check CloudWatch logs for detailed error messages
    • Verify database credentials are correctly stored in Secrets Manager
    • Ensure the MCP server has network access to your database

Future Enhancements

As the Model Context Protocol and Amazon Q Developer CLI continue to evolve, consider these future enhancements:
  • Multi-Database Support: Extend your MCP server to support multiple database types beyond PostgreSQL
  • Custom Authentication: Implement organization-specific authentication mechanisms
  • Integration with Existing Tools: Connect with existing monitoring solutions like CloudWatch or Datadog

Conclusion

The addition of MCP server support to Amazon Q Developer CLI represents a significant advancement for database administrators and developers. By deploying a PostgreSQL Performance Analyzer MCP server in ECS with SSE transport, you can provide your entire organization with secure, centralized access to powerful database optimization tools.
Our hybrid approach to database analysis—combining structured code with AI intelligence—demonstrates the true power of this integration. By extracting reliable metrics and database structure information, then leveraging the model's extensive knowledge, we create a solution that's both robust and insightful.
This approach offers numerous benefits:
  • Faster Performance Tuning: Quickly identify and fix slow queries using natural language
  • Enhanced Collaboration: Everyone works with the same tools and insights
  • Operational Efficiency: Automate common database optimization tasks
  • Context-Aware Recommendations: Optimizations consider your specific database structure
As you implement this solution, remember that the true power lies in customizing the MCP server to your specific database needs. Whether you're troubleshooting slow queries, optimizing indexes, or refining your schema, a well-designed MCP server can become an invaluable extension to your Amazon Q Developer CLI.
We encourage you to explore the possibilities of remote MCP servers and share your experiences with the community. The combination of Amazon Q's intelligence with your database performance data creates a powerful synergy that can transform how your teams optimize database performance.

About the Authors

Godwin Sahayaraj Vincent is an Enterprise Solutions Architect at AWS who is passionate about Machine Learning and providing guidance to customers to design, deploy and manage their AWS workloads and architectures. In his spare time, he loves to play cricket with his friends and tennis with his three kids.
Ramesh Kumar Venkatraman is a Senior Solutions Architect at AWS who is passionate about Generative AI, Containers and Databases. He works with AWS customers to design, deploy and manage their AWS workloads and architectures. In his spare time, he loves to play with his two kids and follows cricket.

Ready to get started with Amazon Q Developer CLI and remote MCP servers? Install version 1.9.0 or later today and transform how your teams optimize database performance!
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
