From Protocol to Product: Building MCP Clients with Amazon Bedrock Converse API

Authors: Godwin Sahayaraj Vincent and Ramesh Kumar Venkatraman | AWS Solutions Architects

Published May 14, 2025
As artificial intelligence continues to advance, developers increasingly require adaptable methods to interact with large language models (LLMs) while retaining control over their applications. The Model Context Protocol (MCP) addresses this need by offering an open, standardized protocol for connecting AI models to external tools, data sources, and services. When paired with the robust foundation models available through Amazon Bedrock, MCP enables a powerful and unified platform for building intelligent applications that can access and utilize diverse resources efficiently.
In our previous blog post, we discussed how deploying remote PostgreSQL performance analyzer MCP servers enables teams to:
  • Centralize database optimization tools across your organization
  • Leverage natural language interactions for complex performance tuning
  • Identify slow-running SQL with natural language queries
  • Automate analysis of query execution plans
  • Generate intelligent index recommendations
  • Receive query rewriting suggestions
While the Amazon Q CLI provides an excellent interface for interacting with these MCP servers, many organizations need custom client applications that can be tailored to specific workflows, integrated into existing systems, or deployed as standalone services. This is where a custom MCP client comes in.
In this blog post, we'll demonstrate how to develop custom client applications, both command-line interfaces (CLIs) and web-based user interfaces, using the MCP client SDK alongside Amazon Bedrock's tool integration capabilities. This approach streamlines the process of connecting your applications to a wide array of data sources and services, reducing the complexity of custom integrations and accelerating the development of context-aware AI solutions.

Introducing the Custom MCP Client using Amazon Bedrock Converse API

The Custom MCP Client using Amazon Bedrock is a Python-based implementation that combines Amazon Bedrock's foundation models with MCP servers. It provides both CLI and web interfaces for interactive use, as well as a programmable API for integration into custom applications. Leveraging the Converse API, this solution offers a consistent, unified way to interact with various Bedrock models, eliminating the need to manage model-specific differences. The Converse API streamlines multi-turn conversations, enables tool use (function calling), and reduces code complexity by allowing developers to write code once and use it seamlessly across supported models. This simplifies integration, accelerates development, and enhances flexibility for building advanced conversational AI solutions.
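To make this concrete, here is a minimal sketch of a Converse API call with a tool definition attached. The model ID, region, and tool schema below are illustrative placeholders, not values from the project:

```python
import boto3

# Bedrock Runtime client (region is an example)
bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

# A hypothetical tool in the Converse API's toolSpec format
tool_config = {
    "tools": [{
        "toolSpec": {
            "name": "analyze_slow_queries",  # illustrative tool name
            "description": "Find slow-running SQL queries in PostgreSQL",
            "inputSchema": {"json": {
                "type": "object",
                "properties": {"min_duration_ms": {"type": "number"}},
            }},
        }
    }]
}

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model ID
    messages=[{"role": "user", "content": [{"text": "Which queries are slow?"}]}],
    toolConfig=tool_config,
)
print(response["stopReason"])  # "tool_use" when the model requests a tool call
```

Because every supported model accepts this same request shape, swapping models is a one-line change to modelId.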
The MCP client consists of two main components:
  1. CLI Client: A command-line interface for interacting with MCP servers using Amazon Bedrock models
  2. Web UI: A browser-based interface built with FastAPI for a more visual interaction experience

Key Features

  • Multi-server support: Connect to multiple MCP servers simultaneously
  • Model selection: Choose from a wide range of Amazon Bedrock models including Claude, Llama, Mistral, and more
  • Tool discovery: Automatically discover and use tools available on connected servers
  • Conversation memory: Maintain context across multiple interactions
  • Rate limiting: Built-in protection against API throttling
  • Caching: Response caching to improve performance and reduce costs
  • Web interface: User-friendly browser-based UI for non-technical users
  • Query CLI: Run queries from your local desktop or IDE terminal against MCP servers using Amazon Bedrock

Important Security Note

When deploying your remote Custom MCP Client in a production environment:
  • Implement proper authentication: Add an authentication layer using Amazon Cognito, API Gateway authorizers, or similar solutions
  • Network isolation: Deploy your Custom MCP Client in a private subnet and restrict access as needed

Detailed Architecture Overview: MCP Bedrock Client

[Figure: MCP client Bedrock architecture diagram]

Let's take a closer look at each key component and how they interact with one another.

Core Components

1. BedrockClient

The BedrockClient class serves as the primary interface to Amazon Bedrock's API:
Responsibilities:
  • Establishes connections to Amazon Bedrock using boto3
  • Formats requests for Bedrock's Converse API
  • Handles model-specific configurations (tokens, temperature)
  • Implements exponential backoff and retries for API throttling
  • Caches responses to improve performance
  • Formats MCP tools into Bedrock-compatible format
Key Features:
  • Support for multiple model families (Claude, Llama, Mistral, etc.)
  • Region-specific configurations
  • Rate limiting and error handling
  • Response caching for efficiency
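As an illustration of the caching feature, a response cache can be as simple as keying on a hash of the request payload. This is a minimal sketch of the idea, not the project's exact implementation:

```python
import hashlib
import json

class ResponseCache:
    """Cache Converse responses keyed by a hash of the request payload."""

    def __init__(self):
        self._cache = {}

    def _key(self, model_id, messages):
        payload = json.dumps({"model": model_id, "messages": messages}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def get(self, model_id, messages):
        return self._cache.get(self._key(model_id, messages))

    def put(self, model_id, messages, response):
        self._cache[self._key(model_id, messages)] = response
```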

2. MCPServerConnection

The MCPServerConnection class manages connections to individual MCP servers:
Responsibilities:
  • Establishes and maintains SSE (Server-Sent Events) connections
  • Initializes MCP sessions
  • Discovers available tools on each server
  • Executes tool calls and processes results
  • Handles connection lifecycle (connect/disconnect)
Key Features:
  • Asynchronous communication with MCP servers
  • Tool discovery and schema validation
  • Error handling and reconnection logic
  • Result formatting and parsing
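With the MCP Python SDK, establishing an SSE connection and discovering tools looks roughly like the sketch below; the server URL is a placeholder:

```python
from mcp import ClientSession
from mcp.client.sse import sse_client

async def discover_tools(server_url: str):
    """Connect to an MCP server over SSE and list its tools."""
    async with sse_client(server_url) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            result = await session.list_tools()
            return [(tool.name, tool.description) for tool in result.tools]
```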

3. MCP-BedrockOrchestrator

The MCP-BedrockOrchestrator class orchestrates interactions between Bedrock and MCP servers:
Responsibilities:
  • Manages connections to multiple MCP servers
  • Routes tool calls to appropriate servers
  • Processes user queries through Bedrock
  • Handles tool call responses and formats them for Bedrock
  • Manages conversation flow and tool execution
Key Features:
  • Multi-server support
  • Tool mapping between Bedrock and MCP
  • Conversation management
  • Rate limiting for tool calls
  • Interactive CLI interface
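The core of this orchestration is a loop that calls Converse until the model stops requesting tools. A simplified sketch, where call_tool stands in for the routing logic to the right MCPServerConnection:

```python
async def process_query(bedrock, model_id, messages, tool_config, call_tool):
    """Loop until the model returns a final answer instead of a tool request."""
    while True:
        response = bedrock.converse(
            modelId=model_id, messages=messages, toolConfig=tool_config
        )
        output = response["output"]["message"]
        messages.append(output)
        if response["stopReason"] != "tool_use":
            return output  # final answer

        # Execute each requested tool and feed the results back to the model
        results = []
        for block in output["content"]:
            if "toolUse" in block:
                tool_use = block["toolUse"]
                result = await call_tool(tool_use["name"], tool_use["input"])
                results.append({"toolResult": {
                    "toolUseId": tool_use["toolUseId"],
                    "content": [{"text": str(result)}],
                }})
        messages.append({"role": "user", "content": results})
```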

4. ChatMemory

The ChatMemory class maintains conversation history and context:
Responsibilities:
  • Stores user and assistant messages
  • Maintains tool requests and results
  • Provides context for future interactions
  • Generates conversation summaries
  • Validates message history for consistency
Key Features:
  • Configurable memory size
  • Message validation
  • Conversation summarization
  • Context management
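A minimal sketch of such a memory, with a bounded window and basic validation (the real class adds summarization and richer checks):

```python
from collections import deque

class ChatMemory:
    """Keep a bounded window of conversation messages for context."""

    def __init__(self, max_messages: int = 20):
        self.messages = deque(maxlen=max_messages)

    def add(self, role: str, text: str):
        if role not in ("user", "assistant"):
            raise ValueError(f"Unexpected role: {role}")
        self.messages.append({"role": role, "content": [{"text": text}]})

    def history(self) -> list:
        return list(self.messages)
```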

5. API (FastAPI)

The API layer provides a web interface for browser access:
Responsibilities:
  • Exposes REST endpoints for client interactions
  • Renders HTML templates for web UI
  • Manages client sessions
  • Handles form submissions and responses
  • Provides model and server selection interfaces
Key Features:
  • HTML templates for user interface
  • Session management
  • Asynchronous request handling
  • Error handling and reporting
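A stripped-down version of such an endpoint might look like this; the route name and the process_query stub are illustrative, not the project's actual API surface:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    session_id: str
    message: str

async def process_query(session_id: str, message: str) -> str:
    """Stub standing in for the orchestrator's session-aware query handling."""
    return f"echo: {message}"

@app.post("/chat")
async def chat(request: ChatRequest):
    # Forward the user's message to the orchestrator and return the reply
    reply = await process_query(request.session_id, request.message)
    return {"response": reply}
```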

Component Interactions

The components interact in the following ways:

User Query Flow:
[Figure: user query flow diagram]
Flow Steps:
  1. User submits a query via CLI or web UI
  2. The query is sent to the MCP-BedrockOrchestrator
  3. Orchestrator adds the query to ChatMemory
  4. BedrockClient sends query and available tools to Amazon Bedrock
  5. If tool calls are needed, Bedrock requests them
  6. Orchestrator routes tool calls to appropriate MCPServerConnection
  7. MCPServerConnection executes tool calls on MCP Servers and returns results
  8. Results are sent back to Bedrock
  9. Final response is returned to Web UI/CLI
  10. Response is presented to the user

Getting Started

Prerequisites

  • Python 3.12 or higher
  • AWS credentials with access to Amazon Bedrock
  • Access to one or more MCP servers

Local Deployment

Configuration

The client uses a configuration file, mcp.json, to store server information. Here's an example configuration:
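The exact schema lives in the repository; a configuration along these lines, with illustrative field names and URL:

```json
{
  "servers": {
    "postgres-analyzer": {
      "url": "https://mcp.example.com/sse",
      "description": "PostgreSQL performance analyzer MCP server"
    }
  }
}
```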
You can add additional servers by editing this file or using the client's API.

Using the CLI Client

The CLI client provides an interactive interface for working with MCP servers:
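Launching it might look like the following; the entry-point name and flags are illustrative:

```bash
pip install -r requirements.txt
python cli.py --server https://mcp.example.com/sse
```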
Once connected, you can list the tools available on the server and ask questions about your database in natural language.
[Screenshot: listing available tools]
[Screenshot: exploring the database with natural language]

Using the Web Interface

The project also includes a web interface built with FastAPI:
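A typical way to start it, assuming the FastAPI app lives in a module named api:

```bash
uvicorn api:app --host 0.0.0.0 --port 8000
```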
Then open your browser to http://localhost:8000 to access the interface, which provides:
  • An option to initiate a new session
[Screenshot: initiating a new session]
  • A form to select and connect to MCP servers and Bedrock models
[Screenshot: choosing a server and model]
  • An option to add remote MCP servers
[Screenshot: adding a new server]
  • A chat interface for interacting with the models
[Screenshot: the chat interface]
  • Response history and context management
[Screenshot: response history]

Deploying with Amazon ECR, ECS, and Application Load Balancer

To make our solution scalable, reliable, and accessible, we deployed the architecture using containerization and Amazon's managed container services. Here's how we implemented the deployment:

Containerized Architecture and Image Repository

We packaged the entire application into a Docker container and stored it in Amazon Elastic Container Registry (ECR):
  • Build the Docker image using the project's Dockerfile
  • Push to Amazon ECR to securely store and version our container images
  • Configure image scanning to identify security vulnerabilities
  • Set up image lifecycle policies to manage older image versions
The ECR repository serves as the central source for our container images, ensuring consistency across deployments and environments.

CI/CD Pipeline for Image Deployment

Our continuous integration and deployment pipeline:
  • Builds the Docker image when code changes are committed
  • Tags the image with a version number and "latest" tag
  • Pushes the image to our private ECR repository
  • Updates the ECS service to use the new image
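The publishing steps in that pipeline boil down to the standard Docker and ECR commands; the account ID and region below are examples:

```bash
# Authenticate Docker to the private ECR registry
aws ecr get-login-password --region us-west-2 | \
  docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-west-2.amazonaws.com

# Build, tag, and push the image
docker build -t mcp-client .
docker tag mcp-client:latest 123456789012.dkr.ecr.us-west-2.amazonaws.com/mcp-client:latest
docker push 123456789012.dkr.ecr.us-west-2.amazonaws.com/mcp-client:latest
```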

Amazon ECS Deployment

We deployed the containerized application using Amazon Elastic Container Service (ECS), which pulls the container image directly from our ECR repository:
  • Task Definition references the ECR image URI (for example, 123456789012.dkr.ecr.us-west-2.amazonaws.com/mcp-client:latest)
  • ECS Service maintains the desired number of tasks running
  • Auto Scaling adjusts capacity based on CPU/memory utilization
Our ECS task definition includes:
  • CPU and memory allocations appropriate for the workload
  • Environment variables for configuration
  • IAM role with permissions to access Amazon Bedrock's Converse API
  • Networking configuration for security groups and subnets
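An abridged task definition capturing those settings might look like this; every value is an example:

```json
{
  "family": "mcp-client",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "512",
  "memory": "1024",
  "taskRoleArn": "arn:aws:iam::123456789012:role/mcp-client-task-role",
  "containerDefinitions": [{
    "name": "mcp-client",
    "image": "123456789012.dkr.ecr.us-west-2.amazonaws.com/mcp-client:latest",
    "portMappings": [{"containerPort": 8000, "protocol": "tcp"}],
    "environment": [{"name": "AWS_REGION", "value": "us-west-2"}]
  }]
}
```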

Load Balancing with ALB

In front of the ECS tasks, we placed an Application Load Balancer (ALB) that:

  • Distributes incoming traffic across multiple container instances
  • Performs health checks to ensure only healthy containers receive traffic
  • Terminates HTTPS connections for secure communication
  • Enables path-based routing for API endpoints

IAM Permissions for Bedrock Access

A critical component of our deployment is the IAM role attached to the ECS tasks. This role includes:

  • Permissions to call the Amazon Bedrock Converse API
  • Access to specific models configured in the application
  • Least-privilege permissions following security best practices
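In policy form, that amounts to something like the following; scope the Resource to the models you actually enable (the ARN pattern here is an example):

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": [
      "bedrock:InvokeModel",
      "bedrock:InvokeModelWithResponseStream"
    ],
    "Resource": "arn:aws:bedrock:*::foundation-model/anthropic.claude-3-5-sonnet-*"
  }]
}
```

Note that the Converse API is authorized through the bedrock:InvokeModel action; there is no separate Converse permission.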

Deployment Architecture Diagram in AWS

[Figure: deployment architecture in AWS]

Deployment Workflow

Our complete deployment workflow follows these steps:

  1. Image Building: Docker image is built from source code
  2. Image Publishing: Image is pushed to the Amazon ECR repository
  3. Deployment: ECS service pulls the image from ECR and deploys it as tasks
  4. Traffic Routing: ALB routes user traffic to the ECS tasks
  5. Model Access: ECS tasks use IAM roles to access Amazon Bedrock models

Benefits of This Deployment Approach

  • High Availability: The ALB and ECS ensure the application remains available even if individual containers fail
  • Scalability: The architecture can scale horizontally by adding more ECS tasks
  • Security: ECR provides private image storage, and IAM roles provide fine-grained access control
  • Version Control: ECR maintains image versions, enabling rollbacks if needed
  • Cost Efficiency: Resources scale based on demand, optimizing costs
  • Operational Simplicity: AWS manages the underlying infrastructure, reducing operational overhead
By deploying our bedrock-mcp-client using this architecture with Amazon ECR, ECS, and ALB, we've created a robust, scalable solution that can handle production workloads while maintaining security and reliability. The ECS tasks pull container images directly from our ECR repository and have the necessary permissions to access Amazon Bedrock models through the Converse API.

Advanced Features

Tool Mapping and Discovery

The client automatically maps tools from MCP servers to a format compatible with Bedrock models.
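A sketch of that mapping, using the MCP SDK's Tool objects with their name, description, and inputSchema attributes:

```python
def mcp_tools_to_bedrock(mcp_tools) -> dict:
    """Convert MCP tool definitions into the Converse API's toolSpec format."""
    return {
        "tools": [
            {
                "toolSpec": {
                    "name": tool.name,
                    "description": tool.description or tool.name,
                    "inputSchema": {"json": tool.inputSchema},
                }
            }
            for tool in mcp_tools
        ]
    }
```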

Conversation Memory Management

The client maintains conversation history and can generate summaries.
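One straightforward way to generate a summary is to ask the model itself to condense the transcript. A simple sketch, assuming the history contains plain text content blocks:

```python
def summarize_history(bedrock, model_id: str, history: list) -> str:
    """Ask the model to condense the conversation into a short summary."""
    transcript = "\n".join(
        f"{msg['role']}: {msg['content'][0]['text']}" for msg in history
    )
    response = bedrock.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{
            "text": f"Summarize this conversation in a few sentences:\n{transcript}"
        }]}],
    )
    return response["output"]["message"]["content"][0]["text"]
```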

Rate Limiting and Error Handling

Built-in rate limiting protects against API throttling.
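The usual pattern is exponential backoff on throttling errors, along these lines:

```python
import time
from botocore.exceptions import ClientError

def converse_with_retry(bedrock, max_retries: int = 5, **kwargs):
    """Retry Converse calls with exponential backoff when throttled."""
    for attempt in range(max_retries):
        try:
            return bedrock.converse(**kwargs)
        except ClientError as err:
            if err.response["Error"]["Code"] != "ThrottlingException":
                raise
            time.sleep(2 ** attempt)  # 1s, 2s, 4s, ...
    raise RuntimeError("Exceeded retry budget for Bedrock Converse")
```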

Supported Models

The client supports a wide range of Amazon Bedrock models, including:

  • Anthropic Claude 3, 3.5, and 3.7 (Opus, Sonnet, Haiku)
  • Amazon Nova (Pro, Lite, Micro)
  • Meta Llama 3.1 and 3.2
  • Mistral (Large, Small)
  • and more
Each model is available in multiple AWS regions, making it easy to choose the right model for your specific needs.

Use Cases

The Custom MCP Client for Amazon Bedrock is versatile and can be used for various applications:
  • Database Interaction: Connect to PostgreSQL databases through MCP and use natural language to query data
  • Document Analysis: Process and analyze documents using specialized MCP tools
  • Multi-agent Systems: Create systems where multiple specialized agents collaborate
  • Interactive Chatbots: Build chatbots that can access external tools and data sources
  • Data Processing Pipelines: Create workflows that combine AI with data processing tools

Conclusion

The Custom MCP Client provides a powerful foundation for building custom applications that leverage Amazon Bedrock's foundation models and the flexibility of the Model Context Protocol. By combining these technologies, developers can create sophisticated AI applications that can interact with multiple services, maintain context, and provide rich user experiences. Whether you're building a simple CLI tool or a complex web application, the modular architecture and comprehensive feature set make it easy to get started and scale as your needs grow.

Next Steps

  • Explore the GitHub repository for more details
  • Contribute to the project by submitting pull requests
  • Join the community to share your experiences and learn from others
  • Start building your own custom client applications today and unlock the full potential of Amazon Bedrock and MCP!

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
