
MCP in Action (2): Bridging LLMs and Real-World Data with Amazon Bedrock and Nova Act
This post demonstrates building an intelligent weather assistant using Amazon Bedrock's LLM capabilities with MCP and Nova Act. The solution evolves from keyword matching to natural language understanding, enabling interpretation of weather data, contextual responses, and condition-based recommendations.
Haowen Huang
Amazon Employee
Published Jun 2, 2025
Last Modified Jun 3, 2025
In my previous blog post, "MCP in Action (1): Developing a Weather AI Agent Using Amazon Nova Act", I introduced how to build a keyword-based weather assistant using the Model Context Protocol (MCP) and Amazon Nova Act.
Today, I'm excited to share the next evolution of this project: an intelligent weather assistant powered by Amazon Bedrock that can understand natural language queries and provide context-aware responses.
The original weather assistant relied on keyword matching to determine which MCP tool to call. While functional, this approach had clear limitations:
- Users needed to include specific keywords in their queries
- The assistant couldn't understand complex or nuanced questions
- Responses were limited to raw data without interpretation
- There was no context awareness between queries
By integrating Amazon Bedrock's large language model capabilities, our new assistant overcomes these limitations, creating a truly intelligent interface to weather data.
The core innovation in this updated version is how MCP serves as a bridge between Amazon Bedrock's LLMs and real-world data sources. The architecture consists of four main components, as shown in the following diagram:
- MCP Server: Provides tools for retrieving weather data using Amazon Nova Act
- LLM on Amazon Bedrock: Provides natural language understanding and response generation
- Amazon Nova Act: Handles web interaction and data extraction
- Agentic Assistant: Coordinates between the user, LLM, and MCP tools

Here's the flow of a typical interaction:
- The user submits a natural language query about Hong Kong's weather
- Amazon Bedrock analyzes the query and determines which tool to call
- The selected MCP tool is invoked with appropriate parameters
- Amazon Nova Act retrieves the requested data from the Hong Kong Observatory website
- The raw data is processed and returned to Amazon Bedrock
- Amazon Bedrock generates a human-friendly response with insights and recommendations
- The response is presented to the user
To run this solution, you'll need:
- Python 3.8+
- AWS account with access to Amazon Bedrock
- Amazon Nova Act API key
- Required Python packages (see requirements.txt)
1. Install the required packages:
2. Set up your environment variables:
To obtain an Amazon Nova Act API key, register for the Amazon Nova Act SDK preview, then acquire the key at: https://nova.amazon.com/act
3. Run the MCP server:
4. In a separate terminal, run the agentic assistant:
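The exact commands aren't shown above, so here's a plausible sequence. The environment-variable names are my assumption; check the repository's README for the exact ones.

```shell
# Step 1: install dependencies
pip install -r requirements.txt

# Step 2: set environment variables (names assumed; see the repo's README)
export NOVA_ACT_API_KEY="your-nova-act-api-key"
export AWS_REGION="us-west-2"

# Step 3: start the MCP server
python hk_weather_mcp_server.py

# Step 4: in a separate terminal, start the agentic assistant
python agentic_weather_assistant.py
```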
The complete codebase is available in my GitHub repository: https://github.com/hanyun2019/mcp-on-aws-demo/tree/main/hk-weather-mcp-and-nova-act-llm
Once the assistant is running, you can ask questions about Hong Kong's weather in natural language:
- "What's the weather like in Hong Kong today?"
- "Will it rain tomorrow in Hong Kong?"
- "Give me the forecast for the next week"
- "Are there any weather warnings in Hong Kong right now?"
- "What's the temperature and humidity in Hong Kong?"
- "Should I bring an umbrella if I'm going out this afternoon?"
The assistant will use the LLM to understand your query, select the appropriate tool, and provide a meaningful response based on the real-time data from the Hong Kong Observatory website.
Let's look at the key components of our implementation:
The MCP server (`hk_weather_mcp_server.py`) remains largely unchanged from the previous version. It provides three main tools:
- `get_hk_current_weather`: Retrieves current weather conditions
- `get_hk_forecast`: Gets the 9-day weather forecast
- `get_hk_weather_warnings`: Checks for active weather warnings
Each tool uses Amazon Nova Act to interact with the Hong Kong Observatory website and extract relevant information.
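As a rough illustration of what one of these tools involves, here is a sketch of the current-weather tool. The `parse_current_weather` helper, the Nova Act instruction text, and the starting page URL are my own illustrations, not the repository's actual code.

```python
import re


def parse_current_weather(raw: str) -> dict:
    """Pull temperature and humidity out of free text scraped from the page.

    Illustrative helper, not the repository's actual parsing logic.
    """
    temp = re.search(r"(-?\d+(?:\.\d+)?)\s*°?C", raw)
    humidity = re.search(r"(\d+)\s*%", raw)
    return {
        "temperature_c": float(temp.group(1)) if temp else None,
        "humidity_pct": int(humidity.group(1)) if humidity else None,
    }


def get_hk_current_weather() -> dict:
    """Retrieve current conditions from the Hong Kong Observatory website.

    The Nova Act call below is a sketch; the real tool in
    hk_weather_mcp_server.py may phrase its instructions differently.
    """
    # Deferred import: requires the Nova Act SDK preview and an API key
    from nova_act import NovaAct

    with NovaAct(starting_page="https://www.hko.gov.hk/en/") as nova:
        result = nova.act("Read the current temperature and relative humidity.")
    return parse_current_weather(str(result))
```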
The new `agentic_weather_assistant.py` is where the magic happens. It replaces the keyword-based client with an LLM-powered agent that can understand natural language queries.
Here's how it works:
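The code itself isn't reproduced here, but the first step is describing the three MCP tools to Bedrock's Converse API. A minimal sketch, with illustrative descriptions and empty input schemas (the real tools take no parameters as far as this post shows, but check the repository), might look like:

```python
MODEL_ID = "anthropic.claude-3-5-sonnet-20241022-v2:0"

# Tool specs mirror the MCP server's tool names; the descriptions
# and schemas here are my paraphrase, not the repository's exact text.
TOOL_CONFIG = {
    "tools": [
        {"toolSpec": {
            "name": "get_hk_current_weather",
            "description": "Current weather conditions in Hong Kong.",
            "inputSchema": {"json": {"type": "object", "properties": {}}},
        }},
        {"toolSpec": {
            "name": "get_hk_forecast",
            "description": "9-day weather forecast for Hong Kong.",
            "inputSchema": {"json": {"type": "object", "properties": {}}},
        }},
        {"toolSpec": {
            "name": "get_hk_weather_warnings",
            "description": "Active weather warnings in Hong Kong.",
            "inputSchema": {"json": {"type": "object", "properties": {}}},
        }},
    ]
}


def build_converse_request(query: str, system_prompt: str) -> dict:
    """Assemble the keyword arguments for bedrock_runtime.converse()."""
    return {
        "modelId": MODEL_ID,
        "system": [{"text": system_prompt}],
        "messages": [{"role": "user", "content": [{"text": query}]}],
        "toolConfig": TOOL_CONFIG,
    }
```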
When Amazon Bedrock decides to call a tool, the assistant handles the tool invocation and result processing:
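A hedged sketch of that handling, assuming the Converse API's `toolUse`/`toolResult` message format and a simple name-to-function registry of my own invention:

```python
def handle_tool_use(message: dict, tool_registry: dict) -> dict:
    """Execute any toolUse blocks in an assistant message and package the
    results as a follow-up user message in the Converse toolResult format.

    tool_registry maps tool names to plain Python callables — an
    illustrative stand-in for however the real assistant dispatches
    to the MCP server.
    """
    result_blocks = []
    for block in message.get("content", []):
        if "toolUse" not in block:
            continue
        tool_use = block["toolUse"]
        handler = tool_registry[tool_use["name"]]
        output = handler(**tool_use.get("input", {}))
        result_blocks.append({
            "toolResult": {
                "toolUseId": tool_use["toolUseId"],
                "content": [{"text": str(output)}],
            }
        })
    return {"role": "user", "content": result_blocks}
```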
To ensure robustness, the assistant includes a fallback mechanism that reverts to keyword-based tool selection if the LLM approach fails:
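A minimal sketch of such a fallback (the keyword lists are illustrative, not the repository's exact ones):

```python
def fallback_tool_selection(query: str) -> str:
    """Keyword heuristic in the spirit of the original assistant,
    used only if the LLM-based tool selection fails."""
    q = query.lower()
    if any(word in q for word in ("warning", "typhoon", "signal", "alert")):
        return "get_hk_weather_warnings"
    if any(word in q for word in ("forecast", "tomorrow", "week", "next")):
        return "get_hk_forecast"
    # Default: current conditions
    return "get_hk_current_weather"
```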
A key aspect of our implementation is the integration with Anthropic's Claude 3.5 Sonnet model through Amazon Bedrock. Here's a detailed breakdown of how this integration works:
First, we initialize the Bedrock client using boto3:
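The snippet isn't shown above; it would look roughly like this. The region is an assumption — use one where you have Claude 3.5 Sonnet access.

```python
import boto3

# "us-west-2" is an assumption; pick a region where Bedrock
# offers Claude 3.5 Sonnet and your account has model access.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-west-2")
```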
This creates a client that can communicate with the Amazon Bedrock service. The `region_name` parameter specifies which AWS region to use for Amazon Bedrock.
In the `AgenticWeatherAssistant` class initialization, we specify that we want to use Claude 3.5 Sonnet:
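A sketch of that initialization — the class name follows the post, but the body is my reconstruction, not the repository's actual code:

```python
class AgenticWeatherAssistant:
    # Bedrock model identifier for Claude 3.5 Sonnet v2
    model_id = "anthropic.claude-3-5-sonnet-20241022-v2:0"

    def __init__(self, region: str = "us-west-2"):
        # Deferred import so the sketch can be read without boto3 installed;
        # the region default is an assumption.
        import boto3

        self.bedrock = boto3.client("bedrock-runtime", region_name=region)
```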
The `model_id` parameter identifies the specific model version we want to use. In this case, we're using Claude 3.5 Sonnet with the version identifier "20241022-v2:0".
We provide a system prompt that defines the assistant's behaviour and capabilities:
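The actual prompt isn't reproduced here; one in the same spirit might read:

```python
# Illustrative system prompt — the repository's wording will differ.
SYSTEM_PROMPT = """\
You are a helpful weather assistant for Hong Kong.
You can call these tools to fetch live data from the Hong Kong Observatory:
- get_hk_current_weather: current conditions
- get_hk_forecast: 9-day forecast
- get_hk_weather_warnings: active warnings
Interpret the raw data for the user and offer practical recommendations
(for example, whether to carry an umbrella) when relevant.
If a tool fails, say so rather than inventing data.
"""
```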
This system prompt guides Claude 3.5 on how to behave and what tools it has available.
After receiving the response from Claude 3.5, we process it to extract text responses and tool calls:
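A hedged sketch of that processing step, assuming the Converse API's content-block format (the function name is mine):

```python
def split_response(message: dict):
    """Separate an assistant message's content blocks into
    plain text and toolUse requests."""
    texts, tool_calls = [], []
    for block in message.get("content", []):
        if "text" in block:
            texts.append(block["text"])
        if "toolUse" in block:
            tool_calls.append(block["toolUse"])
    return "\n".join(texts), tool_calls
```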
Claude 3.5 can respond with text or indicate that it wants to use a tool. We handle both cases appropriately.
When Claude decides to use a tool, we execute the tool call and send the results back to Claude for interpretation:
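Putting the pieces together, a single tool round-trip might look like the following sketch. The helper name, the message-list handling, and the `tool_registry` callable mapping are my own; the Converse request/response shapes follow the Bedrock Runtime API.

```python
def converse_with_tools(client, model_id, system_prompt, messages,
                        tool_config, tool_registry):
    """One full tool round-trip: ask Claude, run any requested tools,
    then ask again with the results so Claude can write the final answer."""
    response = client.converse(
        modelId=model_id,
        system=[{"text": system_prompt}],
        messages=messages,
        toolConfig=tool_config,
    )
    assistant_msg = response["output"]["message"]
    messages.append(assistant_msg)

    if response.get("stopReason") != "tool_use":
        return assistant_msg  # plain text answer, no tools needed

    # Execute each requested tool and package the output as toolResult blocks
    results = []
    for block in assistant_msg["content"]:
        if "toolUse" in block:
            tu = block["toolUse"]
            output = tool_registry[tu["name"]](**tu.get("input", {}))
            results.append({"toolResult": {
                "toolUseId": tu["toolUseId"],
                "content": [{"text": str(output)}],
            }})
    messages.append({"role": "user", "content": results})

    # Second call: Claude now sees the tool output and writes the final reply
    final = client.converse(
        modelId=model_id,
        system=[{"text": system_prompt}],
        messages=messages,
        toolConfig=tool_config,
    )
    return final["output"]["message"]
```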
This process allows Claude 3.5 to:
- Request specific data using tools
- Receive the results of those tool calls
- Generate a final response that incorporates the tool results
What makes this architecture particularly powerful is how MCP serves as a standardized bridge between the LLM and external data sources:
- Tool Discovery: The client can dynamically discover available tools on the server
- Standardized Interface: Tools have consistent naming, descriptions, and input schemas
- Abstraction Layer: The LLM doesn't need to know implementation details of data retrieval
- Separation of Concerns: The server handles data retrieval while the client focuses on user interaction
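As one concrete example of tool discovery, here is a sketch using the MCP Python SDK's stdio client; the server command assumes the file name used earlier in this post.

```python
async def discover_tools():
    """Connect to the weather MCP server over stdio and list its tools."""
    # Deferred imports: requires the `mcp` package
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(
        command="python", args=["hk_weather_mcp_server.py"]
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            listing = await session.list_tools()
            # Expected to include the three weather tools defined above
            return [tool.name for tool in listing.tools]
```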
This separation allows us to:
- Add new data sources without changing the client
- Update the web scraping logic without affecting the user experience
- Use the same tools with different LLM providers
- Create multiple client applications that leverage the same tools
The benefits of using Claude via Amazon Bedrock are:
- Advanced Natural Language Understanding: Claude 3.5 can understand complex, nuanced queries about weather
- Contextual Tool Selection: The model intelligently selects which tool to use based on the query
- Insightful Responses: Claude can interpret raw weather data and provide meaningful insights
- Actionable Recommendations: The model can suggest actions based on weather conditions
- Conversational Context: Claude can maintain context across multiple turns of conversation
To help you see how MCP, Amazon Nova Act, and Amazon Bedrock's Claude model fit together in practice, I've recorded a YouTube video demonstration showcasing the assistant's capabilities:
By combining MCP, Amazon Nova Act, and Amazon Bedrock's Claude model, we've created an intelligent weather assistant that bridges the gap between large language models and real-world data sources. This architecture demonstrates how MCP can serve as a standardized interface for connecting AI models with external systems, enabling the creation of powerful, context-aware applications.
The evolution from a keyword-based assistant to an LLM-powered agent represents a significant leap in capability and user experience. Users can now interact with the assistant in natural language and receive insightful, context-aware responses based on real-time data.
This pattern of using MCP as a bridge between LLMs and data sources can be applied to many other domains beyond weather, opening up exciting possibilities for building intelligent, data-driven applications.
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.