Q-Bits: Enhance Amazon Q Developer CLI's Context Using MCP for Web Crawling


Learn how to enhance Amazon Q Developer CLI with web content by building a custom MCP server that crawls websites and automatically adds them to your context, enabling more informed AI assistance without leaving your terminal.

Dor Fibert
Amazon Employee
Published May 11, 2025
Last Modified Jun 22, 2025
Large Language Models (LLMs) like those powering Amazon Q Developer are incredibly powerful, but they have inherent limitations. They can only work with the information they were trained on and what you explicitly provide them. This is where context becomes crucial.
In this Q-Bits post, we'll explore how to supercharge your Amazon Q Developer CLI experience by leveraging the Model Context Protocol (MCP) to dynamically fetch web content and use it as context. By the end of this tutorial, you'll be able to point Amazon Q at any webpage and have it incorporate that information into its responses - all without leaving your terminal.

Context in Amazon Q Developer CLI

Amazon Q Developer CLI is a powerful tool that brings AI assistance directly to your terminal. One of its most valuable features is the ability to provide context to guide its responses. Context helps Amazon Q Developer understand your specific environment, project requirements, coding standards, and other important information that might not be part of its training data.

Context Management

Context in Amazon Q Developer CLI works through profiles, which contain sets of context files that influence how Amazon Q Developer interacts with you. These profiles can be customized to fit different projects, roles, or workflows, making your AI assistant more relevant and helpful for your specific needs.
You can manage profiles using the /profile command:
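The exact subcommands can vary by version, but a typical session inside `q chat` looks something like this (the profile name `web-project` is purely illustrative):

```
> /profile list
> /profile create web-project
> /profile set web-project
```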
Each profile can have its own unique context files, allowing you to quickly switch between different sets of guidelines and information depending on your current task.
Adding context to a profile is straightforward. You can use the /context add command to include files or directories that contain relevant information:
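For example, inside a `q chat` session you might add a hypothetical guidelines file (the file name here is an assumption):

```
> /context add .amazonq/project-guidelines.md
```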
Amazon Q Developer CLI also has a special default context rule: any markdown files placed in the .amazonq/rules/ directory are automatically included in your context. This makes it easy to add project-specific guidelines, coding standards, or other information that you want Amazon Q Developer to consider when providing assistance.
However, manually adding context has limitations:
  1. You need to manually manage the content of context files
  2. The information might become outdated
  3. You can't easily incorporate web content without first downloading it
This is where MCP comes in, allowing you to dynamically fetch and use web content as context.

MCP Integration: Extending Amazon Q Developer's Capabilities

MCP is an open protocol that standardizes how applications provide tools and resources to LLMs. Think of it as a "USB-C port for AI applications" - a standardized way to connect AI models to different data sources and tools.
With Amazon Q Developer CLI support for MCP, you can extend its capabilities through custom MCP servers to integrate it with all kinds of applications and libraries. These servers can provide additional tools, prompts, and resources that Amazon Q Developer can use to better assist you.
In this post, we'll create a simple MCP server that enables Amazon Q Developer to crawl web pages and add their content as context.

Writing a Simple Crawl MCP Server

Let's create our web crawling MCP server. This server will provide a single tool called "crawl" that can fetch content from any URL and return it as markdown text.
We'll use Astral's uv, a modern Python package manager written in Rust that offers significantly faster performance than traditional tools like pip. It has become the standard tool for Python MCP servers and is the package manager recommended by the Python MCP SDK.
First, let's initialize a project and install the necessary dependencies:
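A minimal setup might look like the following; the project name `crawl-mcp` is an assumption, and your layout may differ:

```
$ uv init crawl-mcp
$ cd crawl-mcp
$ uv add "mcp[cli]" crawl4ai
```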
After installation, we need to run the post-installation setup for crawl4ai:
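crawl4ai ships a setup command that installs its browser dependencies (it uses Playwright under the hood); with uv it can be invoked roughly like this:

```
$ uv run crawl4ai-setup
```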
The crawl4ai library is a powerful tool for extracting content from web pages in a format suitable for LLMs. It handles the complexities of web scraping, content extraction, and markdown conversion.
Now, here's the implementation of our web crawler MCP server:

Registering the MCP Server with Amazon Q Developer

Now that we have our MCP server ready, we need to register it with Amazon Q Developer CLI. This is done by creating an MCP configuration file in the .amazonq directory.
Create a file named mcp.json in the .amazonq directory with the following content:
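A configuration along these lines registers the server; the directory path and script name are placeholders you should replace with your own (check the Amazon Q Developer CLI documentation for the exact schema supported by your version):

```json
{
  "mcpServers": {
    "crawl": {
      "command": "uv",
      "args": ["--directory", "/path/to/crawl-mcp", "run", "server.py"]
    }
  }
}
```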

Using the Crawl Tool Directly

Once the MCP server is registered, you can use the crawl tool directly from Amazon Q Developer CLI. Let's try crawling a simple website:
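An interaction inside `q chat` might look roughly like this; the site and wording are illustrative:

```
> Use the crawl tool to fetch https://example.com and summarize the page
```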
This example shows how Amazon Q Developer CLI can directly use the crawl tool to fetch web content. However, the content is only displayed in the chat and not saved for future use. In the next section, we'll see how to enhance this by creating context files from the crawled content.

Creating Context Files Using the Crawler

While using the crawl tool directly is useful for viewing web content, for our use case we want to save that content as context for future use. Let's add a prompt to our server that will:
  1. Take a URL as input
  2. Use the crawl tool to fetch the content
  3. Save the content as a markdown file in the .amazonq/rules/ directory
Here's how to add the prompt to our server:
This prompt creates a natural language instruction that tells Amazon Q Developer to:
  1. Use the crawl tool we defined earlier
  2. Save the result to a markdown file in the .amazonq/rules/ directory
  3. Choose an appropriate filename based on the content
We can use the /prompts list command to list all the available prompts:
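Inside a `q chat` session:

```
> /prompts list
```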
Now let's use the prompt to create a new context file from a given URL:
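Assuming the prompt is named `crawl_to_context` and your CLI version invokes MCP prompts with the `@` syntax, this might look like the following (URL illustrative):

```
> @crawl_to_context https://example.com
```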
Once the file is saved in the .amazonq/rules/ directory, Amazon Q Developer automatically includes it in its context. You can now ask questions about the content, and Amazon Q Developer will use this information to provide more accurate responses.
For example, you could now ask:
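An illustrative follow-up question, once the crawled page has landed in `.amazonq/rules/`:

```
> Based on the page we just saved to .amazonq/rules/, what are the key points?
```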

Next Steps

Our current implementation only scrapes the content from a single web page. While this is useful for many scenarios, there are cases where you might want to crawl multiple pages from a website to build a more comprehensive context.
As a next step you can implement deep crawling using Crawl4AI's different crawling strategies with a specified depth.
By taking on these enhancements, you'll build a truly powerful context creation tool that can extract knowledge from entire documentation sites, blogs, or knowledge bases, making Amazon Q Developer an even more valuable assistant for your specific needs.

Conclusion

In this Q-Bits post, we've explored how to extend Amazon Q Developer CLI's capabilities using the Model Context Protocol (MCP). By creating a simple web crawling MCP server, we've enabled Amazon Q Developer to dynamically fetch web content and use it as context.
This approach offers several advantages:
  1. Dynamic Context: You can incorporate the latest information from websites without manually downloading files
  2. Seamless Integration: The MCP server integrates directly with Amazon Q Developer CLI
  3. Extensibility: You can enhance the crawler with additional features like authentication, filtering, or specialized parsing for different types of websites
  4. Reusability: The prompt makes it easy to reuse the crawler for different websites
This is just one example of how MCP can extend Amazon Q Developer's capabilities. You could create MCP servers for other tasks like:
  • Querying databases
  • Accessing APIs
  • Processing images
  • Analyzing code repositories
The possibilities are endless, and the Model Context Protocol provides a standardized way to connect Amazon Q Developer to any data source or tool with an API.
Give it a try and see how adding dynamic web content as context can make Amazon Q Developer even more helpful for your specific needs!

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
