
Create your own custom AI agent with Amazon Bedrock and Converse API

Building your own custom AI agent is easy to do when you use Amazon Bedrock and the Converse API.

Ross Alas
Amazon Employee
Published Sep 29, 2024
Follow me on LinkedIn for more AWS & GenAI-related content: www.linkedin.com/in/ross-alas
Autonomous agents have immense potential to improve employee productivity, automate mundane tasks, and create new content. An agent can understand your request and, using the tools it has access to, fulfill that request without you having to program the exact steps.
In this blog, I’ll guide you through creating a custom AI agent using Amazon Bedrock and the Converse API. You can interact with it in your browser through a Streamlit app. The agent is equipped with a RetrieveUrlTool that retrieves web pages, which enables many use cases such as:
  1. Retrieving web pages
  2. Summarizing web pages
  3. Sentiment analysis of social media posts
  4. and more!

Overview of the solution

The solution consists of several files that you’ll need for your agent.
  • agent.py - core agent logic
  • tools.py - tools the agent can use
  • utils.py - helper functions
  • streamlit_app.py - an interactive UI
Figure 1. Core agent logic

The agent.py file contains the core logic that runs the agent. Walking through the logic step by step (a sketch of the Converse API message shapes follows this list):
  1. The user enters a message, which is then formatted into a User Message as required by the Converse API
  2. The User Message is appended to a Message List
  3. An LLM is called through Converse API with the Message List and the tools that the agent can use
  4. Amazon Bedrock responds with an Assistant Message, which can contain text and toolUse blocks, along with a stopReason.
  5. The stopReason is examined. A stopReason of tool_use means the LLM has decided to call a tool; end_turn means the LLM is done calling tools and is ending its turn in the chat. There are other stopReasons; refer to the Converse API documentation.
  6. If the stopReason is tool_use, a tool use handler calls the appropriate function with the parameters provided by the LLM.
  7. The tool is executed and responds with the toolResult.
  8. The toolResult is wrapped in a User Message and appended to the Message List.
  9. The Converse API is called once more, and steps 4 through 9 repeat until a stopReason of end_turn is received.
  10. Once a stopReason of end_turn is received, it means the Agent has completed its processing.
  11. The final response is returned to the user, who can then continue to enter more messages, and the process begins again from step 1.
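For reference, this is roughly what those pieces look like in code. The field names follow the Converse API, while the tool name, toolUseId, and URL are illustrative placeholders:
```python
# A User Message (steps 1-2): the user's text wrapped in a content block
user_message = {"role": "user", "content": [{"text": "Summarize https://example.com"}]}

# An Assistant Message with a toolUse block (steps 4-5, stopReason == "tool_use")
assistant_message = {
    "role": "assistant",
    "content": [
        {
            "toolUse": {
                "toolUseId": "tooluse_abc123",            # illustrative ID
                "name": "retrieve_url",                   # illustrative tool name
                "input": {"url": "https://example.com"},
            }
        }
    ],
}

# The toolResult wrapped in a User Message (steps 7-8)
tool_result_message = {
    "role": "user",
    "content": [
        {
            "toolResult": {
                "toolUseId": "tooluse_abc123",
                "content": [{"text": "...page content as markdown..."}],
            }
        }
    ],
}
```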

Figure 2. The Streamlit UI for the demo
The Streamlit UI in Fig. 2 shows an example of how the agent can be used. The user asks the agent to summarize a given link. The LLM automatically determines that it needs a tool and calls the RetrieveUrlTool to retrieve the content. The tool fetches the web page, the content is preprocessed with html2text to convert it to markdown, and the result is returned to the agent as the toolResult. The agent then processes the toolResult and creates the final response, which includes a summary of the web page.
Disclaimer: You may incur AWS charges, and the following contains only sample code that is not directly suitable for production. See the MIT-0 LICENSE for license information.

Prerequisites

Create your Python environment and install packages
First, create your project folder and environment using virtualenv, conda, or your preferred environment manager. Activate the environment and install the packages using pip.
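For example, assuming a virtualenv named .venv and the packages used later in this post (boto3 for the Bedrock Converse API, streamlit for the UI, and requests and html2text for the RetrieveUrlTool):
```
python -m venv .venv
source .venv/bin/activate
pip install boto3 streamlit requests html2text
```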
Create agent.py
The agent.py file contains the main agent logic. Create an agent.py file and include the following code:
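The sketch below is a minimal version of that loop, assuming a recent boto3, an Anthropic Claude 3 model ID, and TOOL_CONFIG / run_tool helpers imported from the tools.py shown in the next section; the function and variable names are illustrative, not the original post's exact code.
```python
import boto3

# Hypothetical helpers defined in tools.py below
from tools import TOOL_CONFIG, run_tool

# Assumption: any model that supports the Converse API and tool use works here
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"

bedrock = boto3.client("bedrock-runtime")


def run_agent(user_input: str, messages: list) -> str:
    """Run one user turn through the Converse API loop until end_turn."""
    # Steps 1-2: wrap the user's text in a User Message and append it to the Message List
    messages.append({"role": "user", "content": [{"text": user_input}]})

    while True:
        # Step 3: call the model with the Message List and the tool definitions
        response = bedrock.converse(
            modelId=MODEL_ID,
            messages=messages,
            toolConfig=TOOL_CONFIG,
        )

        # Step 4: keep the Assistant Message in the conversation history
        assistant_message = response["output"]["message"]
        messages.append(assistant_message)

        # Steps 5 and 10-11: if the model is not asking for a tool, return its text
        if response["stopReason"] != "tool_use":
            return "".join(
                block["text"] for block in assistant_message["content"] if "text" in block
            )

        # Steps 6-8: execute each toolUse block and send the results back as a User Message
        tool_results = []
        for block in assistant_message["content"]:
            if "toolUse" in block:
                tool_use = block["toolUse"]
                tool_results.append(
                    {
                        "toolResult": {
                            "toolUseId": tool_use["toolUseId"],
                            "content": [{"text": run_tool(tool_use["name"], tool_use["input"])}],
                        }
                    }
                )
        messages.append({"role": "user", "content": tool_results})
        # Step 9: loop back and call the Converse API again
```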
Create tools.py
The tools.py file contains the tools that the agent will have access to. Create another file called tools.py and include the following:
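A minimal sketch of what tools.py can contain, assuming the requests and html2text packages; the tool name, schema, and dispatch function are illustrative:
```python
import html2text
import requests

# Tool definitions in the Converse API toolConfig format
TOOL_CONFIG = {
    "tools": [
        {
            "toolSpec": {
                "name": "retrieve_url",
                "description": "Retrieve the contents of a web page as markdown.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {
                            "url": {"type": "string", "description": "The URL to retrieve."}
                        },
                        "required": ["url"],
                    }
                },
            }
        }
    ]
}


def retrieve_url(url: str) -> str:
    """Fetch a web page and convert its HTML to markdown with html2text."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return html2text.html2text(response.text)


def run_tool(name: str, tool_input: dict) -> str:
    """Dispatch a toolUse request from the model to the matching function."""
    if name == "retrieve_url":
        return retrieve_url(tool_input["url"])
    raise ValueError(f"Unknown tool: {name}")
```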
Create utils.py
The utils.py file contains a helper function to extract the content of an XML tag in the model's output. Include the following:
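A minimal sketch, assuming the model is prompted to wrap parts of its answer in XML-style tags (for example <summary>...</summary>); the function name is illustrative:
```python
import re
from typing import Optional


def extract_tag(text: str, tag: str) -> Optional[str]:
    """Return the content of the first <tag>...</tag> block in text, or None if absent."""
    match = re.search(rf"<{tag}>(.*?)</{tag}>", text, re.DOTALL)
    return match.group(1).strip() if match else None
```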
Create streamlit_app.py
The streamlit_app.py contains a UI that allows you to interact with the agent. Include the following:
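A minimal sketch of a chat UI, assuming the run_agent function from the agent.py sketch above; Streamlit's chat elements keep the conversation in session state across reruns:
```python
import streamlit as st

from agent import run_agent  # hypothetical entry point from the agent.py sketch

st.title("Custom AI Agent with Amazon Bedrock")

# Keep the Converse API Message List and the rendered chat across reruns
if "messages" not in st.session_state:
    st.session_state.messages = []
if "chat" not in st.session_state:
    st.session_state.chat = []

# Replay the conversation so far
for entry in st.session_state.chat:
    with st.chat_message(entry["role"]):
        st.markdown(entry["text"])

# Handle a new user message
if prompt := st.chat_input("Ask the agent something..."):
    st.session_state.chat.append({"role": "user", "text": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    reply = run_agent(prompt, st.session_state.messages)

    st.session_state.chat.append({"role": "assistant", "text": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)
```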
Let’s run it!
Once you have created the files above, you can run the Streamlit app:
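From the activated environment, in the folder containing the files:
```
streamlit run streamlit_app.py
```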
You should now be able to access the web app at http://localhost:8501.

What's Next?

Try creating your own tools and see what you can do!
Read more about the Converse API.
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
