Build generative AI agents with LangChain and Anthropic Claude 3 models on Amazon Bedrock

Learn how to leverage the new native tool-calling capabilities in the LangChain AWS package, using Anthropic Claude 3 models with Amazon Bedrock.

Laith Al-Saadoon
Amazon Employee
Published Jun 27, 2024
Customers often ask me whether LangChain supports Amazon Bedrock tool use, and it absolutely does! But why does that matter? Tool use is crucial for agent behavior with large language models (LLMs) because it lets the models perform tasks and operations beyond their inherent text generation and conversation capabilities, improving both their functionality and their accuracy. Agents rely on external tools and APIs to carry out specific tasks; for example, an LLM agent can use a calculator tool to solve math problems or query a database to retrieve information. By integrating custom tools, LLMs can execute precise commands, interact with external systems, and deliver more practical, context-aware responses.
LangChain provides base classes, pre-built toolkits, agent runtime classes, and sample code that make it easy to build agent applications with LLMs. Combining LangChain, Amazon Bedrock, and Anthropic Claude 3 models lets you build agents while easily accessing leading LLMs on AWS.

What’s New?

The LangChain AWS package now supports native tool-calling using Anthropic Claude 3 models. This enhancement allows developers to seamlessly integrate powerful LLMs with custom tools and toolkits, providing a robust and flexible environment for various applications using LangChain and LangGraph.

How It Works

Let me walk you through how you can set up and use this feature in your projects:

Set Up Your Environment:

Make sure you have the necessary dependencies installed, including `boto3`, `langchain_aws`, and other related packages.
```bash
pip install boto3 langchain langchain_aws mypy-boto3-bedrock-runtime pydantic
```

Define Your Models and Parameters:

Use Pydantic models to define inference parameters and models for validation and default settings. This approach helps maintain consistency and reliability in your AI model configurations.
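
For example, a small Pydantic model can hold the inference parameters you plan to send to Claude 3. This is a minimal sketch; the field names and defaults are illustrative, not requirements of the package:

```python
from pydantic import BaseModel, Field


class InferenceParameters(BaseModel):
    """Illustrative inference parameters for a Claude 3 model on Bedrock."""

    temperature: float = Field(default=0.0, ge=0.0, le=1.0)
    max_tokens: int = Field(default=1024, gt=0)
    top_p: float = Field(default=0.9, ge=0.0, le=1.0)


# Validated defaults you can reuse across model configurations.
params = InferenceParameters()
```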

Get Your Bedrock Client:

Set up a Bedrock runtime client using boto3 to leverage the Claude 3 models. This involves configuring your AWS session and client.
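
A minimal sketch, assuming your AWS credentials are already configured and Claude 3 is available in the Region you choose (us-east-1 is just an example):

```python
import boto3

# Create a session and a Bedrock runtime client; adjust the Region as needed.
session = boto3.Session(region_name="us-east-1")
bedrock_runtime = session.client("bedrock-runtime")
```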

Integrate with LangChain:

Use the ChatBedrock class to connect your Bedrock runtime client to LangChain. This class lets you pass custom tools and prompts to the model seamlessly.
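
Here is one way to wire that up, reusing the `bedrock_runtime` client and `params` model from the previous steps and pointing at Claude 3 Sonnet (swap in whichever Claude 3 model ID you have access to):

```python
from langchain_aws import ChatBedrock

# Wrap the Bedrock runtime client in a LangChain chat model.
llm = ChatBedrock(
    client=bedrock_runtime,
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    model_kwargs=params.model_dump(),
)
```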

Define and Use Custom Tools:

Create custom tools, such as a simple math operation tool, and use them with your chat model.
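
A minimal sketch of such a tool, using LangChain's `@tool` decorator with a hypothetical `multiply` function and binding it to the chat model from the previous step:

```python
from langchain_core.tools import tool


@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the product."""
    return a * b


# Bind the tool so Claude 3 can request it via native tool calling.
llm_with_tools = llm.bind_tools([multiply])
```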

Invoke and Execute:

Finally, you can invoke the chat model with your custom tools and get the desired results.
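
Continuing the sketch, the tool-bound model returns a message whose `tool_calls` you can inspect and execute yourself (the prompt and expected output here are illustrative):

```python
# Ask a question that should trigger the multiply tool.
response = llm_with_tools.invoke("What is 8 multiplied by 7?")

# Claude 3 returns a structured tool call rather than a final text answer.
for call in response.tool_calls:
    if call["name"] == "multiply":
        result = multiply.invoke(call["args"])
        print(result)  # expected: 56
```

In a full agent, a runtime such as LangGraph handles this loop of executing tool calls and feeding the results back to the model for you.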

Wrapping up

In this post, you learned how to set up and use the LangChain AWS package with native tool-calling support for Claude 3 models on Amazon Bedrock. By following these steps, you can start building agents that take advantage of the rest of the LangChain ecosystem and its capabilities.
Feel free to leave a comment with your experiences or any questions you have as you try out these features!
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
