

Conversational AI: Add chatting capabilities to your app

Discover how to implement multi-turn conversations and maintain context using Amazon Bedrock's Converse API and the AWS SDK for JavaScript.

Dennis Traub
Amazon Employee
Published May 30, 2024
Last Modified Mar 5, 2025
[ Open the Java edition | Jump directly to the code 💻 ]
Welcome to Part 2 of my tutorial series on Amazon Bedrock's Converse API!
Today, we'll explore how to implement a conversation using Amazon Bedrock's Converse API. You'll learn how to send messages to an AI model, maintain a conversation history, and handle multi-turn conversations. By the end of this tutorial, you'll have the skills to add chatting capabilities to your own applications.
Note: The examples in this edition use JavaScript. I've also prepared a Java edition, and many more examples in Python, C#, etc.

Series overview

This series guides you through Amazon Bedrock's Converse API:
  • In Part 1: Getting Started, you learned how to send your first request.
  • In Part 2: Conversational AI (this post), I'll show you how to implement conversational turns.
  • In Part 3: Customizing AI Behavior, we'll configure the model with a system prompt and additional inference parameters.
Future posts will cover extracting invocation metrics and metadata, sending and receiving model-specific parameters and fields, processing model responses in real-time, the new tool-use feature, and more.
Let's dive in and start building! 💻

Step-by-step: Add a conversation history and chat with the AI

Prerequisites

Before you begin, ensure all prerequisites are in place. You should have:
  • The AWS CLI installed and configured with your credentials
  • The latest stable version of Node.js and npm installed
  • Requested access to the model you want to use

Step 1: Install the Bedrock Runtime client

First, install the Bedrock Runtime client from the AWS SDK for JavaScript v3:
npm install @aws-sdk/client-bedrock-runtime

Step 2: Import required modules

In a new JavaScript file, import the required modules:
import {
  BedrockRuntimeClient,
  ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";

Step 3: Create a BedrockRuntimeClient instance

Create an instance of the BedrockRuntimeClient, specifying the AWS region where the model is available:
const client = new BedrockRuntimeClient({ region: "us-east-1" });

Step 4: Specify the model ID

Specify the ID of the model you want to use. In this example, we'll use Claude 3 Haiku:
const modelId = "anthropic.claude-3-haiku-20240307-v1:0";
You can find the complete list of models supporting the Converse API and a list of all available model IDs in the documentation.

Step 5: Send an initial message

Prepare the first user message and add it to a new conversation array:
const firstUserMessage = "What is the capital of Australia?";
const conversation = [
  {
    role: "user",
    content: [{ text: firstUserMessage }],
  },
];
The conversation array represents the conversation history, with each element containing a role (e.g., "user" or "assistant") and at least one content block with the message text.
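As an aside, the message shape described above can be wrapped in a small convenience function. This is just a sketch; `buildUserMessage` is an illustrative helper and not part of the AWS SDK:

```javascript
// Illustrative helper (not part of the SDK): builds a user message
// in the shape the Converse API expects.
function buildUserMessage(text) {
  return { role: "user", content: [{ text }] };
}

// A conversation is then simply an array of such messages:
const conversationExample = [buildUserMessage("What is the capital of Australia?")];
```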
Send the conversation to the model using a ConverseCommand:
const firstResponse = await client.send(
  new ConverseCommand({ modelId, messages: conversation })
);
The ConverseCommand sends the conversation to the specified model and returns its response.
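To make the next step easier to follow, here is an illustrative sketch of the part of the response object this tutorial uses. The real response contains additional fields, which are omitted here:

```javascript
// Illustrative shape of the part of a Converse API response used in this
// tutorial (other fields of the real response are omitted):
const exampleResponse = {
  output: {
    message: {
      role: "assistant",
      content: [{ text: "The capital of Australia is Canberra." }],
    },
  },
};

// The reply text is nested under output.message.content[0].text:
const exampleText = exampleResponse.output.message.content[0].text;
```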
Print out the model's response:
const firstResponseText = firstResponse.output.message.content[0].text;
console.log(`First response: ${firstResponseText}`);

Step 6: Add the model response to the conversation

To maintain context for the next request, add the model's response to the conversation array:
conversation.push(firstResponse.output.message);

Step 7: Send a follow-up message

Prepare a second user message and add it to the conversation:
const secondUserMessage = "What was my first question?";
conversation.push({
  role: "user",
  content: [{ text: secondUserMessage }],
});
Send the updated conversation to the model:
const secondResponse = await client.send(
  new ConverseCommand({ modelId, messages: conversation })
);
Print out the model's response:
const secondResponseText = secondResponse.output.message.content[0].text;
console.log(`Second response: ${secondResponseText}`);

Let's run the program

With the code complete, let's run it and see the AI engage in a multi-turn conversation!
Here's the full code for reference:
import {
  BedrockRuntimeClient,
  ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({ region: "us-east-1" });

const modelId = "anthropic.claude-3-haiku-20240307-v1:0";

const firstUserMessage = "What is the capital of Australia?";
const conversation = [
  {
    role: "user",
    content: [{ text: firstUserMessage }],
  },
];

const firstResponse = await client.send(
  new ConverseCommand({ modelId, messages: conversation }),
);

const firstResponseText = firstResponse.output.message.content[0].text;
console.log(`First response: ${firstResponseText}`);

conversation.push(firstResponse.output.message);

const secondUserMessage = "What was my first question?";
conversation.push({
  role: "user",
  content: [{ text: secondUserMessage }],
});

const secondResponse = await client.send(
  new ConverseCommand({ modelId, messages: conversation }),
);

const secondResponseText = secondResponse.output.message.content[0].text;
console.log(`Second response: ${secondResponseText}`);
To run it:
  1. Save the code in a file named bedrock_conversation.js
  2. In your terminal, navigate to the directory containing bedrock_conversation.js
  3. Run the following command:
node bedrock_conversation.js
If everything is set up correctly, you should see the model's responses printed in the console:
First response: The capital of Australia is Canberra.
Second response: Your first question was "What is the capital of Australia?".
If you encounter any errors, double-check that you have completed all the prerequisites: the AWS CLI configured with valid credentials, a current version of Node.js and npm installed, and access granted to the model you're using.
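For troubleshooting, you can also wrap the call in a try/catch block. The sketch below is illustrative: `converseSafely` is not part of the SDK, and the error name it checks (`AccessDeniedException`) is one you may see when model access hasn't been granted:

```javascript
// Illustrative sketch (not part of the SDK): wraps a Bedrock call with
// basic error reporting. The error name checked here is an example of
// what you may see when model access hasn't been granted.
async function converseSafely(send) {
  try {
    return await send();
  } catch (error) {
    if (error.name === "AccessDeniedException") {
      console.error("Model access not granted. Request access in the Bedrock console.");
    } else {
      console.error(`Request failed: ${error.name}: ${error.message}`);
    }
    return null;
  }
}
```

Usage would look like: `const response = await converseSafely(() => client.send(new ConverseCommand({ modelId, messages: conversation })));`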

Next steps

Congratulations on implementing a multi-turn conversation with Bedrock's Converse API!
Ready for more? Here are some ideas to keep exploring:
  • Write a chat loop that repeatedly prompts the user for input and prints the model's responses.
  • Experiment with different models and compare their responses. Here's the list of all models supporting the Converse API.
  • Challenge yourself to rewrite this program in another language. Here are examples in Python, Java, C#, and more.
  • Dive deeper into the Converse API in the Amazon Bedrock User Guide.
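As a starting point for the first idea above, here is a sketch of a chat loop. The function and parameter names (`chatTurn`, `chat`, `send`) are illustrative, and the Bedrock call is injected as a callback so the turn logic is easy to try out on its own:

```javascript
// Sketch of an interactive chat loop. The names below are illustrative
// and not part of the SDK; `send` is injected so the turn logic can be
// exercised without calling AWS.
import readline from "node:readline/promises";
import { stdin, stdout } from "node:process";

// Append a user message and the model's reply to the conversation,
// then return the reply text.
async function chatTurn(conversation, userText, send) {
  conversation.push({ role: "user", content: [{ text: userText }] });
  const response = await send(conversation);
  const assistantMessage = response.output.message;
  conversation.push(assistantMessage);
  return assistantMessage.content[0].text;
}

// Interactive loop: type a message and press Enter; an empty line exits.
async function chat(send) {
  const rl = readline.createInterface({ input: stdin, output: stdout });
  const conversation = [];
  for (;;) {
    const userText = await rl.question("You: ");
    if (!userText.trim()) break;
    const reply = await chatTurn(conversation, userText, send);
    console.log(`Assistant: ${reply}`);
  }
  rl.close();
}

// To run against Bedrock, pass a real send function, e.g.:
// await chat((messages) => client.send(new ConverseCommand({ modelId, messages })));
```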
In Part 3 of this series, we'll learn how to customize the AI's behavior with a system prompt and inference parameters.
I'd love to see what you build with Amazon Bedrock! Feel free to share your projects or ask questions in the comments.
Thanks for following along and happy building! 💻🤖
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
