Conversational AI: Add chatting capabilities to your app

In this tutorial, you'll learn how to implement conversational turns, providing AI models with a memory representing a conversation's history, using the Bedrock Converse API.

Dennis Traub
Amazon Employee
Published May 30, 2024
Last Modified May 31, 2024

Welcome 👋

Thanks for joining me in this tutorial series, where I'll walk you through Bedrock's new Converse API!
Today, I will show you how to implement conversational turns, providing the model with a memory of a conversation's history.

Recap: The Amazon Bedrock Converse API

The Amazon Bedrock Converse API adds two new actions to the Bedrock Runtime, Converse and ConverseStream, which simplify the interaction with all text-based generative AI models on Amazon Bedrock. It provides a cohesive set of functionality through a common, strongly typed request format, no matter which foundation model you want to use.
To learn more, check out the Amazon Bedrock User Guide, browse our growing collection of code examples, covering multiple models and programming languages, or jump directly into the AWS console and try it out yourself.
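To illustrate the idea of a common request format, here is a minimal sketch using plain objects (no SDK calls are made): the request shape stays identical across models, and only the model ID changes. The Titan model ID below is just an illustrative example.

```javascript
// Sketch: the unified Converse request shape, as plain objects.
// The same `messages` structure works for every supported model;
// only `modelId` changes (the model IDs below are examples).
const messages = [{ role: "user", content: [{ text: "Hello!" }] }];

const requestForClaude = {
  modelId: "anthropic.claude-3-haiku-20240307-v1:0",
  messages,
};

const requestForTitan = {
  modelId: "amazon.titan-text-express-v1",
  messages,
};

// Identical message payload, different target model:
console.log(requestForClaude.messages === requestForTitan.messages); // true
```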

Series overview and outlook

In future posts, I will also show how to extract invocation metrics and metadata from the model response, how to send and receive model-specific parameters and fields, how to retrieve, process, and print the model response in real time, and a lot more.
So stay tuned!

A quick note on programming languages

For the longest time, the default language for AI/ML has been Python, and unfortunately it's hard to find examples for the rest of us. This is why I am using JavaScript in this tutorial, and have also created additional examples in Java, C#, and other languages.
And now, without any further ado, let's dive into some actual code 💻

Add a conversation history and chat with the AI

We're using Anthropic Claude 3 Haiku today, but you can replace it with any other model that supports the unified Messages API. To find a specific model ID, check the most current list in the Amazon Bedrock User Guide.

Prerequisites

  • Install the latest stable version of Node.js.
  • Set up a shared configuration file with your credentials. For more information, see the AWS SDK for JavaScript Developer Guide.
  • Request access to the foundation models you want to use. For more information, see Model access.

Step 1 - Import and create an instance of the Bedrock Runtime client

To interact with the API, you can use the Bedrock Runtime client, provided by the AWS SDK.
  1. Create a new file, e.g., bedrock_chat.js, and open it in an IDE or text editor.
  2. Import the BedrockRuntimeClient and the ConverseCommand from the AWS SDK for JavaScript.
  3. Create an instance of the client and configure it with the AWS Region of your choice.
import {
  BedrockRuntimeClient,
  ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({ region: "us-east-1" });
Note: Please double-check that the model you want to use is available in your chosen region and that you have requested access to it.

Step 2 - Create the conversation history

Generative AI models are stateless and don't retain context across invocations, i.e., they don't "remember" anything you've sent before. To simulate an actual conversation, we have to maintain the message history in our application.
  1. Create a variable called firstMessage with a text representing an earlier prompt.
  2. In the next line, create a new variable called firstResponse with a response text representing the model's response to that first prompt.
  3. Define an array called conversation.
  4. Add both messages to the conversation and double-check their respective roles "user" and "assistant". This is how the model will understand that these are conversational turns.
const firstMessage = "Explain 'rubber duck debugging' in one line.";

const firstResponse =
  "Rubber duck debugging is the process of explaining a problem to " +
  "an inanimate object, like a rubber duck, to help identify and " +
  "resolve the issue.";

const conversation = [
  { role: "user", content: [{ text: firstMessage }] },
  { role: "assistant", content: [{ text: firstResponse }] },
];

Step 3 - Add a new message to the conversation

Now, with the history set up, you can simply append your next prompt to the end of the conversation.
  1. Define a new prompt.
  2. Add the prompt, along with the role "user", to a new message object.
  3. Add the new message object to the end of the conversation.
const newPrompt = "Okay. And does this actually work?";

const newMessage = { role: "user", content: [{ text: newPrompt }] };

conversation.push(newMessage);

Step 4 - Prepare the request and send it to the API

Now we'll prepare an invocation command, send it to the client, and wait for the response.
  1. Set the model ID.
  2. Create a new ConverseCommand with the model ID and the conversation.
  3. Send the command to the Bedrock Runtime and wait for the response.
const modelId = "anthropic.claude-3-haiku-20240307-v1:0";

const command = new ConverseCommand({
  modelId,
  messages: conversation,
});

const apiResponse = await client.send(command);
Note: You can find the list of models supporting the Converse API and a list of all model IDs in the documentation.

Step 5 - Extract and print the model's response

Now, we can extract the model's response text and print it to the console:
const responseText = apiResponse.output.message.content[0].text;
console.log(responseText);

Let's run the program

🚀 Now let's see our program in action! Open a terminal, run it using Node, and observe the response.
Here's what I got when running my example:
$ node bedrock_chat.js

Yes, rubber duck debugging can actually be an effective problem-solving technique.
The act of explaining the problem out loud, even to an inanimate object, can help
the programmer:

1. Clarify their own understanding of the problem.
2. Catch logical errors or syntax mistakes they may have overlooked.
3. Stimulate new ideas or approaches to solving the problem.

The rubber duck serves as a passive, non-judgmental listener that forces the
programmer to organize their thoughts and verbalize the problem clearly.

This simple act of "explaining to the duck" can often lead to the programmer
discovering the solution themselves, without needing additional help or resources.
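By the way, to keep the conversation going beyond this single exchange, you can push the model's reply back onto the history before the next turn. Here's a small sketch using a stand-in response object shaped like the Converse API's output (no actual API call is made, and the reply text is made up for illustration):

```javascript
// Stand-in for a real Converse API response (same shape as apiResponse above).
const apiResponse = {
  output: {
    message: { role: "assistant", content: [{ text: "Yes, it works!" }] },
  },
};

const conversation = [
  { role: "user", content: [{ text: "Okay. And does this actually work?" }] },
];

// The response already contains a complete message object with the
// "assistant" role, so it can be appended to the history as-is:
conversation.push(apiResponse.output.message);

console.log(conversation.length); // 2
console.log(conversation[1].role); // "assistant"
```

With that in place, the next user prompt can simply be pushed after the assistant's message, and the full history sent again on the next invocation.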

Next steps

You just got a first taste of Amazon Bedrock's powerful new Converse API. You learned how to send a conversation history to the API, simulating a memory for chat-based applications. All with just a few lines of code!
Ready for more? Stay tuned for the upcoming posts in this series, covering invocation metrics, model-specific parameters, real-time streaming, and more.
Thanks for joining me today, I hope you learned something new! See you soon 👋

The complete source code for this tutorial

Here's the complete source code of this tutorial. Feel free to copy, paste, and start building your own AI-enhanced app!
import {
  BedrockRuntimeClient,
  ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({ region: "us-east-1" });

const firstMessage = "Explain 'rubber duck debugging' in one line.";

const firstResponse =
  "Rubber duck debugging is the process of explaining a problem to " +
  "an inanimate object, like a rubber duck, to help identify and " +
  "resolve the issue.";

const conversation = [
  { role: "user", content: [{ text: firstMessage }] },
  { role: "assistant", content: [{ text: firstResponse }] },
];

const newPrompt = "Okay. And does this actually work?";

const newMessage = { role: "user", content: [{ text: newPrompt }] };

conversation.push(newMessage);

const modelId = "anthropic.claude-3-haiku-20240307-v1:0";

const command = new ConverseCommand({
  modelId,
  messages: conversation,
});

const apiResponse = await client.send(command);

const responseText = apiResponse.output.message.content[0].text;
console.log(responseText);
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
