Customizing AI Behavior: System prompts and inference parameters in Bedrock's Converse API
In this tutorial, you'll learn how to configure a generative AI model with a system prompt and additional inference parameters, using the Bedrock Converse API and the AWS SDK for JavaScript.
- In Part 1: Getting Started, you learned how to send your first request.
- In Part 2: Conversational AI, you learned how to implement conversational turns.
- In Part 3: Customizing AI Behavior (this post), we'll configure the model with a system prompt and additional inference parameters.
To follow this tutorial, you'll need:
- An AWS account
- The AWS CLI installed and configured with your credentials
- The latest stable version of Node.js and npm installed
- Requested access to the model you want to use
First, install the AWS SDK's Bedrock Runtime client:
npm install @aws-sdk/client-bedrock-runtime
Then, create a new file and import the client and the command you'll use:
import {
BedrockRuntimeClient,
ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";
Create an instance of the `BedrockRuntimeClient`, specifying the AWS region where the model is available:
const client = new BedrockRuntimeClient({ region: "us-east-1" });
Next, define the ID of the model you want to use:
const modelId = "anthropic.claude-3-haiku-20240307-v1:0";
Then, define the user's message and use it to start a new conversation:
const userMessage = "What is 'rubber duck debugging'?";
const conversation = [
{
role: "user",
content: [{ text: userMessage }]
}
];
Now, define a system prompt to instruct the model how it should behave:
const systemPrompt = [{ text: "You must always respond in rhymes." }];
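Note that the `system` field takes a list of content blocks, so you can keep separate instructions in separate blocks if that helps you organize longer prompts. A small sketch (the prompt texts and the variable name are just illustrative):

```javascript
// Sketch: the system prompt is a list of content blocks, so multiple
// instructions can be kept as separate blocks. Texts are illustrative.
const layeredSystemPrompt = [
  { text: "You must always respond in rhymes." },
  { text: "Keep your answers short." },
];

console.log(layeredSystemPrompt.length); // two independent instruction blocks
```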
Additionally, set inference parameters to control the length and creativity of the response:
const additionalParameters = {
maxTokens: 100,
temperature: 0.5
};
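Besides `maxTokens` and `temperature`, the Converse API's `inferenceConfig` also accepts `topP` (nucleus sampling) and `stopSequences`. A sketch with all four fields; the values here are illustrative, not recommendations:

```javascript
// Sketch: the full set of inferenceConfig fields accepted by the
// Converse API. Values are illustrative only.
const extendedParameters = {
  maxTokens: 100,       // cap on the number of tokens generated
  temperature: 0.5,     // lower values make output more deterministic
  topP: 0.9,            // sample only from the top 90% probability mass
  stopSequences: ["END"], // stop generating if this sequence appears
};
```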
Send the conversation, the system prompt, and the inference parameters to the model using the `ConverseCommand`:
const response = await client.send(
new ConverseCommand({
modelId,
messages: conversation,
system: systemPrompt,
inferenceConfig: additionalParameters
})
);
Finally, extract and print the model's response text:
const responseText = response.output.message.content[0].text;
console.log(responseText);
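The response contains more than just the text: it also reports token usage and a `stopReason`. Since we capped the output at 100 tokens, the reply may be cut off, which you can detect. A sketch using a mock object with the same shape the API returns (the values are made up):

```javascript
// Sketch: inspect token usage and the stop reason on a Converse response.
// `mockResponse` mimics the shape of the real API response; values are made up.
const mockResponse = {
  output: { message: { role: "assistant", content: [{ text: "..." }] } },
  stopReason: "max_tokens", // generation hit the maxTokens limit
  usage: { inputTokens: 21, outputTokens: 100, totalTokens: 121 },
};

const { inputTokens, outputTokens } = mockResponse.usage;
console.log(`Tokens used: ${inputTokens} in, ${outputTokens} out`);

if (mockResponse.stopReason === "max_tokens") {
  console.log("The response was truncated; consider raising maxTokens.");
}
```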
Here is the complete program:
import {
BedrockRuntimeClient,
ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";
const client = new BedrockRuntimeClient({ region: "us-east-1" });
const modelId = "anthropic.claude-3-haiku-20240307-v1:0";
const userMessage = "What is 'rubber duck debugging'?";
const conversation = [
{
role: "user",
content: [{ text: userMessage }]
}
];
const systemPrompt = [{ text: "You must always respond in rhymes." }];
const additionalParameters = {
maxTokens: 100,
temperature: 0.5
};
const response = await client.send(
new ConverseCommand({
modelId,
messages: conversation,
system: systemPrompt,
inferenceConfig: additionalParameters
})
);
const responseText = response.output.message.content[0].text;
console.log(responseText);
To run the program, follow these steps:
- Save the code in a file named `bedrock_customized.js`
- In your terminal, navigate to the directory containing `bedrock_customized.js`
- Run the following command:
node bedrock_customized.js
If everything is set up correctly, you should see a response similar to this:
Rubber duck debugging, a clever way,
To solve problems without delay.
You talk to a duck, real or toy,
And the solution you will enjoy.
By explaining the issue out loud,
Your mind becomes less clouded.
The duck listens, never judges,
And the answer often just nudges.
It's a technique that's quite profound,
When you're stuck, it helps you rebound.
So next time you're in a bind,
Grab a duck and let your thoughts unwind.
If you receive an error, verify that you have:
- Configured your AWS credentials
- Requested access to the model you are using
- Installed the required dependencies
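When something does go wrong, the call to `client.send` throws an error whose `name` identifies the cause. Here's a minimal sketch of mapping common Bedrock runtime error names to actionable hints; the error names are real exceptions thrown by the Bedrock runtime, but the hint messages are my own:

```javascript
// Sketch: map common Bedrock runtime error names to actionable hints.
// The error names are real Bedrock runtime exceptions; hint text is mine.
function explainBedrockError(err) {
  switch (err.name) {
    case "AccessDeniedException":
      return "Check that you have requested access to this model in the Bedrock console.";
    case "ValidationException":
      return "Check the model ID and the shape of the request.";
    case "ThrottlingException":
      return "Too many requests; retry with exponential backoff.";
    default:
      return `Unexpected error: ${err.message}`;
  }
}

// Usage around the call (sketch):
// try {
//   const response = await client.send(new ConverseCommand({ ... }));
// } catch (err) {
//   console.error(explainBedrockError(err));
// }
```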
- Experiment with different models and compare their responses. The Amazon Bedrock User Guide lists all models that support the Converse API.
- Challenge yourself to rewrite this program in another language. The Bedrock documentation includes examples in Python, Java, C#, and more.
- Dive deeper into the Converse API in the Amazon Bedrock User Guide.
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.