
Customizing AI Behavior: System prompts and inference parameters with Java and Bedrock's Converse API
In this tutorial, you'll learn how to configure a generative AI model with a system prompt and additional inference parameters, using the Bedrock Converse API and the AWS SDK for Java.
- In Part 1: Getting Started, you learned how to send your first request.
- In Part 2: Conversational AI, you learned how to implement conversational turns.
- In Part 3: Customizing AI Behavior (this post), we'll configure the model with a system prompt and additional inference parameters.
To follow along, you'll need:
- An AWS account
- The AWS CLI installed and configured with your credentials
- A Java Development Kit (JDK) version 17 or later and a build tool like Apache Maven installed
- Requested access to the model you want to use
Add the Bedrock Runtime dependency to your `pom.xml` file, and replace `aws.sdk.version` with the latest version of the AWS SDK for Java.
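As a rough sketch, the `pom.xml` entry could look like the following. This assumes the SDK version is managed through an `aws.sdk.version` Maven property; the `bedrockruntime` artifact provides the client for the Converse API.

```xml
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>bedrockruntime</artifactId>
    <!-- Replace the aws.sdk.version property with the latest AWS SDK for Java version -->
    <version>${aws.sdk.version}</version>
</dependency>
```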
The program adds your message with the USER role and sends it to the model by calling the converse() method, this time including a system prompt and additional inference parameters in the request.
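Here is a minimal sketch of what such a program might look like. The region, model ID, system prompt text, and parameter values are placeholders for illustration; substitute your own.

```java
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.core.exception.SdkClientException;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;
import software.amazon.awssdk.services.bedrockruntime.model.ContentBlock;
import software.amazon.awssdk.services.bedrockruntime.model.ConversationRole;
import software.amazon.awssdk.services.bedrockruntime.model.ConverseResponse;
import software.amazon.awssdk.services.bedrockruntime.model.Message;
import software.amazon.awssdk.services.bedrockruntime.model.SystemContentBlock;

public class BedrockCustomized {

    public static void main(String[] args) {

        // Create a Bedrock Runtime client using the default credential chain.
        // Region and model ID are examples; adjust them to your own setup.
        var client = BedrockRuntimeClient.builder()
                .credentialsProvider(DefaultCredentialsProvider.create())
                .region(Region.US_EAST_1)
                .build();

        var modelId = "anthropic.claude-3-haiku-20240307-v1:0";

        // A system prompt steers the model's overall behavior for the request.
        var systemPrompt = SystemContentBlock.fromText(
                "You are a friendly assistant who always answers in rhyming couplets.");

        // The user's message is added with the USER role.
        var userMessage = Message.builder()
                .role(ConversationRole.USER)
                .content(ContentBlock.fromText("Explain what a 'hello world' program does."))
                .build();

        try {
            // Call the Converse API with the system prompt and inference parameters.
            ConverseResponse response = client.converse(request -> request
                    .modelId(modelId)
                    .system(systemPrompt)
                    .messages(userMessage)
                    .inferenceConfig(config -> config
                            .maxTokens(512)      // upper bound on generated tokens
                            .temperature(0.7F)   // higher values produce more varied output
                            .topP(0.9F)));       // nucleus sampling threshold

            // Print the model's reply.
            var responseText = response.output().message().content().get(0).text();
            System.out.println(responseText);

        } catch (SdkClientException e) {
            System.err.printf("ERROR: Can't invoke '%s'. Reason: %s%n", modelId, e.getMessage());
            throw new RuntimeException(e);
        }
    }
}
```

The system prompt applies to the whole request, while the user message carries the actual question. Lowering the temperature makes the responses more deterministic; raising it makes them more varied.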
- Save the code in a file named `BedrockCustomized.java`.
- Compile and run the Java application using your preferred IDE or command-line tools.
If you run into any issues, make sure you have:
- Configured your AWS credentials
- Requested access to the model you are using
- Installed the required dependencies
Here are a few ways to keep exploring:
- Experiment with different models and compare their responses. Here's the list of all models supporting the Converse API.
- Challenge yourself to rewrite this program in another language. Here are examples in Python, JavaScript, C#, and more.
- Dive deeper into the Converse API in the Amazon Bedrock User Guide.
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.