A Java developer's guide to Bedrock's new Converse API

Learn hands-on how to use Amazon Bedrock's new Converse API to interact with generative AI models on Amazon Bedrock, using the AWS SDK for Java.

Dennis Traub
Amazon Employee
Published Jun 6, 2024
👨‍💻 Want to dive directly into the code? Here you go!

Introduction

Welcome, and thanks for joining me in this tutorial series, where we'll explore Amazon Bedrock's new Converse API, using Java.
This edition of the tutorial uses Java. I've also prepared a JavaScript edition, and have uploaded many more examples in Python, C#, etc.
With the Converse API, we've added two new actions to the Bedrock Runtime: Converse and ConverseStream, simplifying the interaction with all text-based generative AI models on Amazon Bedrock. The API provides a cohesive set of functionality through a common, strongly typed request format, no matter which foundation model you want to use.

What you will learn in this series

This series will guide you through the following features of the new Converse API:
  • In Part 1: Getting Started (this post), you will learn how to send your first request.
  • In Part 2: Conversational AI (upcoming), I will show you how to implement conversational turns.
  • In Part 3: Customizing AI Behavior (upcoming), we'll configure the model with a system prompt and additional inference parameters.
Future posts will cover extracting invocation metrics and metadata, sending and receiving model-specific parameters and fields, processing model responses in real-time, and more.
And now, without any further ado, let's dive into some actual code 💻

Step-by-step: Send your first prompt with the Converse API

Throughout the following steps, you will learn how to use the Amazon Bedrock Runtime client for Java to send a message to a foundation model and print the response.
Let's get started!

Prerequisites

Before you begin, ensure you have:
  • A Java Development Kit (JDK) version 17 or later and a build tool like Apache Maven or Gradle for managing dependencies and building your project.
  • Your AWS credentials set up. For more information, see Set up the AWS SDK for Java 2.x.
  • Access to the foundation models you want to use. For more information, see Model access.

Step 1 - Set up a new Java Project

Create a new Java project using Maven or Gradle and add the required dependencies to your build configuration file:
Note: Replace 2.25.67 in the examples with the latest version of the AWS SDK for Java.

Adding Dependencies with Maven

If you are using Maven, add the following dependencies to your pom.xml file:
<properties>
    <aws.java.sdk.version>2.25.67</aws.java.sdk.version>
</properties>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>bom</artifactId>
            <version>${aws.java.sdk.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>bedrockruntime</artifactId>
    </dependency>
    <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>sts</artifactId>
    </dependency>
</dependencies>

Adding Dependencies with Gradle

If you are using Gradle, add the following dependencies to your build.gradle file:
dependencies {
    implementation(platform("software.amazon.awssdk:bom:2.25.67"))
    implementation("software.amazon.awssdk:bedrockruntime")
    implementation("software.amazon.awssdk:sts")
}

Step 2 - Import and create an instance of the Bedrock Runtime client

To use the Bedrock Runtime SDK in your Java application, import the necessary classes and create a BedrockRuntimeClient instance.
  1. Open your Java file, e.g., src/main/java/com/example/App.java, in your IDE or text editor.
  2. Create an instance of the BedrockRuntimeClient using the builder pattern, with a credentials provider¹ and the region you want to use.²
import ... // You can find all required imports in the complete example below.

public class App {
    public static void main(String[] args) {
        BedrockRuntimeClient client = BedrockRuntimeClient.builder()
                .credentialsProvider(DefaultCredentialsProvider.create())
                .region(Region.US_EAST_1)
                .build();
    }
}
¹ This example uses the DefaultCredentialsProvider. Depending on your local setup, you may have to use a different credentials provider. Learn more about the available options at: Provide temporary credentials to the SDK.
² Double-check that the model you want to use is available in the region and you have requested access.
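For example, if you keep named profiles in your local AWS configuration, you could swap in a ProfileCredentialsProvider. This is just an illustrative variation of the snippet above; the profile name is a placeholder, not something defined in this tutorial:

// Illustrative alternative: load credentials from a named profile.
// Requires the import software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider.
// Replace "my-profile" with a profile name from your own AWS config.
BedrockRuntimeClient client = BedrockRuntimeClient.builder()
        .credentialsProvider(ProfileCredentialsProvider.create("my-profile"))
        .region(Region.US_EAST_1)
        .build();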

Step 3 - Prepare a message to send

Next, we'll define the text we want to send and embed it in a message to send to the model. The Message class is part of the Converse API, providing a common format to interact with all text-based models on Amazon Bedrock.
  1. Define the inputText, which can be any prompt you choose.
  2. Create the message using the Message.builder() method, embed the inputText in a ContentBlock, and set the role of the sender, in this case ConversationRole.USER.
String inputText = "Explain 'rubber duck debugging' in one line.";

Message message = Message.builder()
        .content(ContentBlock.fromText(inputText))
        .role(ConversationRole.USER)
        .build();

Step 4 - Prepare the request and send it to the API

Now that we have our message ready, it's time to prepare the request and send it to the API.
  1. Specify the modelId of the foundation model you want to use. In this example, we're using Anthropic Claude 3, but you can replace it with any other model supported by the Converse API.¹
  2. Call the converse() method on the Bedrock Runtime client, passing in a request builder lambda. In the lambda, set the modelId, the message, and any additional inference configuration options.²
  3. The converse method returns a ConverseResponse object containing the model's response.
String modelId = "anthropic.claude-3-haiku-20240307-v1:0";

ConverseResponse response = client.converse(request -> request
        .modelId(modelId)
        .messages(message)
        .inferenceConfig(config -> config
                .maxTokens(512)
                .temperature(0.5F)));
¹ You can find the list of models supporting the Converse API and a list of all model IDs in the documentation.
² Available inference configuration options are: maxTokens, temperature, topP, and stopSequences. You can find more details in the documentation of the InferenceConfiguration field.
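For illustration, here is how the same call could look with all four options set. The values below are arbitrary placeholders to show the builder methods, not tuned recommendations:

// Illustrative only - the values are placeholders, not recommendations.
ConverseResponse response = client.converse(request -> request
        .modelId(modelId)
        .messages(message)
        .inferenceConfig(config -> config
                .maxTokens(512)          // maximum number of tokens to generate
                .temperature(0.5F)       // randomness of the generated text
                .topP(0.9F)              // nucleus sampling threshold
                .stopSequences("###"))); // stop when this placeholder sequence appears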

Step 5 - Extract and print the model's response

After sending the request to the API, the converse method returns a ConverseResponse object containing the model's response. To access and print the generated text:
  1. Extract the response text from the ConverseResponse by accessing the first (and in this case, only) content block in the output message.
  2. Finally, print the response text to the console, allowing us to see the model's generated output.
String responseText = response.output().message().content().get(0).text();
System.out.println(responseText);

Step 6 - Run the program

🚀 Now let's see it in action! With all the code in place, it's time to run our program and see the results.
  1. Compile and run the Java application using your preferred IDE or command-line tools.
  2. If everything is set up correctly, you should see the model's response printed in the console, similar to this:
Rubber duck debugging is the process of explaining your code to
a rubber duck (or any inanimate object) to identify and fix bugs.
Congratulations! You have successfully sent your first request to the Amazon Bedrock Converse API using the AWS SDK for Java, and all with just a few lines of code!

Next steps

You can now experiment with different prompts, models, and inference configuration options to learn more about the Converse API.
Here are some ideas to keep exploring:
  • Swap out the model for another one and see how the responses differ. Here's the list of all models supporting the Converse API.
  • Challenge yourself to rewrite this program in another programming language. For inspiration, check out these examples in multiple languages.
  • Learn more about the Converse API in the Amazon Bedrock User Guide.
In the next part of this series, we'll dive deeper into the Converse API and learn how to implement conversational turns, allowing you to engage in multi-turn conversations with the AI.
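If you'd like a sneak peek, the core idea is to keep a list of messages, append the model's reply, and send the whole history back together with your follow-up. Here is a rough sketch of what that could look like; the follow-up prompt is just a placeholder, and Part 2 will cover this properly:

// Rough sketch of a conversational turn - see Part 2 for details.
// Requires the imports java.util.ArrayList and java.util.List.
List<Message> conversation = new ArrayList<>();
conversation.add(message);                     // the user's first message
conversation.add(response.output().message()); // the model's reply
conversation.add(Message.builder()             // a follow-up question (placeholder text)
        .content(ContentBlock.fromText("Can you give a concrete example?"))
        .role(ConversationRole.USER)
        .build());

ConverseResponse nextResponse = client.converse(request -> request
        .modelId(modelId)
        .messages(conversation)
        .inferenceConfig(config -> config
                .maxTokens(512)
                .temperature(0.5F)));

System.out.println(nextResponse.output().message().content().get(0).text());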
Thanks for joining me today, I hope you learned something new! See you soon 👋

The complete source code

Here's the complete source code. Feel free to copy, paste, and start building your own AI-enhanced app!
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;
import software.amazon.awssdk.services.bedrockruntime.model.ContentBlock;
import software.amazon.awssdk.services.bedrockruntime.model.ConversationRole;
import software.amazon.awssdk.services.bedrockruntime.model.ConverseResponse;
import software.amazon.awssdk.services.bedrockruntime.model.Message;

public class App {
    public static void main(String[] args) {
        // Create a Bedrock Runtime client in the AWS Region you want to use.
        // Replace the DefaultCredentialsProvider with your preferred credentials provider.
        BedrockRuntimeClient client = BedrockRuntimeClient.builder()
                .credentialsProvider(DefaultCredentialsProvider.create())
                .region(Region.US_EAST_1)
                .build();

        // Create the input text and embed it in a message object with the user role.
        String inputText = "Explain 'rubber duck debugging' in one line.";

        Message message = Message.builder()
                .content(ContentBlock.fromText(inputText))
                .role(ConversationRole.USER)
                .build();

        // Set the model ID, e.g., Claude 3 Haiku.
        String modelId = "anthropic.claude-3-haiku-20240307-v1:0";

        // Send the message with a basic inference configuration.
        ConverseResponse response = client.converse(request -> request
                .modelId(modelId)
                .messages(message)
                .inferenceConfig(config -> config
                        .maxTokens(512)
                        .temperature(0.5F)));

        // Retrieve and print the generated text from Bedrock's response object.
        String responseText = response.output().message().content().get(0).text();
        System.out.println(responseText);
    }
}
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
