Integrate Amazon Bedrock Prompt Management in LangChain applications
This post demonstrates how to enable prompt versioning and management using LangChain and Amazon Bedrock Prompt Management.
Managing, versioning, and sharing prompts across teams is a common need in generative AI applications. As a result, many prompt management tools have emerged to address these needs, and on July 10, 2024, AWS announced the preview launch of Amazon Bedrock Prompt Management.
Amazon Bedrock Prompt Management simplifies the creation, evaluation, versioning, and sharing of prompts to help developers and prompt engineers get the best responses from foundation models for their use cases.
Here is an example of how to create a Prompt using the AWS SDK for Python (Boto3).
Note that the Prompt variant stores the model (modelId
) as well as the model parameters (inferenceConfiguration
). This is extremely helpful because you can iterate on the entire LLM configuration, as well as the prompt template, with each new Prompt version.
First, retrieve the Prompt in the application using the get_prompt SDK call.
Next, parse prompt_response
to extract the template text and model configuration that the application needs. When building the LangChain prompt template, set the template_format
field to jinja2
. In Bedrock Prompt Management, you use double curly braces to include variables (as in {{variable}}
). This is the same template convention used by Jinja, which is one of the template formats supported by LangChain.
In summary:
- Create and manage the prompt outside the application.
- Retrieve the prompt in the application using Prompt ID and Prompt Version.
- Use the prompt template in LangChain by configuring the Jinja prompt template formatting.
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.