Generative AI Serverless - Apply Guardrail, Bedrock Converse API, RAG - Chat with your document!

Use Amazon Bedrock's ApplyGuardrail feature while building generative AI applications with the Converse API! Apply guardrails to a chat-with-your-document solution for a credit card pre-qualification data file.

Published Jul 18, 2024
Last Modified Jul 19, 2024
In this article, I am going to show you how to build a serverless GenAI RAG solution that implements a document chat feature using the Amazon Bedrock Converse API and AWS Lambda. I will also use one of the newest features introduced in July 2024, ApplyGuardrail, so that we have control over the input prompt as well as the response returned to the calling app/consumer.
Guardrails are a much-needed feature of Amazon Bedrock that safeguard content flowing through a generative AI solution.
'Chat with your document', supported by Amazon Bedrock, is a form of RAG that allows you to have a contextual conversation and ask questions based on the data in a document, augmented with an LLM.
RAG, which stands for Retrieval Augmented Generation, is becoming increasingly popular in the world of Generative AI. It allows organizations to overcome the limitations of LLMs and utilize contextual data for their Generative AI solutions.
I will use the recently released Anthropic Claude Sonnet foundation model and invoke it via the Amazon Bedrock Converse API using Lambda and API Gateway.
There are many use cases where a generative AI chat-with-your-document function can help increase productivity. A few examples: technical support extracting information from a user manual to quickly resolve customer questions, HR answering questions based on policy documents, a developer looking up a specific function in technical documentation, or a call center team addressing customer inquiries quickly by chatting with product documentation.
In all these use cases, it is important to apply appropriate guardrails to protect sensitive information and to deny responses on certain topics that are not suited for the business case.
Let's look at our use case:
  • The use case is in the context of Pre-Qualification data for potential credit card customers.
  • There is a data file that contains the list of customers with Pre-Qualification data. This data contains screen id, customer name, qualification status, credit score and zip code.
  • Please note that this dummy data has been created using GenAI for the purpose of this workshop.
  • A Generative AI enabled API will provide the response to common prompts using this document as augmented source.
  • This solution will enable the customer support and marketing teams to get a quick view of which customers qualify for the card, which do not, and which cases require more information.
  • A guardrail needs to be implemented to ensure that credit score data is not shared with the customer or the calling app/consumer.
  • The solution needs to be API-based so that it can be invoked via different applications.
Here is the architecture diagram for our use case.
Architecture Diagram
Create a guardrail using AWS Console
  • Login to AWS Console
  • From Bedrock → Guardrails, click on Create Guardrail.
  • Provide a name and other details for the guardrail.
Guardrail
Add a denied topic to this guardrail. The pre-qualification data contains a credit score field; when a prompt asks for the value of a credit score, the guardrail should deny the request.
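The same denied topic can be created programmatically with the Bedrock control-plane CreateGuardrail API. This is a minimal sketch; the guardrail name, topic definition, and blocked messages below are my own illustrative choices, not the exact values used in the console walkthrough.

```python
# Illustrative denied-topic definition for the credit score use case.
DENIED_TOPIC = {
    "name": "CreditScore",
    "definition": "Requests for the value of a customer's credit score "
                  "from the pre-qualification data.",
    "examples": ["Please tell me the credit score for Jon Doe."],
    "type": "DENY",
}

def create_prequal_guardrail():
    import boto3  # imported lazily; this call requires AWS credentials
    bedrock = boto3.client("bedrock")
    resp = bedrock.create_guardrail(
        name="credit-card-prequal-guardrail",  # illustrative name
        topicPolicyConfig={"topicsConfig": [DENIED_TOPIC]},
        blockedInputMessaging="Sorry, I cannot answer that question.",
        blockedOutputsMessaging="Sorry, I cannot share that information.",
    )
    # The response includes the new guardrail id and an initial DRAFT version.
    return resp["guardrailId"], resp["version"]
```

The returned guardrail id and version are what the Lambda function will pass to ApplyGuardrail later.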
Create a SAM template
I will create a SAM template for the Lambda function that will contain the code to invoke the Bedrock Converse API with the required parameters and a prompt. ApplyGuardrail will be called on both the input and the output before the final response is returned to the consumer.
A Lambda function can be created without a SAM template; however, I prefer the Infrastructure as Code approach since it allows for easy recreation of cloud resources. Here is the SAM template for the Lambda function.
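A minimal sketch of such a template follows. The resource name, runtime, /chat path, and IAM statements are illustrative assumptions, not the exact template from this solution.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Chat with your document via Bedrock Converse + ApplyGuardrail

Parameters:
  DataBucketName:
    Type: String   # bucket holding the pre-qualification data file

Resources:
  ChatWithDocFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.12
      Timeout: 60
      MemorySize: 256
      Policies:
        - S3ReadPolicy:
            BucketName: !Ref DataBucketName
        - Statement:
            - Effect: Allow
              Action:
                - bedrock:InvokeModel      # Converse API calls
                - bedrock:ApplyGuardrail   # standalone guardrail checks
              Resource: '*'
      Events:
        ChatApi:
          Type: Api
          Properties:
            Path: /chat
            Method: post
```

The API event gives us the API Gateway endpoint the consumer apps will call.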
AWS SAM
Create a Lambda Function
The Lambda function serves as the core of this automated solution. It contains the code necessary to fulfill the business requirement: consuming the pre-qualification data file from an S3 bucket, applying the guardrail, and then invoking the Converse API to generate a response using the Anthropic Claude Sonnet foundation model. Now, let's look at the code behind it.
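A sketch of such a handler is below. The guardrail id, bucket, key, and model id are placeholders, and the flow (guard the input, attach the document to the Converse call, guard the output) follows the steps described above rather than reproducing the exact code from the video.

```python
import json

# Placeholders -- replace with your own values.
GUARDRAIL_ID = "your-guardrail-id"
GUARDRAIL_VERSION = "DRAFT"
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"
BUCKET = "your-data-bucket"
KEY = "prequalification.csv"

def extract_prompt(event):
    """Pull the user prompt out of an API Gateway proxy event body."""
    return json.loads(event.get("body") or "{}").get("prompt", "")

def intervened(bedrock_rt, text, source):
    """Run ApplyGuardrail on the input prompt or the model output."""
    resp = bedrock_rt.apply_guardrail(
        guardrailIdentifier=GUARDRAIL_ID,
        guardrailVersion=GUARDRAIL_VERSION,
        source=source,  # "INPUT" or "OUTPUT"
        content=[{"text": {"text": text}}],
    )
    return resp["action"] == "GUARDRAIL_INTERVENED"

def lambda_handler(event, context):
    import boto3  # imported here so the module loads without AWS credentials
    s3 = boto3.client("s3")
    bedrock_rt = boto3.client("bedrock-runtime")

    prompt = extract_prompt(event)
    if intervened(bedrock_rt, prompt, "INPUT"):
        return {"statusCode": 200,
                "body": json.dumps({"answer": "Request blocked by guardrail policy."})}

    # Attach the pre-qualification file as a document block (chat with your document).
    doc_bytes = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
    resp = bedrock_rt.converse(
        modelId=MODEL_ID,
        messages=[{
            "role": "user",
            "content": [
                {"document": {"format": "csv", "name": "prequal data",
                              "source": {"bytes": doc_bytes}}},
                {"text": prompt},
            ],
        }],
    )
    answer = resp["output"]["message"]["content"][0]["text"]
    if intervened(bedrock_rt, answer, "OUTPUT"):
        answer = "Response blocked by guardrail policy."
    return {"statusCode": 200, "body": json.dumps({"answer": answer})}
```

Calling ApplyGuardrail separately on input and output, instead of only passing guardrailConfig to Converse, is what lets us short-circuit before the LLM is invoked at all when the prompt violates the policy.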
AWS Lambda
Build function locally using AWS SAM
Next, build and validate the function using AWS SAM before deploying the Lambda function to the AWS cloud. A few of the SAM commands used are:
  • sam build
  • sam local invoke
  • sam deploy
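The build-and-deploy loop from the bullets above looks roughly like this; the function logical name and event file path are illustrative and should match your own template and project layout.

```shell
sam validate                          # check the template syntax
sam build                             # build the function and its dependencies
sam local invoke ChatWithDocFunction \
    --event events/event.json         # invoke locally with a sample API event
sam deploy --guided                   # first deploy; prompts for stack name, region, etc.
```

After the guided deploy, SAM prints the stack outputs, including the API Gateway endpoint used in the next step.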
Validate the GenAI Model response using a prompt
Prompt engineering is an essential component of any generative AI solution. It is both an art and a science, as crafting an effective prompt is crucial for obtaining the desired response from the foundation model. Often, it requires multiple attempts and adjustments to the prompt to achieve the desired outcome from the model.
Since I'm deploying the solution behind AWS API Gateway, I'll have an API endpoint after deployment. I plan to use Postman to pass the prompt in the request and review the response. Additionally, I can choose to post the response to an AWS S3 bucket for later review.
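The same request can be sent from the command line instead of Postman; here the API id, region, stage, and /chat path are placeholders to be replaced with your deployed endpoint.

```shell
curl -s -X POST \
  "https://<api-id>.execute-api.us-east-1.amazonaws.com/Prod/chat" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "How many customers are not qualified for the credit card?"}'
```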
Examples of guarded/un-guarded response returned by the API
Prompt: Please tell me the credit score for Jon Doe.
Response 1
Prompt: Can John Doe get the card? (guardrail not applied for this response)
Response 2
Same prompt, with ApplyGuardrail applied to the response (along with a modified system prompt):
Response 3
Prompt: How many customers are not qualified for the credit card?
Response 4
As you can see in the responses above, when the guardrail is applied, credit score data is not shown to the consumer. The input prompt is guarded as well, so the LLM will not be called for a response if the input prompt does not pass the guardrail policy.
With these steps, a serverless GenAI solution has been successfully completed that implements chat with your document using the Amazon Bedrock Converse API, Lambda, and API Gateway. Python/Boto3 were used to invoke the Bedrock API with the Anthropic Claude Sonnet model.
I also applied the guardrail to guard the contents per the configured guardrail policy.
As GenAI solutions keep improving, they will change how we work and bring real benefits to many industries. This workshop shows how powerful AI can be in solving real-world problems and creating new opportunities for innovation.
Thanks for reading!
Click here to get to the YouTube video for this solution.
π’’π’Ύπ“‡π’Ύπ“ˆπ’½ ℬ𝒽𝒢𝓉𝒾𝒢
𝘈𝘞𝘚 𝘊𝘦𝘳𝘡π˜ͺ𝘧π˜ͺ𝘦π˜₯ 𝘚𝘰𝘭𝘢𝘡π˜ͺ𝘰𝘯 𝘈𝘳𝘀𝘩π˜ͺ𝘡𝘦𝘀𝘡 & π˜‹π˜¦π˜·π˜¦π˜­π˜°π˜±π˜¦π˜³ 𝘈𝘴𝘴𝘰𝘀π˜ͺ𝘒𝘡𝘦
𝘊𝘭𝘰𝘢π˜₯ π˜›π˜¦π˜€π˜©π˜―π˜°π˜­π˜°π˜¨π˜Ί 𝘌𝘯𝘡𝘩𝘢𝘴π˜ͺ𝘒𝘴𝘡