Virtual assistant powered by Generative AI
How to implement a self-service voice/chat bot powered by Amazon Bedrock and a custom knowledge base.
Anatoly Klepikov
Amazon Employee
Published Oct 4, 2024
Overview
Amazon Bedrock is a generative AI service that can answer common customer inquiries using publicly available information, and business-specific inquiries using a custom knowledge base.
Amazon Connect offers several ways to integrate with Amazon Bedrock to provide customer self-service in the form of an open conversation, as an alternative to traditional NLP techniques that match customer intent against sample utterances. One option is the Amazon Lex QnAIntent – GenAI feature, which provides native integration between Amazon Lex and Amazon Bedrock; in this scenario the dialog with the customer remains within Amazon Lex. Another option is to query Amazon Bedrock and its knowledge base directly from an Amazon Connect Contact Flow; in this scenario the dialog with the customer is implemented in the Contact Flow, which calls Amazon Lex to capture the customer inquiry and AWS Lambda to pass that inquiry to Amazon Bedrock. Both approaches are perfectly valid and have their own advantages.
The first scenario is described in the Amazon Connect Administration Guide, section AMAZON.QnAIntent.
In this article we focus on the second scenario, where the Amazon Connect Contact Flow calls Amazon Lex to capture the customer inquiry and AWS Lambda to pass that inquiry to Amazon Bedrock. When Amazon Bedrock receives the inquiry, it uses — in our example — an Australian Family Law knowledge base stored in an Amazon S3 bucket to generate an answer, then passes that answer back to the Contact Flow as the Lambda response. The Contact Flow can implement an internal loop: after responding to the current customer inquiry, it calls Amazon Lex again to wait for the next one.
The key benefits of this solution are:
· High flexibility in how the logic for handling customer inquiries is implemented.
· Full control over how the Amazon Bedrock request is executed and its response is handled.
· The ability to use alternative services, both AWS and external, to handle customer inquiries in languages not currently supported by Amazon Lex.
Solution description
The following AWS Services are used:
· Amazon Connect to provide the communication channel to the customer via voice or chat.
· Amazon Lex to capture customer inquiry.
· AWS Lambda for integration with Amazon Bedrock.
· Amazon S3 for hosting the knowledge base; in our demo, Australian Family Law documents.
Below is the architecture diagram for possible implementation of the solution:
The customer calls in or initiates a chat (1), which launches a Contact Flow in Amazon Connect. This Contact Flow can capture the customer's phone number, email address, or other identity data, which can be used for customer identification and verification. In our demo, identification and verification are out of scope, but they should be mandatory in a commercial environment. As the first step, the Contact Flow launches Amazon Lex to capture the customer's question as free-form text (3). In our setup Amazon Lex simply captures the free-form input via the fallback intent, without using any slots. This free-form text is then returned to the Contact Flow (4) and passed to AWS Lambda (5), which makes an API request to Amazon Bedrock (6) to generate an answer using the knowledge base in Amazon S3 containing the Australian Family Law documents (7). The returned answer is played by the Contact Flow's Play prompt block, which internally uses the Amazon Polly service for text-to-speech conversion. After playing back the answer, the Contact Flow enters its internal loop, calling Amazon Lex again (9). If the customer wants to end the dialog, they can say the phrase "No more questions" or simply hang up.
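One integration detail when wiring the Lambda into the Contact Flow (step 5): Amazon Connect expects the Lambda function to return a flat JSON object of simple key/value pairs, which the flow then reads as external attributes. A minimal sketch — the attribute name `answer` is illustrative, not prescribed by AWS:

```python
def build_connect_response(answer_text: str) -> dict:
    """Shape the Bedrock answer for Amazon Connect: a flat JSON object
    of simple key/value pairs (nested structures are not supported).
    The flow's Play prompt block can then reference $.External.answer."""
    return {"answer": answer_text}
```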
Using prompt engineering, the Amazon Bedrock assistant can be adjusted to accommodate specific business needs, such as responding in a particular language. For the voice scenario this requires Amazon Polly to support that language. For the chat scenario the solution can be simplified, e.g. by excluding Amazon Lex, which is used here only to capture the customer question as described in the paragraph above.
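For example, when querying the knowledge base through Bedrock's RetrieveAndGenerate API, the generation prompt can be overridden to force a response language. A hedged sketch — the template wording is an assumption to adapt, while `$search_results$` is the placeholder Bedrock substitutes with the retrieved passages:

```python
# Illustrative prompt template for Bedrock's RetrieveAndGenerate API.
# "$search_results$" is the placeholder Amazon Bedrock replaces with the
# passages retrieved from the knowledge base; the surrounding wording is
# an assumption you should adapt to your business needs.
FRENCH_ANSWER_TEMPLATE = (
    "You are a helpful assistant answering questions about Australian "
    "Family Law. Use only the following search results:\n"
    "$search_results$\n"
    "Answer the user's question in French. If the answer is not in the "
    "search results, say that you do not know."
)

def generation_config(template: str) -> dict:
    """Fragment of retrieveAndGenerateConfiguration that overrides the
    default generation prompt (keys per the bedrock-agent-runtime API)."""
    return {"promptTemplate": {"textPromptTemplate": template}}
```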
Below is an example of how the Amazon Connect Contact Flow can be implemented. In this flow we call Amazon Lex to capture the question, check that it is not the key phrase "No more questions" indicating the end of the dialog, pass the question to AWS Lambda to call the Amazon Bedrock API, play the response back to the customer, and return to Amazon Lex to wait for the next customer question.
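The control flow of those blocks can be sketched in plain Python (the flow itself is built in the Connect visual editor; the callables here merely stand in for the Lex, Lambda, and Play prompt blocks):

```python
END_PHRASE = "no more questions"  # key phrase that ends the dialog

def contact_flow_loop(get_customer_input, ask_bedrock, play_prompt):
    """Illustrative sketch of the Contact Flow loop: capture a question
    with Amazon Lex, answer it via Lambda + Bedrock, play the answer,
    and repeat until the customer says the end phrase."""
    while True:
        inquiry = get_customer_input()          # Amazon Lex capture
        if inquiry.strip().lower() == END_PHRASE:
            break                               # customer ends the dialog
        play_prompt(ask_bedrock(inquiry))       # Lambda -> Bedrock, then Play prompt
```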
Below is Amazon Bedrock configuration example using custom knowledge base stored in Amazon S3 bucket:
In this example, a knowledge base with ID YHELS4MYJY and an Anthropic large language model are used. You need to make sure that your Python code uses a boto3 version that supports the Bedrock API. This can be achieved by creating a Lambda Layer containing the required boto3 module and attaching that Layer to your Lambda function. For more details on how to create and use Lambda Layers, please refer to How to create AWS Lambda Layer.
In your environment the knowledge base ID will be different; be sure to use your own.
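Since the knowledge-base client (`bedrock-agent-runtime`) only exists in relatively recent boto3 releases, it can be worth failing fast if the runtime's bundled boto3 is too old. A small version-gate helper — the minimum version shown is an assumption; check the boto3 changelog for the exact release:

```python
def version_at_least(installed: str, minimum: str) -> bool:
    """Numeric comparison of dotted version strings, so that e.g.
    '1.34.10' correctly compares greater than '1.9.99'."""
    as_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)

# In the Lambda handler one might guard (hypothetical minimum shown):
#   import boto3
#   if not version_at_least(boto3.__version__, "1.33.2"):
#       raise RuntimeError("boto3 too old for bedrock-agent-runtime; "
#                          "attach a Lambda Layer with a newer boto3")
```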
Below is Lambda code in Python showing how the customer inquiry can be used to find an answer in the knowledge base powered by Amazon Bedrock (this code is an example for demonstration purposes only; the reader is expected to modify it to accommodate their needs):
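A minimal sketch of such a Lambda, using the bedrock-agent-runtime RetrieveAndGenerate API. The model ARN and the parameter/attribute names `inquiry` and `answer` are illustrative assumptions to adapt to your own flow:

```python
import os

# Illustrative configuration -- replace with your own values.
KNOWLEDGE_BASE_ID = os.environ.get("KNOWLEDGE_BASE_ID", "YHELS4MYJY")
MODEL_ARN = os.environ.get(
    "MODEL_ARN",
    # Hypothetical Anthropic model ARN; use a model enabled in your account.
    "arn:aws:bedrock:ap-southeast-2::foundation-model/anthropic.claude-v2",
)

def extract_inquiry(event: dict) -> str:
    """Amazon Connect passes Contact Flow parameters under
    Details.Parameters; "inquiry" is the name chosen in the flow."""
    return event["Details"]["Parameters"]["inquiry"]

def ask_knowledge_base(inquiry: str) -> str:
    """Query the Bedrock knowledge base with RetrieveAndGenerate."""
    import boto3  # imported here so the pure helper above needs no AWS deps
    client = boto3.client("bedrock-agent-runtime")
    response = client.retrieve_and_generate(
        input={"text": inquiry},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KNOWLEDGE_BASE_ID,
                "modelArn": MODEL_ARN,
            },
        },
    )
    return response["output"]["text"]

def lambda_handler(event, context):
    """Entry point: Connect -> Lambda -> Bedrock -> flat response."""
    answer = ask_knowledge_base(extract_inquiry(event))
    # Amazon Connect requires a flat key/value JSON response.
    return {"answer": answer}
```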
Summary
In this post we described how to implement a virtual chat/voice bot powered by Amazon Bedrock, using Australian Family Law as a custom knowledge base.
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.