Build a Streamlit app with LangChain and Amazon Bedrock

Use ElastiCache Serverless Redis for chat history, deploy to EKS and manage permissions with EKS Pod Identity

Abhishek Gupta
Amazon Employee
Published Jan 12, 2024
Last Modified Jan 22, 2024
It's one thing to build powerful machine learning models and another to make them useful. A big part of that is building applications that expose their features to end users. Popular examples include ChatGPT, Midjourney, etc.
Streamlit is an open-source Python library that makes it easy to build web applications for machine learning and data science. It has a rich set of APIs for visual components, including several chat elements, making it quite convenient to build conversational agents or chatbots, especially when combined with LLMs (Large Language Models).
And that's the example for this blog post as well - a Streamlit-based chatbot, deployed to a Kubernetes cluster on Amazon EKS. But that's not all!
We will use Streamlit with LangChain, a framework for developing applications powered by language models. The nice thing about LangChain is that it supports many platforms and LLMs, including Amazon Bedrock (which our application will use).
A key part of chat applications is the ability to refer to historical conversation(s) - at least within a certain time-frame (window). In LangChain, this is referred to as Memory. Just like LLMs, you can plug in different systems to work as the memory component of a LangChain application. This includes Redis, which is a great choice for this use case since it's a high-performance in-memory database with flexible data structures. Redis is already a preferred choice for real-time applications (including chat), thanks to features like Pub/Sub and WebSocket support. This application will use Amazon ElastiCache Serverless for Redis, an option that simplifies cache management and scales instantly. This was announced at re:Invent 2023, so let's explore while it's still fresh!
To be honest, the application could be deployed on other compute options such as Amazon ECS, but since it needs to invoke Amazon Bedrock, I figured it's a good opportunity to also cover how to use EKS Pod Identity (also announced at re:Invent 2023!!).
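With EKS Pod Identity, you associate an IAM role (one that allows `bedrock:InvokeModel`) with the app's Kubernetes service account. A sketch with hypothetical cluster, account, and role names:

```shell
# hypothetical names - replace with your cluster, namespace, service account and role
aws eks create-pod-identity-association \
  --cluster-name demo-eks-cluster \
  --namespace default \
  --service-account streamlit-chat-sa \
  --role-arn arn:aws:iam::123456789012:role/streamlit-bedrock-role
```

Note that this requires the EKS Pod Identity Agent add-on to be installed on the cluster.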
GitHub repository for the app - https://github.com/build-on-aws/streamlit-langchain-chatbot-bedrock-redis-memory
Here is a simplified, high-level diagram:
[Diagram: high-level architecture]
Let's go!!

Basic setup

Push the Docker image to ECR

Clone the GitHub repository:
Create an ECR repository:
Create the Docker image and push it to ECR:
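The steps above follow the standard ECR flow. A sketch with hypothetical account id, region, and repository name - the repository's own instructions take precedence:

```shell
# hypothetical values - replace with your own
ACCOUNT_ID=123456789012
REGION=us-east-1
REPO=streamlit-chat

aws ecr create-repository --repository-name $REPO --region $REGION

# authenticate Docker to ECR, then build, tag and push
aws ecr get-login-password --region $REGION | \
  docker login --username AWS --password-stdin $ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com
docker build -t $REPO .
docker tag $REPO:latest $ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$REPO:latest
docker push $ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$REPO:latest
```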

Deploy Streamlit chatbot to EKS

Update the app.yaml file:
  • Enter the ECR Docker image info
  • In the Redis connection string, enter the ElastiCache username and password along with the endpoint.
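For orientation, the relevant parts of app.yaml would look roughly like this - the image URI, service account, and environment variable name below are placeholders, so keep whatever the repository's manifest actually uses:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: streamlit-chat
spec:
  replicas: 1
  selector:
    matchLabels:
      app: streamlit-chat
  template:
    metadata:
      labels:
        app: streamlit-chat
    spec:
      serviceAccountName: streamlit-chat-sa    # mapped to an IAM role via EKS Pod Identity
      containers:
        - name: streamlit-chat
          image: 123456789012.dkr.ecr.us-east-1.amazonaws.com/streamlit-chat:latest
          ports:
            - containerPort: 8501              # Streamlit's default port
          env:
            - name: REDIS_URL                  # hypothetical name for the connection string
              value: rediss://<username>:<password>@<elasticache-endpoint>:6379
```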
Deploy the application:
To check logs: kubectl logs -f -l=app=streamlit-chat
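Assuming the app.yaml from the repository, the deployment boils down to:

```shell
kubectl apply -f app.yaml

# check the rollout, then tail the application logs
kubectl get pods -l app=streamlit-chat
kubectl logs -f -l app=streamlit-chat
```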

Start a conversation!

To access the application:
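If the app is exposed via a ClusterIP Service, one way in is port-forwarding - the Service name here is hypothetical (check `kubectl get svc` for the actual name, and adjust the ports to match):

```shell
kubectl port-forward service/streamlit-chat-service 8080:80
```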
Navigate to http://localhost:8080 using your browser and start chatting! The application uses the Anthropic Claude model on Amazon Bedrock as the LLM and the ElastiCache Serverless instance to persist the chat messages exchanged during a particular session.
[Screenshot: Streamlit chatbot]

Behind the scenes in ElastiCache Redis

To better understand what's going on, you can use redis-cli to access the ElastiCache Redis instance from EC2 (or Cloud9) and introspect the data structures used by LangChain to store chat history.
Don't run keys * on a production Redis instance - this is just for demonstration purposes.
You should see a key similar to this - "message_store:d5f8c546-71cd-4c26-bafb-73af13a764a5" (the name will differ in your case).
Check its type: type message_store:d5f8c546-71cd-4c26-bafb-73af13a764a5 - you will notice that it's a Redis List.
To check List contents, use the LRANGE command:
You should see a similar output:
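The whole redis-cli session above can be recapped as follows - endpoint, credentials, and the key name are placeholders (yours will differ), and ElastiCache Serverless requires TLS:

```shell
redis-cli -h <elasticache-endpoint> -p 6379 --tls --user <username> --askpass

# at the redis-cli prompt:
KEYS *                                                      # demo only - avoid on production
TYPE message_store:d5f8c546-71cd-4c26-bafb-73af13a764a5     # -> list
LRANGE message_store:d5f8c546-71cd-4c26-bafb-73af13a764a5 0 -1
```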
Basically, the Redis memory component for LangChain persists the messages as a List and passes its contents as additional context with every message.
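Conceptually, the List behaves like the stdlib sketch below. This is not LangChain's actual implementation and the JSON layout is approximate - it just illustrates the prepend-then-reverse pattern behind the `message_store:<session_id>` key:

```python
import json

# toy stand-in for the Redis List "message_store:<session_id>"
store: list[str] = []

def add_message(role: str, content: str) -> None:
    # each chat message is serialized to JSON and LPUSHed (prepended),
    # so the newest message sits at index 0
    store.insert(0, json.dumps({"type": role, "data": {"content": content}}))

def history() -> list[str]:
    # LRANGE key 0 -1 returns newest-first; reverse to get chat order
    return [json.loads(m)["data"]["content"] for m in reversed(store)]

add_message("human", "Hi there!")
add_message("ai", "Hello! How can I help?")
print(history())  # ['Hi there!', 'Hello! How can I help?']
```

On every turn, the memory component replays this list (oldest to newest) as additional context for the LLM.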

Conclusion

To be completely honest, I am not a Python developer (I mostly use Go, Java, or sometimes Rust), but I found Streamlit relatively easy to start with, except for some of the session-related nuances.
I figured out that the entire Streamlit app is re-executed for each interaction (this was a little unexpected coming from a backend dev background). That's when I moved the chat ID (a unique session ID for each conversation) into the Streamlit session state, and things worked.
This ID is also used as part of the name of the Redis List that stores the conversation (message_store:<session_id>) - each Redis List is mapped to a Streamlit session. I also found the Streamlit component-based approach to be quite intuitive, and it's pretty extensive as well.
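The rerun behavior can be sketched with plain Python - here `session_state` is a stand-in for Streamlit's `st.session_state`, and the function name is hypothetical:

```python
import uuid

session_state = {}  # stands in for st.session_state, which survives reruns

def run_app(user_message: str) -> str:
    """One 'rerun' of the whole Streamlit script (hypothetical sketch)."""
    # without session state, a fresh chat id would be minted on every rerun,
    # scattering one conversation across many Redis Lists
    if "chat_id" not in session_state:
        session_state["chat_id"] = str(uuid.uuid4())
    # the chat id doubles as the suffix of the Redis List name
    return f"message_store:{session_state['chat_id']}"
```

Because the id is created only once per session, every rerun maps back to the same `message_store:<session_id>` List.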
I was wondering if there are similar solutions in Go 🤔 If you know of something, do let me know.
Happy building!
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
