Running LangChain.js Applications on AWS Lambda
Learn how to run LangChain.js apps powered by Amazon Bedrock on AWS Lambda using function URLs and response streaming.
“It is a mistake to think you can solve any major problems just with potatoes.” 🥔
― Douglas Adams, Life, the Universe and Everything

- Make sure these tools are installed and properly configured:
  - Docker 🐋
  - AWS SAM CLI 🐿️
  - jq (optional)
- Request model access via Amazon Bedrock (a quick sanity check is sketched after this list)

💡 For more information on how to enable model access, please refer to the Amazon Bedrock User Guide (Set up > Model access).
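As a quick sanity check, you can confirm the tooling is in place and list the Bedrock foundation models offered in your region (the region below is an assumption; use the one where you requested access):

```bash
# Verify local tooling
docker --version
sam --version
jq --version   # optional

# List the Bedrock foundation models offered in your account/region
aws bedrock list-foundation-models --region us-east-1 \
  --query 'modelSummaries[].modelId' --output text
```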
👨‍💻 All code and documentation for this post is available on GitHub.

The application code is in the `lambdachain` folder. The main point of interest is `index.mjs`, which contains the handler function.

💡 For more information on how to configure AWS credentials, please refer to the AWS Boto3 documentation (Developer Guide > Credentials).
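To give a rough idea of what the handler does, here is a minimal sketch of a streaming handler built with LangChain.js and Bedrock. The import path, default model ID, and the runtime-provided `awslambda.streamifyResponse` wrapper are assumptions; the actual `index.mjs` in the repository may differ.

```javascript
// Minimal sketch (assumed APIs): BedrockChat from @langchain/community and the
// Lambda Node.js runtime's response-streaming wrapper.
import { BedrockChat } from "@langchain/community/chat_models/bedrock";

const model = new BedrockChat({
  model: process.env.MODEL_ID ?? "anthropic.claude-v2", // override via MODEL_ID
  region: process.env.AWS_REGION,
  streaming: true,
});

// awslambda.streamifyResponse is a global provided by the Lambda Node.js
// runtime when response streaming is enabled (e.g. on a function URL).
export const handler = awslambda.streamifyResponse(
  async (event, responseStream) => {
    const prompt = event.body ?? "Tell me a joke about potatoes";

    // Stream model chunks straight into the HTTP response
    for await (const chunk of await model.stream(prompt)) {
      responseStream.write(chunk.content);
    }
    responseStream.end();
  }
);
```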
You can use the `sam local invoke` command to test the application locally; just keep in mind that response streaming is not supported (yet!). Pass the `MODEL_ID` environment variable to the Lambda function to change the target model, as shown in the example below.

☝️ Pro Tip: Pipe the output through `jq -rj .kwargs.content` for cleaner output.
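For example, a local test run might look like this (the function logical ID `LambdaChainFunction` and the `env.json` file name are placeholders; match them to your SAM template):

```bash
# Override MODEL_ID for the local run
# (LambdaChainFunction is a placeholder for your function's logical ID)
cat > env.json <<'EOF'
{ "LambdaChainFunction": { "MODEL_ID": "anthropic.claude-v2" } }
EOF

# Invoke the function locally and strip the response down to the message content
sam local invoke LambdaChainFunction --env-vars env.json | jq -rj .kwargs.content
```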
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.