Running LangChain.js Applications on AWS Lambda
Learn how to run LangChain.js apps powered by Amazon Bedrock on AWS Lambda using function URLs and response streaming.
“It is a mistake to think you can solve any major problems just with potatoes.” 🥔
― Douglas Adams, Life, the Universe and Everything
- Make sure these tools are installed and properly configured:
- Docker 🐋
- AWS SAM CLI 🐿️
- jq (optional)
- Request model access via Amazon Bedrock
💡 For more information on how to enable model access, please refer to the Amazon Bedrock User Guide (Set up > Model access)
👨‍💻 All code and documentation for this post is available on GitHub.
```shell
git clone https://github.com/JGalego/LambdaChain
cd LambdaChain
```
```
.
├── README.md
├── lambdachain
│   ├── Dockerfile
│   ├── index.mjs
│   └── package.json
├── lambdachain.png
├── samconfig.toml
└── template.yaml
```
Most of the action happens inside the `lambdachain` folder. The main point of interest is `index.mjs`, which contains the handler function:

```javascript
import util from 'util';
import stream from 'stream';

import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { HumanMessage } from "@langchain/core/messages";

const pipeline = util.promisify(stream.pipeline);

const model = new BedrockChat({
  model: process.env.MODEL_ID || "anthropic.claude-3-sonnet-20240229-v1:0",
  region: process.env.AWS_REGION || process.env.AWS_DEFAULT_REGION,
  modelKwargs: {
    temperature: parseFloat(process.env.TEMPERATURE) || 0.0
  }
});

export const handler = awslambda.streamifyResponse(async (event, responseStream, _context) => {
  const completionStream = await model.stream([
    new HumanMessage({ content: JSON.parse(event.body).message })
  ]);
  await pipeline(completionStream, responseStream);
});
```
Before testing the function, make sure your AWS credentials are configured. 💡 For more information on how to do this, please refer to the AWS Boto3 documentation (Developer Guide > Credentials).
```shell
# Option 1: (recommended) AWS CLI
aws configure

# Option 2: environment variables
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
export AWS_DEFAULT_REGION=...
```
You can use the `sam local invoke` command to test the application locally; just keep in mind that response streaming is not supported (yet!). Once you're ready, build and deploy the application:

```shell
# 🏗️ Build
sam build --use-container

# 🚀 Deploy
sam deploy --guided
```
Once the deployment finishes, retrieve the function URL from the stack outputs:

```shell
export FUNCTION_URL=`sam list stack-outputs --stack-name lambdachain --output json | jq -r '.[] | select(.OutputKey == "LambdaChainFunctionUrl") | .OutputValue'`
```
💡 Pass the `MODEL_ID` environment variable to the Lambda function to change the target model.

You can test the deployed function with `sam remote invoke`:

```shell
sam remote invoke --stack-name lambdachain \
    --event '{"body": "{\"message\": \"What is the answer to life, the Universe and everything?\"}"}'
```
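The doubly escaped quotes in the `--event` payload are easy to get wrong. The function URL event delivers the HTTP body as a *string*, which the handler passes to `JSON.parse`, so a hand-crafted test event has to JSON-encode the message payload twice. A quick sketch:

```javascript
// The actual request payload the handler expects after JSON.parse(event.body).
const payload = { message: "What is the answer to life, the Universe and everything?" };

// The Lambda event wraps that payload as a JSON *string* in its body field.
const event = { body: JSON.stringify(payload) };

// Serializing the whole event reproduces the escaped quotes seen in --event.
console.log(JSON.stringify(event));
```

Running this prints the same escaped structure used in the `sam remote invoke` command above.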
Alternatively, you can call the function URL directly with `curl` (requests must be signed with SigV4):

```shell
curl --no-buffer \
     --silent \
     --aws-sigv4 "aws:amz:$AWS_DEFAULT_REGION:lambda" \
     --user "$AWS_ACCESS_KEY_ID:$AWS_SECRET_ACCESS_KEY" \
     -H "x-amz-security-token: $AWS_SESSION_TOKEN" \
     -H "content-type: application/json" \
     -d '{"message": "What is the answer to life, the Universe and everything?"}' \
     $FUNCTION_URL
```
☝️ Pro Tip: Pipe the output through `jq -rj '.kwargs.content'` for cleaner output.
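The `jq` filter works because each streamed chunk is a serialized message chunk whose text lives under `kwargs.content`. The same extraction in Node might look like this (the sample chunk below is a hypothetical, simplified shape for illustration):

```javascript
// Hypothetical serialized chunk, trimmed down to the field jq extracts.
const chunk = '{"lc":1,"type":"constructor","kwargs":{"content":"42"}}';

// Equivalent of `jq -rj .kwargs.content` for a single chunk.
console.log(JSON.parse(chunk).kwargs.content); // → 42
```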
```
The answer is 42.

This is a reference to the famous joke from The Hitchhiker's Guide to the Galaxy by Douglas Adams.
In the story, scientists build an incredibly powerful computer called Deep Thought to calculate the
Answer to the Ultimate Question of Life, the Universe, and Everything.
After 7.5 million years of computing, Deep Thought provides the answer: 42.

Of course, 42 is not really the meaningful answer everyone was hoping for.
It's simply an absurd joke playing on the deep philosophical question by giving an unhelpful numerical answer.
The point is that the question itself is too vague and impossible to definitively answer in such a simplistic way.

So in pop culture, "42" has become a tongue-in-cheek way to provide a humorous non-answer answer to
the mysteries of existence and the universe.
It's an iconic bit of silliness from the brilliant comedic mind of Douglas Adams.
```
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.