
core[docs]: Docs & example for runnable history #3527

Merged
merged 16 commits on Dec 6, 2023
@@ -0,0 +1,6 @@
import CodeBlock from "@theme/CodeBlock";
import Example from "@examples/guides/expression_language/runnable_history.ts";

# Runnables With History

<CodeBlock language="typescript">{Example}</CodeBlock>
90 changes: 90 additions & 0 deletions examples/src/guides/expression_language/runnable_history.ts
@@ -0,0 +1,90 @@
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ChatMessageHistory } from "langchain/memory";
import { ChatPromptTemplate } from "langchain/prompts";
import {
  RunnableConfig,
  RunnableSequence,
  RunnableWithMessageHistory,
} from "langchain/runnables";
import {
  BaseListChatMessageHistory,
  BaseMessage,
  HumanMessage,
} from "langchain/schema";
import { StringOutputParser } from "langchain/schema/output_parser";

// Define your session history store.
// This is where you will store your chat history, keyed by sessionId.
async function getListSessionHistory(): Promise<
  (sessionId: string) => Promise<BaseListChatMessageHistory>
> {
  const chatHistoryStore: { [key: string]: BaseListChatMessageHistory } = {};

  async function getSessionHistory(
    sessionId: string
  ): Promise<BaseListChatMessageHistory> {
    if (!(sessionId in chatHistoryStore)) {
      chatHistoryStore[sessionId] = new ChatMessageHistory();
    }
    return chatHistoryStore[sessionId];
  }

  return getSessionHistory;
}

// Instantiate your model and prompt.
const model = new ChatOpenAI({});
const prompt = ChatPromptTemplate.fromMessages([
  ["ai", "You are a helpful assistant."],
  ["human", "{question}"],
]);

// Create a simple runnable which passes a question to the AI.
const runnable = RunnableSequence.from([
  {
    question: (messages: Array<BaseMessage>) =>
      messages.map((m) => `${m._getType()}: ${m.content}`).join("\n"),
  },
  prompt,
  model,
  new StringOutputParser(),
]);

// Create your `RunnableWithMessageHistory` object, passing in the
// runnable created above.
const withHistory = new RunnableWithMessageHistory({
  runnable,
  config: {},
  getMessageHistory: await getListSessionHistory(),
});

// Create your `configurable` object. This is where you pass in the
// `sessionId` which is used to identify chat sessions in your message store.
const config: RunnableConfig = { configurable: { sessionId: "1" } };

// Pass your question in as a HumanMessage instance, since the
// runnable above prefixes each message with its type.
let output = await withHistory.invoke(
  [new HumanMessage("Hello there, I'm Archibald!")],
  config
);
console.log("output 1:", output);
/**
* output 1: AI: Hello Archibald! How can I assist you today?
*/

output = await withHistory.invoke(
  [new HumanMessage("What's my name?")],
  config
);
console.log("output 2:", output);
/**
* output 2: AI: Your name is Archibald, as you mentioned earlier. Is there anything else I can help you with?
*/

/**
 * You can see the LangSmith traces here:
 * output 1: https://smith.langchain.com/public/f8baefdb-4dd4-4e58-abb3-bbd91da2b543/r
 * output 2: https://smith.langchain.com/public/df49265e-b1db-4f43-a47d-f362310bd01f/r
 */
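The per-session store is the core pattern in this example: one history object per `sessionId`, created lazily on first lookup. As a standalone illustration, it can be sketched without any LangChain dependencies; the names below (`InMemoryHistory`, `makeSessionStore`) are hypothetical stand-ins for `ChatMessageHistory` and `getListSessionHistory`, not library APIs.

```typescript
// Plain object standing in for a chat message history class.
type StoredMessage = { role: string; content: string };

class InMemoryHistory {
  messages: StoredMessage[] = [];

  add(role: string, content: string): void {
    this.messages.push({ role, content });
  }
}

// Returns a lookup function that lazily creates one history per sessionId,
// mirroring what getListSessionHistory does in the example above.
function makeSessionStore(): (sessionId: string) => InMemoryHistory {
  const store: { [key: string]: InMemoryHistory } = {};
  return (sessionId: string) => {
    if (!(sessionId in store)) {
      store[sessionId] = new InMemoryHistory();
    }
    return store[sessionId];
  };
}

const getHistory = makeSessionStore();
getHistory("1").add("human", "Hello there, I'm Archibald!");
getHistory("1").add("ai", "Hello Archibald!");

// The same sessionId always resolves to the same history object,
// while a new sessionId starts empty.
console.log(getHistory("1").messages.length); // 2
console.log(getHistory("2").messages.length); // 0
```

Because the closure keeps the store private, callers can only reach histories through the lookup function, which is what lets `RunnableWithMessageHistory` route each invocation's messages to the right session.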