Usage with LangChain
This guide demonstrates how to integrate the zkStash SDK with LangChain to provide long-term memory for your agents.
We cover two primary integration patterns:
- Hot Path: Using MCP to give agents explicit tools to manage memories.
- Background: Using a custom middleware to automatically handle memory storage and retrieval within the agent loop.
Prerequisites
Ensure you have the necessary dependencies installed:
```shell
pnpm add @zkstash/sdk langchain @langchain/anthropic zod
```

Defining Memory Schemas
First, we define the memory schema that we will use in this demo.
```typescript
import { z } from "zod";
import { fromPrivateKey } from "@zkstash/sdk/rest";

// Initialize the zkStash client
const client = await fromPrivateKey(process.env.PRIVATE_KEY!);

// Create a memory schema
const preferencesSchema = z.object({
  color: z.string(),
  food: z.string(),
});

await client.registerSchema({
  name: "user_preferences",
  description: "Use to remember the user's preferences.",
  uniqueOn: ["kind"], // Auto-supersede: keeps only the latest preferences
  schema: preferencesSchema,
});
```

Hot Path
In this pattern, the agent is “aware” of its memory capabilities. It accesses zkStash through tools exposed via the Model Context Protocol (MCP). This allows the agent to decide when to search for memories or save new information.
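Conceptually, the agent sees memory operations as ordinary tools it can choose to call. The sketch below illustrates that shape only; the tool names and schemas here are hypothetical, not zkStash's actual MCP tools.

```typescript
// Illustrative shape of memory operations exposed as agent tools.
// Names and behavior are hypothetical, for intuition only.
type Tool = {
  name: string;
  description: string;
  invoke: (args: Record<string, string>) => Promise<string>;
};

const memoryToolsSketch: Tool[] = [
  {
    name: "search_memories",
    description: "Search long-term memory for relevant facts.",
    invoke: async ({ query }) => `results for: ${query}`,
  },
  {
    name: "create_memory",
    description: "Persist a new fact to long-term memory.",
    invoke: async ({ content }) => `saved: ${content}`,
  },
];
```

Because the model reads each tool's description, it decides for itself when a user message warrants a search or a save.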
1. Setup MCP Client
For more details on LangChain’s MCP integration, please see the official documentation.
First, create a zkStash MCP client using the MCP entrypoint.

```typescript
import { fromPrivateKey } from "@zkstash/sdk/mcp";

// Initialize the MCP client
const mcpClient = await fromPrivateKey(process.env.PRIVATE_KEY!, {
  agentId: "my-agent",
  threadId: "conversation-001",
});
```

2. Expose Tools to the Agent
```typescript
import { createAgent } from "langchain";

// 1. Get the memory tools from the MCP client
const memoryTools = await mcpClient.listTools();

// 2. Create the agent with the memory tools attached
const agent = createAgent({
  model: "claude-sonnet-4-5",
  tools: memoryTools,
});

// 3. Run the agent
const response = await agent.invoke({
  messages: [{
    role: "user",
    content: "My favorite color is green, and I love pizza.",
  }],
});
```

Background
In this pattern, memory management happens outside the agent loop. The agent doesn’t need to decide when to save or search; instead, a middleware handles both on every interaction. This is useful for seamless, “always-on” memory.
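Before looking at the real middleware, here is a plain-TypeScript sketch of the control flow this pattern implies. It is not LangChain's middleware API; the in-memory store and naive keyword retrieval are illustrative stand-ins.

```typescript
// Conceptual sketch of the background pattern: retrieve memories
// before the agent runs, save the completed turn after it finishes.
type Turn = { user: string; assistant: string };

function withBackgroundMemory(
  run: (input: string, context: string[]) => string,
  store: Turn[]
) {
  return (input: string): string => {
    // 1. Retrieve: naive keyword match over stored turns (illustrative only)
    const context = store
      .filter((t) => t.user.split(" ").some((w) => input.includes(w)))
      .map((t) => `${t.user} -> ${t.assistant}`);

    // 2. Run the agent with the retrieved context injected
    const output = run(input, context);

    // 3. Save: persist the interaction automatically
    store.push({ user: input, assistant: output });
    return output;
  };
}

// A stub "agent" that just reports how much context it received
const store: Turn[] = [];
const agent = withBackgroundMemory(
  (input, context) => `(${context.length} memories) reply to: ${input}`,
  store
);

agent("my favorite color is green");
// The turn is saved automatically; a later related query sees it as context.
```

The agent function itself never mentions memory; retrieval and storage wrap around it, which is exactly what the middleware below does with LangChain's hooks.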
Implementation
For more details on creating custom middleware in LangChain, please see the official documentation.
We will create a middleware using LangChain’s `createMiddleware` to wrap the agent execution. This middleware intercepts the run to inject context and save results.
```typescript
import {
  createMiddleware,
  AIMessage,
  HumanMessage,
  SystemMessage,
} from "langchain";
import { fromPrivateKey } from "@zkstash/sdk/rest";

// Initialize SDK
const client = await fromPrivateKey(process.env.PRIVATE_KEY!);

// Create middleware
export const createMemoryMiddleware = () => {
  return createMiddleware({
    name: "MemoryMiddleware",

    // Before the agent starts (once per invocation)
    beforeAgent: async (state) => {
      const { agentId, threadId, messages } = state;
      const lastMessage = messages[messages.length - 1];

      // Only search if the last message is from a user
      if (lastMessage instanceof HumanMessage) {
        console.log(`Searching memories for: "${lastMessage.content}"`);

        const memories = await client.searchMemories({
          query: lastMessage.content as string,
          filters: {
            agentId,
            threadId,
            kind: "user_preferences",
          },
        });

        if (memories.length > 0) {
          // Prepend a SystemMessage carrying the retrieved context
          const contextMsg = new SystemMessage(
            `Relevant Memories:\n${memories.map((m) => m.content).join("\n")}`
          );
          return {
            messages: [contextMsg, ...state.messages],
          };
        }
      }
      return;
    },

    // After the agent completes (once per invocation)
    afterAgent: async (state) => {
      const { messages, agentId, threadId } = state;

      // Get the last interaction
      const lastUser = messages.findLast((m) => m instanceof HumanMessage);
      const lastAI = messages.findLast((m) => m instanceof AIMessage);

      if (lastUser && lastAI) {
        await client.createMemory({
          agentId,
          threadId,
          schemas: ["user_preferences"],
          conversation: [
            { role: "user", content: lastUser.content as string },
            { role: "assistant", content: lastAI.content as string },
          ],
        });
      }
      return;
    },
  });
};
```

Usage with Agent
Attach the middleware to your agent.

```typescript
import { createAgent, HumanMessage } from "langchain";

const agent = createAgent({
  model: "claude-sonnet-4-5",
  tools: [ /* ... */ ],
  middleware: [createMemoryMiddleware()],
});

const result = await agent.invoke({
  messages: [new HumanMessage("My favorite color is green and I love pizza.")],
  agentId: "my-agent",
  threadId: "conversation-001",
});
```