Overview
LangMem is LangChain's official library for adding long-term memory to LangGraph agents. It provides memory extraction, storage, and retrieval capabilities designed to integrate seamlessly with the LangChain ecosystem.
Key Features
- **Memory Extraction**: Automatically extracts memorable information from conversations
- **Semantic Memory**: Store and query facts and preferences
- **LangGraph Integration**: Native integration with LangGraph agents
- **Flexible Storage**: Supports various vector store backends
- **Memory Namespaces**: Organize memories by user, session, or custom scopes (see the sketch below)
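Semantic memory and namespaces work together: memories are stored under a namespace tuple and retrieved with embedding-based search scoped to that namespace. The minimal sketch below uses LangGraph's InMemoryStore, which LangMem builds on; the embedding model string, user IDs, and stored values are illustrative assumptions, not prescribed by LangMem.

```python
from langgraph.store.memory import InMemoryStore

# In-memory store with semantic search; the embedding model is illustrative.
store = InMemoryStore(
    index={"dims": 1536, "embed": "openai:text-embedding-3-small"}
)

# Namespace memories per user so retrieval stays scoped to that user.
store.put(("memories", "user-123"), "ui-pref", {"text": "Prefers dark mode and short answers"})
store.put(("memories", "user-456"), "ui-pref", {"text": "Prefers detailed, step-by-step answers"})

# Semantic search only sees user-123's namespace.
for item in store.search(("memories", "user-123"), query="How should replies be formatted?"):
    print(item.value["text"])
```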
When to Use LangMem
LangMem is ideal for:
- Existing LangChain/LangGraph projects needing memory
- Teams already invested in the LangChain ecosystem
- Applications requiring tight integration with LangGraph workflows
- Building agents that learn from conversations

Pros
- Official LangChain support
- Seamless LangGraph integration
- Flexible and extensible
- Good documentation
- Active development

Cons
- Requires LangChain/LangGraph knowledge
- Relatively new library
- Less standalone than other options
- Tied to the LangChain ecosystem

Getting Started
```python
from langgraph.prebuilt import create_react_agent
from langgraph.store.memory import InMemoryStore
from langmem import create_manage_memory_tool, create_search_memory_tool

# Create a store with semantic search (embedding model is illustrative)
store = InMemoryStore(
    index={"dims": 1536, "embed": "openai:text-embedding-3-small"}
)

# Give the agent tools to save and search memories in a shared namespace
agent = create_react_agent(
    "anthropic:claude-3-5-sonnet-latest",
    tools=[
        create_manage_memory_tool(namespace=("memories",)),
        create_search_memory_tool(namespace=("memories",)),
    ],
    store=store,
)
```
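With the tools wired in, the agent decides when to save or look up memories during a normal invocation. A hypothetical exchange (the messages are made up for illustration) might look like this:

```python
# First conversation: the agent can call the manage-memory tool to save this.
agent.invoke(
    {"messages": [{"role": "user", "content": "Remember that I prefer dark mode."}]}
)

# Later conversation: the agent can call the search-memory tool to recall it.
response = agent.invoke(
    {"messages": [{"role": "user", "content": "What are my UI preferences?"}]}
)
print(response["messages"][-1].content)
```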
Architecture
LangMem uses a modular architecture:
- **Memory Store**: Handles storage and retrieval
- **Memory Extractors**: Pull memorable info from conversations (sketched below)
- **Memory Injectors**: Add relevant memories to prompts
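To make the extractor role concrete, here is a minimal sketch using LangMem's create_memory_manager; the UserPreference schema, the instructions, and the model string are illustrative assumptions rather than fixed parts of the API.

```python
from pydantic import BaseModel
from langmem import create_memory_manager

class UserPreference(BaseModel):
    """A durable user preference worth remembering."""
    preference: str

# An extractor that pulls structured memories out of a conversation.
manager = create_memory_manager(
    "anthropic:claude-3-5-sonnet-latest",  # illustrative model string
    schemas=[UserPreference],
    instructions="Extract durable user preferences from the conversation.",
)

memories = manager.invoke({"messages": [
    {"role": "user", "content": "Please always answer in bullet points."},
]})
print(memories)  # extracted memories, each carrying a UserPreference instance
```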
Pricing
**Open Source**: Free, Apache 2.0 license