Vector Store Memory

intermediate
Memory Types · Last updated: 2025-01-15

What is Vector Store Memory?


Vector Store Memory is a LangChain memory implementation that uses vector databases and semantic search to store and retrieve conversation history or agent memories based on relevance rather than recency. Unlike buffer-based memories that simply return recent messages, vector store memory embeds all stored interactions, then retrieves the most semantically similar memories when new input arrives. This enables surfacing relevant context from any point in the conversation history, not just recent exchanges.


The implementation stores each interaction (or observation) as text in a vector database along with its embedding. When new input arrives, it is embedded and used to query the vector store for the k most semantically similar past interactions. These relevant memories are then included in the prompt context, giving the agent pertinent historical information regardless of when it occurred. This enables agents to maintain a much longer effective memory span than buffer-based approaches.
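
As a rough illustration, the sketch below wires up LangChain's VectorStoreRetrieverMemory with an in-memory FAISS index. Import paths, constructor signatures, and the embedding model (OpenAIEmbeddings and a 1536-dimensional index are assumed here) vary across LangChain versions, so treat this as a sketch of the pattern rather than a definitive implementation.

```python
import faiss
from langchain.docstore import InMemoryDocstore
from langchain.embeddings import OpenAIEmbeddings
from langchain.memory import VectorStoreRetrieverMemory
from langchain.vectorstores import FAISS

# Empty FAISS index sized to the embedding dimension
# (1536 for OpenAI's text-embedding-ada-002, assumed here).
embeddings = OpenAIEmbeddings()
index = faiss.IndexFlatL2(1536)
vectorstore = FAISS(embeddings.embed_query, index, InMemoryDocstore({}), {})

# Retrieve the k most semantically similar past interactions per query.
retriever = vectorstore.as_retriever(search_kwargs={"k": 3})
memory = VectorStoreRetrieverMemory(retriever=retriever, memory_key="history")

# Each saved exchange is embedded and written to the vector store as text.
memory.save_context({"input": "My favorite sport is hockey"},
                    {"output": "Noted, hockey it is."})
memory.save_context({"input": "I work as a data engineer"},
                    {"output": "Interesting, tell me more."})

# New input is embedded and matched against stored memories by semantic
# similarity, not recency.
print(memory.load_memory_variables(
    {"input": "What sport should I watch this weekend?"})["history"])
```

Because retrieval is driven by the query embedding, the hockey exchange surfaces for the sports question even if many unrelated turns were stored in between.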


Vector Store Memory is particularly valuable for conversations involving multiple topics where relevant context might be non-consecutive, or for agents that need to recall specific information from earlier in long interactions. However, it has tradeoffs: purely similarity-based retrieval may miss important recent context, semantic search might retrieve topically similar but contextually inappropriate memories, and the approach requires vector database infrastructure. Many implementations combine vector store memory with other memory types, using buffers for recent context and vector stores for historical retrieval.
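
One hedged way to get that hybrid behavior in classic LangChain is CombinedMemory, pairing a ConversationBufferMemory for recent turns with the vector store memory built above. The memory_key names (recent_lines, relevant_history) are illustrative choices, not required values, and the sketch reuses the retriever from the previous example.

```python
from langchain.memory import (CombinedMemory, ConversationBufferMemory,
                              VectorStoreRetrieverMemory)

# Recent turns kept verbatim, plus semantically retrieved older memories.
recent = ConversationBufferMemory(memory_key="recent_lines", input_key="input")
relevant = VectorStoreRetrieverMemory(retriever=retriever,
                                      memory_key="relevant_history",
                                      input_key="input")
memory = CombinedMemory(memories=[recent, relevant])

# A prompt template would then reference both {recent_lines} and
# {relevant_history}, so the model sees recency and relevance together.
```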


Related Terms