The Memory MCP server gives your AI assistant a persistent knowledge graph that survives across chat sessions. It stores entities (people, projects, concepts) and their relationships in a local JSON file, so your AI remembers context from previous conversations. Configure it with one line in your MCP config — no API keys or databases needed — and your AI builds up a growing understanding of your projects, preferences, and decisions over time.
Give Your AI Persistent Memory Across Conversations
One of the biggest limitations of AI assistants is that each conversation starts from scratch. The @modelcontextprotocol/server-memory MCP server solves this by providing a knowledge graph that persists across sessions. Your AI can store entities (like people, projects, technologies, and preferences) along with their relationships and observations. This data is saved to a local JSON file and loaded in every new conversation. Over time, your AI builds a rich understanding of your work context — remembering project architectures, team member roles, coding preferences, past decisions, and ongoing tasks.
Prerequisites
- Node.js 18 or later installed on your machine
- Claude Desktop, Cursor, or another MCP-compatible AI host
- A directory where the memory file will be stored
Step-by-step guide
Add the Memory MCP server to your configuration
Open your MCP host's configuration file and add a memory entry. The server requires no API keys or external services — it stores everything in a local JSON file. The command uses npx with the -y flag for automatic installation. By default, the memory file is stored in the server's working directory.
```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memory"
      ]
    }
  }
}
```
Expected result: Your configuration file contains the Memory MCP server entry.
Restart your AI host and verify the connection
Fully quit and reopen Claude Desktop, Cursor, or VS Code. The Memory server provides tools for creating entities, adding relationships, storing observations, and querying the knowledge graph. These should appear in your available tools list after restart.
Expected result: The Memory MCP server shows as connected with knowledge graph tools available.
Store your first entities and relationships
Tell the AI about your project, team, or preferences. The AI will use the Memory server to create entities and store observations. Be explicit about what you want remembered — the AI does not automatically save everything from the conversation.
Example prompts:

- "Remember that I'm working on a project called ShopFlow, an e-commerce platform built with Next.js and Supabase"
- "Remember that our team has 3 developers: Alice (frontend lead), Bob (backend), and Carol (DevOps)"
- "Remember that I prefer TypeScript strict mode, Tailwind CSS, and shadcn/ui for all my projects"

Expected result: The AI creates entities in the knowledge graph and confirms what it has stored.
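Under the hood, a prompt like the first one typically results in the AI calling the server's create_entities tool. A sketch of what those tool arguments might look like, using the entity structure described later in this guide (the exact argument shape may vary between server versions):

```json
{
  "entities": [
    {
      "name": "ShopFlow",
      "entityType": "project",
      "observations": [
        "E-commerce platform",
        "Built with Next.js and Supabase"
      ]
    }
  ]
}
```

You never write this JSON yourself; the AI constructs it from your natural-language prompt. Seeing the shape helps explain why specific entity names and types lead to better recall.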
Query the memory in a new conversation
Close the current conversation and start a new one. Ask the AI about something you stored previously. Even though this is a brand-new chat session, the AI will query the knowledge graph and recall the information, demonstrating that memory persists across sessions.
Example prompts (in a new session):

- "What do you know about my ShopFlow project?"
- "Who are the members of my development team?"
- "What are my coding preferences?"

Expected result: The AI recalls entities and observations from previous sessions using the persistent knowledge graph.
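Recall prompts like these typically map to the search_nodes tool (find entities matching a query) or open_nodes (fetch specific entities by name). Sketches of the arguments the AI might send, based on the tool list later in this guide (exact shapes may vary by server version):

```json
{ "query": "ShopFlow" }
```

```json
{ "names": ["ShopFlow", "Alice"] }
```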
Add relationships between entities
The knowledge graph supports relationships between entities, creating a connected web of information. Tell the AI how different entities relate to each other — projects to team members, technologies to projects, or any other connections. This helps the AI understand context more deeply.
Example prompts:

- "Remember that Alice is the frontend lead on ShopFlow and she specializes in React performance optimization"
- "Remember that ShopFlow depends on Stripe for payments and SendGrid for emails"
- "Remember that we decided to use Supabase RLS instead of a custom auth middleware, and that this was decided on March 15"

Expected result: The AI stores relationships between entities, building a connected knowledge graph.
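Relationship prompts typically translate into create_relations tool calls. A sketch of the arguments, using names from the examples above and the relation structure shown later in this guide (the exact shape may vary by server version):

```json
{
  "relations": [
    { "from": "Alice", "to": "ShopFlow", "relationType": "is frontend lead on" },
    { "from": "ShopFlow", "to": "Stripe", "relationType": "depends on" }
  ]
}
```

Relations are directional (from, to), so "Alice works on ShopFlow" and "ShopFlow depends on Stripe" are stored as distinct edges in the graph.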
Manage and clean up the memory graph
Over time, some stored information may become outdated. You can ask the AI to update observations, remove entities, or clear specific relationships. You can also ask it to summarize everything it knows to audit the knowledge graph.
Example prompts:

- "Show me everything you remember about my projects and team"
- "Update: we migrated ShopFlow from Supabase to Firebase last week"
- "Forget everything about the old ProjectX; we abandoned that project"

Expected result: The AI updates, removes, or summarizes entities in the knowledge graph as requested.
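A "forget ProjectX" request typically maps to the delete_entities tool, which also removes the entity's relations. A sketch of the arguments, using the example project name above (exact shape may vary by server version):

```json
{
  "entityNames": ["ProjectX"]
}
```

Updates like the Firebase migration are usually handled with a mix of delete_observations (drop the outdated fact) and add_observations (record the new one) rather than deleting the whole entity.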
Complete working example
```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memory"
      ]
    }
  }
}
```

VS Code variant (.vscode/mcp.json):

```json
{
  "servers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    }
  }
}
```

No API keys or environment variables required. Memory is stored in a local JSON file.

Available tools:

- create_entities: Create new entities with observations
- create_relations: Link entities together
- add_observations: Add new facts to existing entities
- delete_entities: Remove entities from the graph
- delete_observations: Remove specific observations
- delete_relations: Remove relationships
- read_graph: View the entire knowledge graph
- search_nodes: Find entities by query
- open_nodes: Get details of specific entities

Knowledge graph structure:

- Entities: { name, entityType, observations[] }
- Relations: { from, to, relationType }

Common mistakes when using the Memory MCP server for persistent context
Mistake: Expecting the AI to automatically remember everything from every conversation
How to avoid: The AI only stores information when it explicitly uses the Memory server tools. Tell it what to remember: 'Remember that...' or 'Store this for future reference.' Without explicit instructions, conversation content is not saved to the knowledge graph.
Mistake: Storing overly vague or duplicate entities
How to avoid: Be specific with entity names and types. Instead of storing 'my project', use 'ShopFlow e-commerce platform'. Check what is already stored before adding new entities to avoid duplicates.
Mistake: Not realizing the memory file can grow large over time
How to avoid: The knowledge graph is stored as a JSON file that grows with each addition. Periodically ask the AI to show what it remembers and clean up outdated or irrelevant entries to keep the graph manageable.
Mistake: Storing sensitive information like passwords or API keys in memory
How to avoid: The memory file is a plain JSON file on your filesystem. Never ask the AI to remember passwords, tokens, or other secrets. Anyone with access to the file can read all stored data.
Best practices
- Explicitly tell the AI what to remember rather than expecting automatic memory
- Use clear entity names and types for better recall (e.g., 'ShopFlow project' not just 'my project')
- Periodically review the knowledge graph by asking the AI to summarize what it knows
- Clean up outdated information to prevent stale context from affecting new conversations
- Store architectural decisions with dates so you can track when choices were made
- Never store sensitive credentials or personal data in the memory graph
- Combine with the filesystem MCP server so the AI can cross-reference memory with actual code
- Use entity relationships to model your project structure — modules, dependencies, team roles
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I want to set up the Memory MCP server so my AI assistant can remember context across conversations. Explain how the knowledge graph works, how to configure it in Claude Desktop, and give me examples of what kinds of information are most useful to store.
Show me everything you remember about my projects and team, then update the ShopFlow project to note that we added a new payment provider (Lemonsqueezy) this week and Carol is handling the integration.
Frequently asked questions
Where is the memory data stored?
The knowledge graph is stored as a JSON file on your local filesystem. By default it lives in the server's working directory, which depends on where npx runs the package. The server supports a MEMORY_FILE_PATH environment variable for specifying a custom location; check the server's documentation for the current configuration options.
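A configuration sketch that pins the memory file to a known location via the MEMORY_FILE_PATH environment variable (the path shown is a placeholder; substitute your own):

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"],
      "env": {
        "MEMORY_FILE_PATH": "/Users/you/.mcp/memory.json"
      }
    }
  }
}
```

Pinning the path makes backups easier and lets multiple AI hosts share the same graph.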
Is the memory shared across different AI hosts?
If Claude Desktop and Cursor are both configured to run the Memory MCP server against the same memory file, they share the same knowledge graph. If their configurations point to different files (or rely on different default locations), each host keeps its own separate memory.
Can I back up or export the knowledge graph?
Yes. Since the memory is stored as a JSON file, you can simply copy the file to back it up. You can also ask the AI to read the entire graph and display it, or use the filesystem MCP server to read the JSON file directly.
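Because the file is plain text, you can also audit or export it with a small script, independent of any AI host. A minimal sketch, assuming the on-disk format is JSON Lines where each line is a record tagged with a "type" field of "entity" or "relation" (verify against your actual memory file, as the layout may differ between server versions):

```python
import json

def summarize_memory(jsonl_text):
    """Split a memory-file dump (one JSON record per line) into entities and relations."""
    entities, relations = [], []
    for line in jsonl_text.splitlines():
        if not line.strip():
            continue  # skip blank lines
        record = json.loads(line)
        if record.get("type") == "entity":
            entities.append(record)
        elif record.get("type") == "relation":
            relations.append(record)
    return entities, relations

# Illustrative sample in the assumed format (values from this guide's examples):
sample = "\n".join([
    '{"type": "entity", "name": "ShopFlow", "entityType": "project", "observations": ["Built with Next.js"]}',
    '{"type": "relation", "from": "Alice", "to": "ShopFlow", "relationType": "works_on"}',
])

entities, relations = summarize_memory(sample)
print(f"{len(entities)} entities, {len(relations)} relations")
for e in entities:
    print(f"- {e['name']} ({e['entityType']}): {len(e.get('observations', []))} observations")
```

In practice you would read the real file with `open(path)` instead of the inline sample, then grep or diff the output to spot stale entries before asking the AI to clean them up.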
How much data can the knowledge graph hold?
There is no hard limit, but the entire graph is loaded into memory when the server starts and sent as context when the AI queries it. Very large graphs (thousands of entities) may slow down the server and consume context window space. Keep it focused on the most relevant information.
Will the AI always check memory before answering questions?
Not automatically. The AI decides when to query memory based on your prompt. If you want it to use its memory, frame your question in a way that suggests stored knowledge: 'Based on what you know about my project...' or 'Recall my preferences for...'
Can RapidDev help me build a more advanced memory system?
Yes. If you need persistent memory with vector search, multi-user support, or integration with a database backend, RapidDev can help design a custom MCP server that goes beyond the basic knowledge graph.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation