
How to store conversation state between webhook triggers for AI agents in n8n?

Learn how to persist AI agent conversation state in n8n between webhook triggers with practical methods for storing and retrieving context.

Matt Graham, CEO of Rapid Developers



The short, direct answer:
Store conversation state for AI agents in n8n by saving state outside the workflow—typically in a database (MySQL/Postgres), KV store (Redis), or n8n’s built‑in Data Stores—and then reloading that state at the beginning of each Webhook-triggered execution. n8n does not persist state between Webhook runs on its own, so you must explicitly fetch and update the conversation record every time the webhook fires.

 

Why you must store state externally

 

Every time a Webhook Trigger node fires, n8n starts a brand‑new, stateless execution. It does not remember previous runs. The JSON from the last request is gone. So if you’re building an AI agent that needs memory (like a conversation), the workflow must fetch earlier messages from some external store at the beginning of the run and write them back at the end.

  • n8n executions are isolated — no global process memory.
  • Webhook triggers always start fresh — nothing is automatically carried over.
  • Expressions cannot persist data — they only read current execution data.

 

Recommended storage options

 

These are the options real production teams use for conversation memory in n8n:

  • n8n Data Stores (simple, fast, ideal for prototyping and moderate traffic)
  • External database (MySQL/Postgres/SQLite) via the Database nodes
  • Redis or another KV store via HTTP Request or custom nodes

For most teams, Data Stores are the easiest and perfectly fine unless you need advanced queries or very high throughput.
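Whichever backend you pick, the stored value looks the same: a JSON conversation array keyed by a user or session ID. A minimal sketch of that record shape (field names such as `updatedAt` are illustrative choices, not an n8n requirement):

```javascript
// A conversation record keyed by user/session ID. Databases hold this in a
// JSON/TEXT column; Redis or other KV stores hold it as a serialized string.
const record = {
  userId: "user-123",
  history: [
    { role: "user", content: "Hi" },
    { role: "assistant", content: "Hello!" }
  ],
  updatedAt: new Date().toISOString()
};

// Serialize before writing to a KV store, parse after reading it back:
const serialized = JSON.stringify(record);
console.log(JSON.parse(serialized).history.length); // 2
```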

 

Typical pattern inside an n8n workflow

 

The stable pattern looks like this:

  • Webhook Trigger receives a user message, usually with a user ID or session ID.
  • A Database or Data Store Query loads the existing conversation array (if any).
  • You append the new message and send it to your AI model.
  • You store the updated conversation back to the database or Data Store.

Below is a simple Data Store example to make the logic concrete.

 

// Example of a Code node (formerly the Function node) that merges the new message into the stored conversation

// Input:
// $json.userId        -> The user identifier
// $json.message       -> The new incoming message
// $json.storedHistory -> Array of past messages fetched from Data Store

const history = $json.storedHistory || [];    // If none, start with empty array
history.push({
  role: "user",
  content: $json.message
});

return [
  {
    json: {
      userId: $json.userId,
      updatedHistory: history
    }
  }
];
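The snippet above only appends the user's message. After the LLM node responds, the assistant's reply should be appended as well before saving, so the next run sees both sides of the exchange. A small sketch (the function name and inputs are illustrative; in n8n you would read them from `$json`):

```javascript
// Sketch: append the assistant's reply to the history before the save step.
function appendAssistantReply(history, reply) {
  // Copy instead of mutating the array produced by the previous node
  return [...history, { role: "assistant", content: reply }];
}

const history = [{ role: "user", content: "Hi" }];
const updated = appendAssistantReply(history, "Hello! How can I help?");
console.log(updated.length); // 2
```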

 

How to wire this inside n8n

 

This is the typical workflow structure using a Data Store:

  • Webhook Trigger (receives userId and message)
  • Data Store (Get) where key = userId
  • Code node (formerly the Function node) to merge history (like the code above)
  • OpenAI/LLM node to generate the assistant reply using updatedHistory
  • Data Store (Set) to save the updated conversation history back
  • Respond to Webhook

That’s everything you need. The Webhook Trigger doesn’t store anything — the Data Store does.
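To make the read-modify-write cycle concrete end to end, here is a standalone sketch in which a JavaScript `Map` stands in for the Data Store and a stub function stands in for the LLM node (all names are illustrative; in a real workflow each step is an n8n node):

```javascript
// In-memory stand-in for the Data Store: key = userId, value = history array
const store = new Map();

function handleWebhook(userId, message, callModel) {
  const history = store.get(userId) ?? [];           // Data Store (Get)
  history.push({ role: "user", content: message });  // merge the new message
  const reply = callModel(history);                  // LLM node
  history.push({ role: "assistant", content: reply });
  store.set(userId, history);                        // Data Store (Set)
  return reply;                                      // Respond to Webhook
}

// Stub "model" that just counts user messages, to show state is carried over:
const echo = (h) => `You have sent ${h.filter(m => m.role === "user").length} message(s).`;

handleWebhook("user-1", "Hello", echo);
console.log(handleWebhook("user-1", "Are you there?", echo));
// → "You have sent 2 message(s)."
```

Two separate "webhook calls" for the same user share one conversation, which is exactly what the Get/merge/Set wiring achieves inside n8n.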

 

Notes for production

 

  • Don’t store huge histories. Trim or summarize after every few messages to avoid slowdowns.
  • Use user IDs consistently. This is your key for loading the right memory.
  • Handle missing records. First-time users won’t have any stored conversation yet.
  • Protect your webhook with a secret or auth; you don’t want random people hitting it.
  • Use error workflows so failures don’t break your state.
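The first note above, trimming history, can be as simple as keeping only the newest N messages before the Data Store (Set) step. A sketch (the limit of 20 is an arbitrary example value):

```javascript
// Keep only the most recent messages so stored histories stay small.
function trimHistory(history, maxMessages = 20) {
  if (history.length <= maxMessages) return history;
  return history.slice(-maxMessages); // newest maxMessages entries
}

const long = Array.from({ length: 50 }, (_, i) => ({ role: "user", content: `msg ${i}` }));
console.log(trimHistory(long).length);     // 20
console.log(trimHistory(long)[0].content); // "msg 30"
```

Summarizing older messages with an extra LLM call is the heavier alternative when you need long-range context rather than a simple cutoff.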

 

If you follow this pattern – fetch memory at the start, update it, save it back – your AI agents will maintain stable conversation state across webhook-triggered executions in n8n.
