Learn how to send follow-up questions to Gemini in n8n using proper thread history for accurate context and smooth automation.

You send follow-up questions to Gemini by always passing the full conversation history inside the contents array of your Gemini API request. In n8n, this means you must store the message history yourself.
The key idea: every time the user says something new, you take your stored conversation array, append the new user message, and send the entire updated array to Gemini. Gemini outputs a new assistant message, and you add that to the stored history as well.
This is the only reliable method in production because Gemini does not keep state across calls, and n8n does not keep "session memory" unless you build it.
Every request must therefore carry the whole conversation in the contents field.
The simplest stable pattern: store the conversation JSON in an n8n variable using a “Set” node, in a database, or in an in-memory store like Redis. This works even for long-running workflows.
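The storage idea can be sketched in plain JavaScript. Here a plain object stands in for whatever backs the store in your setup (workflow static data, a database table, or a Redis key); getHistory is a hypothetical helper name, not an n8n or Gemini API:

```javascript
// A per-session store keyed by a session/chat ID.
const store = {};

function getHistory(store, sessionId) {
  // Create the thread on first contact, reuse it on every later call.
  if (!store[sessionId]) store[sessionId] = [];
  return store[sessionId];
}

const history = getHistory(store, "session-42");
history.push({ role: "user", parts: [{ text: "Hello Gemini!" }] });
// A later call with the same sessionId returns the same thread.
```

The key design point is that the session ID, not the workflow execution, identifies the thread, so follow-ups from the same user keep landing in the same array.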
Your stored conversation usually looks like this:
[
{ "role": "user", "parts": [{ "text": "Hello Gemini!" }] },
{ "role": "model", "parts": [{ "text": "Hello! How can I help?" }] }
]
Then when the user asks something new, you append:
{ "role": "user", "parts": [{ "text": "Can you explain photosynthesis?" }] }
Now you send the ENTIRE array again to Gemini, not only the new message.
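That append-then-send-everything step can be sketched in plain JavaScript; appendUserMessage and buildRequestBody are hypothetical helper names for illustration, not part of any API:

```javascript
// Append the new user turn to the stored thread.
function appendUserMessage(history, text) {
  history.push({ role: "user", parts: [{ text }] });
  return history;
}

// Gemini receives the whole thread, not just the newest message.
function buildRequestBody(history) {
  return { contents: history };
}

const history = [
  { role: "user", parts: [{ text: "Hello Gemini!" }] },
  { role: "model", parts: [{ text: "Hello! How can I help?" }] }
];
appendUserMessage(history, "Can you explain photosynthesis?");
const body = buildRequestBody(history);
// body.contents now contains all three turns.
```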
This describes the logic flow inside n8n. You can implement it using regular nodes, no hacks needed.
If no conversation exists yet, start from an empty array []. Then POST the updated array to https://generativelanguage.googleapis.com/v1/models/gemini-pro:generateContent?key=API_KEY and, when the response arrives, append its "role": "model" entry to your array.
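In plain JavaScript that HTTP call looks roughly like this; the API key is assumed to come from your n8n credentials or environment, and the actual fetch is left commented out since it needs a live key:

```javascript
// Build the URL and request options for the generateContent endpoint.
function buildGeminiRequest(apiKey, history) {
  const url =
    "https://generativelanguage.googleapis.com/v1/models/" +
    `gemini-pro:generateContent?key=${apiKey}`;
  return {
    url,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // The body carries the ENTIRE thread on every turn.
      body: JSON.stringify({ contents: history })
    }
  };
}

// Usage inside an async context:
// const { url, options } = buildGeminiRequest(process.env.GEMINI_API_KEY, history);
// const response = await fetch(url, options);
// const data = await response.json();
```

In n8n you would normally let an HTTP Request node do this; the sketch just makes the URL and body explicit.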
// inbound data from the webhook
const userMessage = $json["message"];

// existing conversation history from the previous node (empty on the first turn)
const history = items[0].json.history || [];

// append the new user message
history.push({
  role: "user",
  parts: [{ text: userMessage }]
});

// return the updated history to the next node
return [{ json: { history } }];
With that history in place, the final request body sent to Gemini looks like this:
{
"contents": [
{ "role": "user", "parts": [{ "text": "Hello Gemini!" }] },
{ "role": "model", "parts": [{ "text": "Hello! How can I help?" }] },
{ "role": "user", "parts": [{ "text": "Can you explain photosynthesis?" }] }
]
}
That’s the reliable production pattern: store the thread yourself, append new messages, and always send the entire conversation to Gemini in the contents array with each follow-up.