Chain multiple MCP tools into multi-step workflows where each tool's output feeds the next step. Build pipelines like: query database, process results, write to file, and send notification — all orchestrated by the AI or by a custom client script. This tutorial covers tool chaining patterns, error handling between steps, and building a workflow orchestrator MCP server.
Building Multi-Step Workflows by Chaining MCP Tools
Individual MCP tools are useful, but the real power comes from chaining them together. A database query feeds into a file write, which triggers a notification. This tutorial shows three approaches to tool chaining: letting the AI orchestrate naturally, building a custom client that scripts the chain, and creating a workflow MCP server with composite tools that execute multiple steps internally.
Prerequisites
- Two or more MCP servers with tools to chain
- Node.js 18+ and npm installed
- Understanding of MCP tool call and result formats
- Basic TypeScript knowledge
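As a refresher on the third prerequisite, the tool call and result shapes used throughout this tutorial can be sketched as follows. This is a simplified sketch of the `content`/`isError` fields from an MCP tools/call result, not the full specification:

```typescript
// A tools/call request names a tool and passes JSON arguments:
const call = {
  name: "query_db",
  arguments: { sql: "SELECT 1" },
};

// The result carries an array of content blocks plus an optional isError flag
// (simplified: real results may also contain image or resource blocks):
interface ToolResult {
  content: Array<{ type: "text"; text: string }>;
  isError?: boolean;
}

const result: ToolResult = {
  content: [{ type: "text", text: '{"rows": 1}' }],
};

// Downstream steps usually want the concatenated text content:
const text = result.content
  .filter(c => c.type === "text")
  .map(c => c.text)
  .join("\n");
console.log(text); // {"rows": 1}
```

Every chaining pattern below boils down to extracting this text content from one result and feeding it into the next call's `arguments`.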
Step-by-step guide
Understand the three tool chaining patterns
There are three ways to chain MCP tools. First, natural AI orchestration: the AI calls tools sequentially, using each result to decide the next step. This is the simplest approach but depends on the AI's judgment. Second, scripted client orchestration: a custom MCP client calls tools in a predetermined sequence, passing data between steps programmatically. Third, composite server tools: build a single MCP tool that internally calls multiple operations and returns the combined result.
```typescript
// Pattern 1: AI orchestration (natural)
// You just ask: "Query the orders table for this month, export as CSV,
// and notify the team on Slack"
// The AI calls: query_db → write_file → send_slack_message

// Pattern 2: Scripted client orchestration
const dbResult = await client.callTool({ name: "query_db", arguments: { sql: "SELECT ..." } });
// (in practice, extract the text content from dbResult before passing it on — see below)
const csvPath = await client.callTool({ name: "write_csv", arguments: { data: dbResult, path: "report.csv" } });
await client.callTool({ name: "send_notification", arguments: { message: `Report ready: ${csvPath}` } });

// Pattern 3: Composite server tool
server.tool("generate_and_notify_report", "...", schema, async (params) => {
  const data = await queryDatabase(params.sql);
  const path = await writeCsv(data, params.filename);
  await sendNotification(`Report ready: ${path}`);
  return { content: [{ type: "text", text: "Report generated and team notified" }] };
});
```
Expected result: Understanding of the three chaining patterns and when to use each.
Build a scripted workflow using the MCP client SDK
Create a script that connects to multiple MCP servers and chains tool calls in a predetermined sequence. Use the MCP Client class to connect to each server, call tools in order, and pass results between steps. Handle errors at each step so the pipeline fails gracefully with a clear error message indicating which step failed.
```typescript
// src/workflow.ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function connectServer(command: string, args: string[]): Promise<Client> {
  const transport = new StdioClientTransport({ command, args });
  const client = new Client({ name: "workflow", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);
  return client;
}

function extractText(result: any): string {
  return (result.content as any[])
    .filter(c => c.type === "text")
    .map(c => c.text)
    .join("\n");
}

async function runPipeline() {
  // Connect to servers
  const dbClient = await connectServer("node", ["dist/db-server.js"]);
  const fileClient = await connectServer("node", ["dist/file-server.js"]);

  try {
    // Step 1: Query database
    console.error("[pipeline] Step 1: Querying database...");
    const queryResult = await dbClient.callTool({
      name: "query",
      arguments: { sql: "SELECT * FROM orders WHERE created_at >= NOW() - INTERVAL '7 days'" },
    });
    if ((queryResult as any).isError) throw new Error(`Query failed: ${extractText(queryResult)}`);
    const data = JSON.parse(extractText(queryResult));
    console.error(`[pipeline] Got ${data.rows} rows`);

    // Step 2: Write CSV report
    console.error("[pipeline] Step 2: Writing CSV...");
    const csvResult = await fileClient.callTool({
      name: "write_file",
      arguments: {
        filePath: "reports/weekly-orders.csv",
        content: formatAsCsv(data),
      },
    });
    if ((csvResult as any).isError) throw new Error(`CSV write failed: ${extractText(csvResult)}`);

    // Step 3: Write summary
    console.error("[pipeline] Step 3: Writing summary...");
    const summary = `Weekly Orders Report\nGenerated: ${new Date().toISOString()}\nTotal rows: ${data.rows}\nTotal revenue: $${data.data.reduce((s: number, r: any) => s + (r.total || 0), 0).toFixed(2)}`;
    await fileClient.callTool({
      name: "write_file",
      arguments: { filePath: "reports/weekly-summary.txt", content: summary },
    });

    console.error("[pipeline] Complete!");
  } finally {
    await dbClient.close();
    await fileClient.close();
  }
}

function formatAsCsv(data: { columns: string[]; data: any[] }): string {
  const header = data.columns.join(",");
  const rows = data.data.map((row: any) =>
    data.columns.map(col => JSON.stringify(row[col] ?? "")).join(",")
  );
  return [header, ...rows].join("\n");
}

runPipeline().catch(e => { console.error("Pipeline failed:", e); process.exit(1); });
```
Expected result: A script that chains database query, CSV export, and summary generation across MCP servers.
Build a composite workflow MCP server
For workflows that should be available as MCP tools themselves, build a server with composite tools that execute multi-step operations internally. Each composite tool encapsulates an entire pipeline — query, transform, write, notify — as a single tool call. This is the cleanest approach for standardized workflows that users trigger regularly.
```typescript
// src/workflow-server.ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import pg from "pg";
import fs from "fs/promises";

const pool = new pg.Pool({ connectionString: process.env.DATABASE_URL });
const server = new McpServer({ name: "workflow-server", version: "1.0.0" });

server.tool(
  "generate_report",
  "Query data, generate CSV, and create summary report in one step",
  {
    query: z.string().describe("SQL SELECT query for the report data"),
    reportName: z.string().describe("Name for the report files (no extension)"),
    outputDir: z.string().default("/tmp/reports"),
  },
  async ({ query, reportName, outputDir }) => {
    const steps: string[] = [];

    try {
      // Step 1: Execute query (reject anything that isn't a read)
      if (/\b(DELETE|DROP|UPDATE|INSERT)\b/i.test(query)) {
        return { content: [{ type: "text", text: "Error: Only SELECT queries allowed" }], isError: true };
      }
      // query_timeout is node-postgres's per-query timeout option (milliseconds)
      const result = await pool.query({ text: query, query_timeout: 30000 });
      steps.push(`Queried ${result.rowCount} rows`);

      // Step 2: Generate CSV
      await fs.mkdir(outputDir, { recursive: true });
      const csvPath = `${outputDir}/${reportName}.csv`;
      const header = result.fields.map(f => f.name).join(",");
      const rows = result.rows.map(r =>
        result.fields.map(f => JSON.stringify(r[f.name] ?? "")).join(",")
      );
      await fs.writeFile(csvPath, [header, ...rows].join("\n"));
      steps.push(`CSV written to ${csvPath}`);

      // Step 3: Generate summary
      const summaryPath = `${outputDir}/${reportName}-summary.json`;
      const summary = {
        reportName,
        generatedAt: new Date().toISOString(),
        rowCount: result.rowCount,
        columns: result.fields.map(f => f.name),
        csvPath,
      };
      await fs.writeFile(summaryPath, JSON.stringify(summary, null, 2));
      steps.push(`Summary written to ${summaryPath}`);

      return {
        content: [{ type: "text", text: `Report pipeline complete:\n${steps.map((s, i) => ` ${i + 1}. ${s}`).join("\n")}` }],
      };
    } catch (error) {
      return {
        content: [{
          type: "text",
          text: `Pipeline failed at step ${steps.length + 1}:\n` +
            `Completed: ${steps.join(", ") || "none"}\n` +
            `Error: ${error instanceof Error ? error.message : String(error)}`,
        }],
        isError: true,
      };
    }
  }
);
```
Expected result: A composite MCP tool that executes a multi-step report generation pipeline.
Add error handling and partial rollback to chains
Multi-step workflows need robust error handling. Track completed steps, clean up partial outputs on failure, and report exactly where the chain broke. For critical workflows, implement compensation logic that undoes completed steps when a later step fails. This prevents half-finished operations from leaving inconsistent state.
```typescript
// src/safe-pipeline.ts
interface PipelineStep<T> {
  name: string;
  execute: (context: T) => Promise<T>;
  rollback?: (context: T) => Promise<void>;
}

async function runPipeline<T>(steps: PipelineStep<T>[], initialContext: T): Promise<T> {
  const completed: PipelineStep<T>[] = [];
  let context = initialContext;

  for (const step of steps) {
    try {
      console.error(`[pipeline] Running: ${step.name}`);
      context = await step.execute(context);
      completed.push(step);
      console.error(`[pipeline] Completed: ${step.name}`);
    } catch (error) {
      console.error(`[pipeline] Failed: ${step.name} - ${error}`);

      // Roll back completed steps in reverse order (copy the array so the
      // original completion order is preserved for the error report below)
      for (const completedStep of [...completed].reverse()) {
        if (completedStep.rollback) {
          try {
            console.error(`[pipeline] Rolling back: ${completedStep.name}`);
            await completedStep.rollback(context);
          } catch (rollbackError) {
            console.error(`[pipeline] Rollback failed: ${completedStep.name} - ${rollbackError}`);
          }
        }
      }

      throw new Error(
        `Pipeline failed at "${step.name}": ${error instanceof Error ? error.message : String(error)}\n` +
        `Completed steps: ${completed.map(s => s.name).join(", ") || "none"}\n` +
        `Rolled back: ${completed.filter(s => s.rollback).map(s => s.name).join(", ") || "none"}`
      );
    }
  }

  return context;
}

// Usage:
// const result = await runPipeline([
//   { name: "query", execute: queryStep },
//   { name: "write-csv", execute: writeCsvStep, rollback: deleteFileStep },
//   { name: "notify", execute: notifyStep },
// ], { sql: "SELECT ...", outputDir: "/tmp" });
```
Expected result: A pipeline runner with step tracking, error handling, and rollback capability.
Orchestrate cross-server tool chains for complex workflows
The most powerful pattern chains tools across multiple MCP servers: query a database, search related documents, post results to a project management tool, and send a notification. Build a client script that connects to all required servers and orchestrates the full workflow. For enterprises building complex multi-server workflows, RapidDev specializes in designing and implementing production-grade MCP orchestration pipelines.
```typescript
// Example: Incident investigation pipeline
// Chains: Sentry → Database → Filesystem → Linear

async function investigateIncident(project: string) {
  const sentry = await connectServer("node", ["dist/sentry-server.js"]);
  const db = await connectServer("node", ["dist/db-server.js"]);
  const files = await connectServer("npx", ["-y", "@modelcontextprotocol/server-filesystem", "."]);
  const linear = await connectServer("node", ["dist/linear-server.js"]);

  try {
    // Step 1: Get recent errors from Sentry
    const errors = await sentry.callTool({
      name: "recent_errors", arguments: { project, hours: 1 },
    });
    const topError = JSON.parse(extractText(errors))[0];

    // Step 2: Get error details and stack trace
    const details = await sentry.callTool({
      name: "error_details", arguments: { issueId: topError.id },
    });
    const errorInfo = JSON.parse(extractText(details));

    // Step 3: Query related database records
    // (escape single quotes so the title can't break out of the SQL string)
    const needle = topError.title.slice(0, 50).replace(/'/g, "''");
    const dbResult = await db.callTool({
      name: "query", arguments: {
        sql: `SELECT * FROM error_logs WHERE message LIKE '%${needle}%' LIMIT 5`,
      },
    });

    // Step 4: Read the failing source file
    const frame = errorInfo.frames?.[0];
    const sourceCode = frame ? await files.callTool({
      name: "read_file", arguments: { path: frame.file },
    }) : null;

    // Step 5: Create a Linear ticket with all context
    await linear.callTool({
      name: "create_issue",
      arguments: {
        title: `[Auto] ${topError.title}`,
        description: `Error count: ${topError.count}\nStack: ${frame?.file}:${frame?.line}\nLast seen: ${topError.lastSeen}`,
        teamKey: "ENG",
        priority: topError.count > 100 ? 1 : 2,
      },
    });

    console.error("Incident investigated and ticket created");
  } finally {
    await Promise.all([sentry.close(), db.close(), files.close(), linear.close()]);
  }
}
```
Expected result: A cross-server workflow that investigates an incident and creates a ticket automatically.
Complete working example
```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import pg from "pg";
import fs from "fs/promises";

const pool = new pg.Pool({ connectionString: process.env.DATABASE_URL });
const BLOCKED = /\b(DELETE|DROP|TRUNCATE|UPDATE|INSERT|ALTER)\b/i;

const server = new McpServer({ name: "workflow", version: "1.0.0" });

server.tool("generate_report", "Run query → export CSV → create summary", {
  sql: z.string(), name: z.string(), dir: z.string().default("/tmp/reports"),
}, async ({ sql, name, dir }) => {
  if (BLOCKED.test(sql)) return { content: [{ type: "text", text: "Only SELECT allowed" }], isError: true };
  const steps: string[] = [];
  try {
    // query_timeout is node-postgres's per-query timeout option (milliseconds)
    const r = await pool.query({ text: sql, query_timeout: 30000 });
    steps.push(`Queried ${r.rowCount} rows`);

    await fs.mkdir(dir, { recursive: true });
    const csv = [r.fields.map(f => f.name).join(","),
      ...r.rows.map(row => r.fields.map(f => JSON.stringify(row[f.name] ?? "")).join(","))
    ].join("\n");
    await fs.writeFile(`${dir}/${name}.csv`, csv);
    steps.push(`CSV: ${dir}/${name}.csv`);

    await fs.writeFile(`${dir}/${name}.json`, JSON.stringify({
      name, at: new Date().toISOString(), rows: r.rowCount,
      cols: r.fields.map(f => f.name),
    }, null, 2));
    steps.push(`Summary: ${dir}/${name}.json`);

    return { content: [{ type: "text", text: steps.map((s, i) => `${i + 1}. ${s}`).join("\n") }] };
  } catch (e) {
    return { content: [{ type: "text", text:
      `Failed at step ${steps.length + 1}. Done: ${steps.join("; ") || "none"}. Error: ${e}` }], isError: true };
  }
});

server.tool("pipeline_status", "Check if pipeline output files exist", {
  dir: z.string().default("/tmp/reports"), name: z.string(),
}, async ({ dir, name }) => {
  const csvExists = await fs.access(`${dir}/${name}.csv`).then(() => true).catch(() => false);
  const jsonExists = await fs.access(`${dir}/${name}.json`).then(() => true).catch(() => false);
  return { content: [{ type: "text", text: JSON.stringify({ csv: csvExists, summary: jsonExists }) }] };
});

async function main() {
  await server.connect(new StdioServerTransport());
  console.error("Workflow MCP server running");
}
main().catch(e => { console.error(e); process.exit(1); });
```
Common mistakes when chaining multiple MCP tools in a workflow
Not handling errors between chain steps
Why it's a problem: Failures cascade without any context about where the chain broke.
How to avoid: Wrap each step in try-catch, track completed steps, and report exactly which step failed with its error message.
Not cleaning up partial outputs when a chain fails mid-way
Why it's a problem: A mid-chain failure leaves half-finished outputs and inconsistent state.
How to avoid: Implement rollback logic for write steps. Delete files that were created before the failure to avoid inconsistent state.
Passing raw MCP result objects between steps instead of extracting the text content
Why it's a problem: The next step receives a wrapper object rather than the data it expects.
How to avoid: Always extract text from result.content before passing to the next step. Parse JSON results before using them.
Not closing MCP client connections
Why it's a problem: Unclosed connections leave zombie server processes running after the pipeline exits.
How to avoid: Close all client connections in a finally block, even when the pipeline fails.
Best practices
- Track completed steps so error messages show exactly where the chain broke
- Implement rollback logic for write operations in multi-step chains
- Close all MCP client connections in finally blocks to prevent zombie processes
- Extract and parse text content from results before passing to the next step
- Use composite server tools for standardized workflows that run frequently
- Use scripted client orchestration for complex cross-server pipelines
- Log each step with [pipeline] prefix for clear workflow tracing
- Set timeouts on each step to prevent the entire chain from hanging
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
Show me how to chain MCP tool calls into multi-step workflows. I need three patterns: AI orchestration, scripted client orchestration connecting to multiple servers, and composite server tools. Include error handling with step tracking and rollback. TypeScript.
Build an MCP workflow server with a generate_report tool that chains database query → CSV export → summary creation. Include error handling that reports which step failed. Also show a scripted pipeline using the MCP Client SDK to chain tools across multiple servers.
Frequently asked questions
Should I let the AI orchestrate tool chains or script them?
Let the AI orchestrate for ad-hoc, exploratory workflows. Script chains for recurring, predictable pipelines where you need reliability and consistency. Use composite tools when the pipeline should be callable as a single MCP tool.
How do I handle one step being slow in a long chain?
Set per-step timeouts and implement partial result reporting. If step 3 of 5 times out, report what steps 1-2 produced and let the user decide whether to retry.
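A per-step timeout can be sketched with `Promise.race`. The `withTimeout` helper below is a hypothetical utility, not part of the MCP SDK; wrap any `client.callTool(...)` promise with it:

```typescript
// Wrap a pipeline step so one slow step can't hang the whole chain.
function withTimeout<T>(promise: Promise<T>, ms: number, label: string): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Step "${label}" timed out after ${ms}ms`)), ms);
  });
  // Whichever settles first wins; always clear the timer to let the process exit.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage sketch: a fast step resolves normally, a slow step fails with a
// labeled timeout error instead of blocking steps 4 and 5.
async function demo() {
  const fast = await withTimeout(Promise.resolve("ok"), 1000, "query");
  console.log(fast); // ok
  try {
    await withTimeout(new Promise(r => setTimeout(r, 100)), 10, "slow-step");
  } catch (e) {
    console.log((e as Error).message); // Step "slow-step" timed out after 10ms
  }
}
demo();
```

The label in the error message is what lets you report "step 3 of 5 timed out" back to the user along with the outputs steps 1-2 already produced.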
Can I run chain steps in parallel instead of sequentially?
Yes, if steps are independent. Use Promise.all to run independent steps concurrently and Promise.allSettled if some steps can fail without blocking others.
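The difference between the two can be sketched as follows. The steps here are simulated with a hypothetical `runStep` helper; in a real pipeline each would be a `client.callTool` promise:

```typescript
// Simulated independent pipeline steps; "notify" is set up to fail.
const runStep = (label: string, fail = false): Promise<string> =>
  fail ? Promise.reject(new Error(`${label} failed`)) : Promise.resolve(`${label} done`);

async function runParallel() {
  // Promise.all: fail fast — one rejection aborts the whole group.
  const [users, orders] = await Promise.all([runStep("export-users"), runStep("export-orders")]);

  // Promise.allSettled: every step runs to completion; inspect outcomes afterward.
  const settled = await Promise.allSettled([
    runStep("export-users"),
    runStep("export-orders"),
    runStep("notify", true),
  ]);
  const failed = settled.filter(s => s.status === "rejected").length;
  return { users, orders, failed };
}

runParallel().then(r => console.log(`${r.failed} of 3 optional steps failed`));
```

Use `Promise.all` for steps the pipeline cannot proceed without, and `Promise.allSettled` for best-effort steps like notifications.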
How many steps can a tool chain have?
There is no technical limit. Practically, keep chains under 10 steps for maintainability. If a chain grows beyond that, split it into sub-workflows.
Can RapidDev help design complex MCP workflows?
Yes. RapidDev designs and implements multi-server MCP orchestration pipelines for enterprise workflows, including error handling, monitoring, and automated scheduling.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation