
MCP vs function calling vs plugins: which to use

What you'll learn

  • How MCP, function calling, and plugins differ at a fundamental level
  • When to use MCP vs function calling vs direct API integration
  • Why MCP is replacing plugins as the standard for AI tool access
  • How MCP and function calling work together (they are not competitors)
Beginner · 10 min read · All AI tools and platforms · March 2026 · RapidDev Engineering Team
TL;DR

MCP, function calling, and plugins solve related but different problems. Function calling is a model-level API feature where you define functions in each API request — it is model-specific and stateless. MCP is a transport protocol that standardizes how AI hosts connect to external tools — it is model-agnostic and works across hosts. Plugins (like ChatGPT plugins, now deprecated) were platform-specific marketplaces. MCP is the clear winner for tool interoperability in 2026: build once, use everywhere.

Understanding the Three Approaches to AI Tool Integration

Developers often confuse MCP with function calling or equate it to the old ChatGPT plugins. While all three enable AI models to interact with external services, they operate at different levels of the stack. Function calling is a model-level feature. MCP is a transport protocol. Plugins were a platform marketplace. Understanding these distinctions helps you choose the right approach for your use case and understand why MCP has become the de facto standard.

Prerequisites

  • Basic understanding of AI assistants and how they work
  • Familiarity with API concepts (not required to build them)
  • Optional: experience with OpenAI function calling or ChatGPT plugins

Step-by-step guide

1

Understand function calling — model-level tool definitions

Function calling (also called tool use) is a feature of AI model APIs. When you make an API request to OpenAI or Anthropic, you include function definitions in JSON schema format. The model can then decide to 'call' a function by returning its name and arguments. You execute the function in your code and send the result back. Key characteristics: function definitions are sent with every API request, execution happens in your application code, it is model-specific (OpenAI format differs from Anthropic format), and there is no persistent connection.

typescript
// OpenAI function calling example
const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "What is the weather in Tokyo?" }],
  tools: [{
    type: "function",
    function: {
      name: "get_weather",
      description: "Get weather for a city",
      parameters: {
        type: "object",
        properties: {
          city: { type: "string", description: "City name" }
        },
        required: ["city"]
      }
    }
  }]
});
// Model returns: tool_calls: [{ function: { name: 'get_weather', arguments: '{"city":"Tokyo"}' } }]
// YOU execute the function and send the result back

Expected result: You understand that function calling defines tools per API request and requires you to handle execution.
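To make the "you execute it" part concrete, here is a minimal sketch of that round trip, assuming a hypothetical local getWeather implementation. The follow-up message uses the role: "tool" shape the Chat Completions API expects; error handling and the second API call are omitted.

```typescript
// Shape of a tool call as returned by the model (simplified).
type ToolCall = { id: string; function: { name: string; arguments: string } };

// Hypothetical local implementation — a real app would call a weather API.
function getWeather(city: string): string {
  return `${city}: 22°C, Clear`; // placeholder data
}

// Execute one tool call and build the `role: "tool"` message
// you append to the conversation before calling the API again.
function handleToolCall(call: ToolCall) {
  const args = JSON.parse(call.function.arguments) as { city: string };
  const result =
    call.function.name === "get_weather" ? getWeather(args.city) : "unknown tool";
  return { role: "tool" as const, tool_call_id: call.id, content: result };
}

const reply = handleToolCall({
  id: "call_1",
  function: { name: "get_weather", arguments: '{"city":"Tokyo"}' },
});
console.log(reply.content); // "Tokyo: 22°C, Clear"
```

Note that this dispatch logic lives entirely in your application: the model only proposes the call, and nothing runs until your code runs it.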

2

Understand MCP — transport protocol for AI-tool communication

MCP is a protocol layer between AI hosts and external services. Instead of defining tools in every API request, MCP servers persistently expose tools, resources, and prompts. The host discovers them once during initialization and can call them throughout the session. Key characteristics: servers run as persistent processes, tools are discovered automatically, the protocol is model-agnostic (same server works with Claude, GPT-4o, Gemini), and communication uses JSON-RPC 2.0.

typescript
// MCP server (runs as a persistent process)
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "weather", version: "1.0.0" });

server.tool(
  "get_weather",
  "Get weather for a city",
  { city: z.string() },
  async ({ city }) => ({
    content: [{ type: "text", text: `${city}: 22°C, Clear` }] // placeholder weather data
  })
);

// Server runs persistently, host discovers tools automatically
// Same server works in Claude Desktop, Cursor, VS Code, OpenAI Agents SDK
await server.connect(new StdioServerTransport());

Expected result: You understand that MCP is a persistent protocol layer, not a per-request feature.
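As an illustration of that host-side discovery, a host such as Claude Desktop typically needs only a config entry pointing at the server process (the command and path below are placeholders); the tools themselves are then discovered over the protocol at startup:

```json
{
  "mcpServers": {
    "weather": {
      "command": "node",
      "args": ["/path/to/weather-server.js"]
    }
  }
}
```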

3

Understand plugins — the deprecated marketplace approach

ChatGPT plugins (launched March 2023, deprecated 2024) were a platform-specific marketplace. Developers created plugins with OpenAPI specifications, submitted them for review, and users installed them from a store. Plugins were tied to the ChatGPT platform — they did not work in other AI tools. They required a hosted API endpoint, an OpenAPI manifest, and platform approval. MCP replaced the plugin model with an open, decentralized approach: no marketplace, no approval process, works everywhere.
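For historical context, a plugin was registered via a manifest hosted at /.well-known/ai-plugin.json that looked roughly like this (field values are illustrative, and the exact schema evolved before deprecation):

```json
{
  "schema_version": "v1",
  "name_for_human": "Weather",
  "name_for_model": "weather",
  "description_for_model": "Get current weather for a city.",
  "auth": { "type": "none" },
  "api": { "type": "openapi", "url": "https://example.com/openapi.yaml" }
}
```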

Expected result: You understand that plugins were platform-locked and MCP replaced them with an open standard.

4

Compare the three approaches side by side

The key differences come down to scope, persistence, and portability. Function calling operates per API request and requires your code to handle execution. MCP operates per session with persistent server connections and automatic tool discovery. Plugins operated per platform with marketplace distribution. MCP is the only approach that is model-agnostic, host-agnostic, and runs locally without cloud deployment.

Comparison matrix:

| Feature | Function Calling | MCP | Plugins (deprecated) |
|-----------------------|--------------------|--------------------|----------------------|
| Scope | Per API request | Per session | Per platform |
| Tool definition | In API payload | In server code | In OpenAPI spec |
| Execution | Your code | MCP server process | Cloud API endpoint |
| Model-agnostic | No (format varies) | Yes | No (ChatGPT only) |
| Host-agnostic | N/A | Yes (any MCP host) | No |
| Runs locally | Yes | Yes (stdio) | No (cloud required) |
| Persistent connection | No | Yes | No |
| Discovery | Manual per request | Automatic | Marketplace |
| Approval required | No | No | Yes |
| Status in 2026 | Active | Active (growing) | Deprecated |

Expected result: You can clearly distinguish the three approaches across all important dimensions.

5

Understand how MCP and function calling work together

MCP and function calling are not competitors — they are complementary. The OpenAI Agents SDK demonstrates this: it converts MCP tools into function calling format. When an MCP server exposes a tool, the host translates it into the model's native function calling format (OpenAI tools, Anthropic tool_use, etc.). The model uses function calling to decide which tool to invoke, and the host routes the call to the correct MCP server. MCP is the transport layer; function calling is the model interface.

How they work together:

1. MCP server exposes tools via the protocol
2. Host discovers tools during the MCP handshake
3. Host converts tools to the model's function calling format
4. Model uses function calling to select a tool
5. Host routes the function call to the MCP server
6. MCP server executes and returns the result
7. Host converts the result to the model's expected format
8. Model incorporates the result into its response

Expected result: You understand that MCP handles transport while function calling handles model-level tool selection.
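The conversion step in that pipeline can be sketched as a pure transformation. The MCP field names (name, description, inputSchema) follow the MCP tool listing, and the output follows OpenAI's tools payload; the surrounding host glue is simplified away.

```typescript
// Minimal shape of a tool entry as returned by an MCP server's tools/list.
type McpTool = { name: string; description?: string; inputSchema: object };

// Convert one MCP tool into OpenAI's function calling format.
function toOpenAiTool(tool: McpTool) {
  return {
    type: "function" as const,
    function: {
      name: tool.name,
      description: tool.description ?? "",
      parameters: tool.inputSchema, // MCP input schemas are already JSON Schema
    },
  };
}

const converted = toOpenAiTool({
  name: "get_weather",
  description: "Get weather for a city",
  inputSchema: {
    type: "object",
    properties: { city: { type: "string" } },
    required: ["city"],
  },
});
console.log(converted.function.name); // "get_weather"
```

A host doing this for every discovered tool is what lets a single MCP server serve models with otherwise incompatible tool formats.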

6

Choose the right approach for your use case

Use function calling when you are building a custom application with direct API access and need fine-grained control over tool definitions per request. Use MCP when you want tool access across multiple AI hosts, need persistent tool connections, or want to share tools with your team. For teams needing help deciding the right architecture for their AI integrations, RapidDev provides consulting on MCP vs custom API integration strategies.

When to use each:

Function calling:

  • Building a custom chatbot with the OpenAI API
  • Need different tools per conversation
  • Want full control over execution flow
  • Single-model application

MCP:

  • Want tools in Claude Desktop, Cursor, VS Code, etc.
  • Need persistent tool connections (databases, file systems)
  • Sharing tools across team members
  • Building tools for the community
  • Need model-agnostic tool definitions

Both together:

  • OpenAI Agents SDK + MCP servers
  • Any MCP host that uses function calling internally

Expected result: You can choose the right approach for any AI tool integration scenario.

Complete working example

comparison-example.ts
// This file shows the same tool implemented three ways:
// 1. OpenAI Function Calling, 2. MCP Server, 3. Plugin (deprecated)

// === 1. FUNCTION CALLING (per-request, model-specific) ===
// You define the tool in every API call
const functionCallingTool = {
  type: "function" as const,
  function: {
    name: "get_weather",
    description: "Get current weather for a city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" },
      },
      required: ["city"],
    },
  },
};
// Execution: YOU parse the model's response, call your weather API,
// then send the result back in a follow-up API call.

// === 2. MCP SERVER (persistent, model-agnostic) ===
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const mcpServer = new McpServer({ name: "weather", version: "1.0.0" });
mcpServer.tool(
  "get_weather",
  "Get current weather for a city",
  { city: z.string().describe("City name") },
  async ({ city }) => {
    // Execution happens HERE, in the server
    const weather = await fetchWeather(city);
    return {
      content: [{ type: "text", text: `${city}: ${weather.temp}°C, ${weather.condition}` }],
    };
  }
);
// Works in Claude Desktop, Cursor, VS Code, OpenAI Agents — no changes
await mcpServer.connect(new StdioServerTransport());

// === 3. PLUGIN (deprecated, platform-locked) ===
// Required: hosted API + OpenAPI manifest + ChatGPT approval
// openapi.yaml:
//   paths:
//     /weather:
//       get:
//         operationId: getWeather
//         parameters: [{ name: city, in: query, schema: { type: string } }]
// Only worked in ChatGPT. Deprecated 2024.

async function fetchWeather(city: string) {
  return { temp: 22, condition: "Clear" }; // placeholder
}

Common mistakes

Mistake: Thinking MCP replaces function calling

How to avoid: MCP and function calling are complementary, not competing. MCP handles transport between hosts and servers. Function calling handles tool selection at the model level. Most MCP hosts use function calling internally to let the model choose which MCP tool to call.

Mistake: Building a new plugin instead of an MCP server

How to avoid: ChatGPT plugins are deprecated. Build MCP servers instead — they work across all major AI hosts (Claude Desktop, Cursor, VS Code, OpenAI Agents SDK) without platform approval or cloud hosting requirements.

Mistake: Using function calling for tools that should be MCP servers

How to avoid: If you want the same tools in Claude Desktop, Cursor, and VS Code, use MCP. Function calling locks you into one model's API format. MCP servers are portable across all hosts.

Mistake: Confusing MCP with REST APIs

How to avoid: MCP is a specific protocol (JSON-RPC 2.0) designed for AI tool access, not a generic API framework. It includes capability discovery, error handling, and resource management features that plain REST does not provide.
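For concreteness, invoking the weather tool over MCP is a plain JSON-RPC 2.0 request; the shape below follows the protocol's tools/call method (the id is arbitrary):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_weather",
    "arguments": { "city": "Tokyo" }
  }
}
```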

Best practices

  • Use MCP when you want tools available across multiple AI hosts and models
  • Use function calling when building custom single-model applications with per-request tool control
  • Remember that MCP and function calling work together — they are complementary layers
  • Do not build ChatGPT plugins — they are deprecated. Build MCP servers instead
  • For community distribution, publish MCP servers to npm so anyone can use them with npx
  • For internal tools, MCP servers give your whole team access without custom API work
  • When evaluating third-party AI tools, check if they support MCP for maximum flexibility
  • Consider that MCP servers run locally via stdio, while function calling always requires network API calls

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

Compare MCP (Model Context Protocol), OpenAI function calling, and ChatGPT plugins. Explain how they differ in scope, execution, portability, and use cases. Show the same weather tool implemented with function calling and as an MCP server. Explain how they work together in the OpenAI Agents SDK.

MCP Prompt

I am building AI tools and need to understand the difference between MCP and function calling. When should I build an MCP server vs define functions in API calls? Show practical examples of each approach and explain when they complement each other.

Frequently asked questions

Is MCP going to replace function calling?

No. MCP and function calling serve different purposes. Function calling is how models interact with tools at the API level. MCP is how hosts connect to external services. They work together — MCP hosts often use function calling internally to let the model select which MCP tool to invoke.

Can I use MCP with OpenAI models?

Yes. The OpenAI Agents SDK has built-in MCP support. It converts MCP tools into OpenAI's function calling format automatically. You can also use MCP servers in tools like Cursor that support multiple models including GPT-4o.

What happened to ChatGPT plugins?

OpenAI deprecated ChatGPT plugins in 2024. They were replaced by GPTs (custom GPTs) and the Agents SDK, which supports MCP. The plugin marketplace approach proved too restrictive compared to MCP's open ecosystem.

Should I migrate my existing function calling code to MCP?

Only if you need cross-host compatibility. If your application works fine with direct API function calling and only uses one model, there is no need to migrate. Add MCP when you want the same tools in Claude Desktop, Cursor, VS Code, etc.

Can RapidDev help decide between MCP and function calling?

Yes. RapidDev helps teams evaluate their AI architecture needs and choose the right combination of MCP servers, function calling, and direct API integrations for their specific use case.

Does MCP work with local/self-hosted models?

Yes. MCP is model-agnostic — any AI host that implements the MCP client protocol can use MCP servers, regardless of which model it runs. Hosts that use local models via Ollama or similar can still connect to MCP servers.
