MCP (Model Context Protocol)
Also known as: Model Context Protocol
An open protocol, introduced by Anthropic in November 2024, for connecting AI models to external tools, data sources, and services through a standard interface.
What it means
MCP (Model Context Protocol) is an open standard that defines how AI applications expose tools, data, and prompts to LLMs. Think of it as USB-C for AI: instead of every chatbot reinventing the integration with GitHub, Notion, your filesystem, and your database, you write one MCP server and any MCP-aware client (Claude Desktop, Cursor, ChatGPT, Zed, Continue, custom apps) can use it.
The protocol has three primitives: tools (functions the model can call), resources (data the model can read, like files or DB rows), and prompts (reusable templates). A server advertises what it offers, a client connects, and the LLM gets a typed list of capabilities it can invoke. Messages are JSON-RPC 2.0, carried over stdio or HTTP: boring on purpose.
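To make that concrete, here is a toy sketch of the request/response shape, assuming a single hypothetical `echo` tool. The method names (`tools/list`, `tools/call`) follow the published spec, but real servers use an MCP SDK rather than hand-rolling the dispatch like this:

```python
# Toy in-memory MCP-style server: one tool, two JSON-RPC methods.
TOOLS = [{
    "name": "echo",
    "description": "Return the input text unchanged.",
    "inputSchema": {
        "type": "object",
        "properties": {"text": {"type": "string"}},
        "required": ["text"],
    },
}]

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC request to a JSON-RPC response."""
    method = request["method"]
    params = request.get("params", {})
    if method == "tools/list":
        # The client asks what the server offers; the model sees this list.
        result = {"tools": TOOLS}
    elif method == "tools/call" and params.get("name") == "echo":
        # The model invokes a tool; the server returns typed content.
        result = {"content": [{"type": "text",
                               "text": params["arguments"]["text"]}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# A client would serialize these dicts as JSON frames over stdio or HTTP.
listing = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
call = handle({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
               "params": {"name": "echo", "arguments": {"text": "hi"}}})
```

The whole design fits in a dispatcher this small, which is the point: the hard part of an integration is the tool itself, not the plumbing.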
Anthropic open-sourced MCP in November 2024. By 2025 it had been adopted by OpenAI for ChatGPT, Google for Gemini, and most major IDE and agent frameworks. By 2026 it's the de facto integration standard — there are thousands of MCP servers for everything from Stripe and Linear to Postgres and Kubernetes. If you're building tool integrations and not using MCP, you're rebuilding what's already standardized.
The catch: MCP standardizes the wire format, not the quality. A bad MCP server with vague tool descriptions and leaky abstractions makes any model worse. The good ones look like well-designed APIs — narrow tools, clear naming, structured errors.
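The difference shows up in the tool definitions themselves. Both of the definitions below are hypothetical, but they illustrate the contrast: a catch-all tool with a vague description leaves the model guessing, while a narrow tool with a typed schema tells it exactly when and how to call:

```python
# Hypothetical tool definitions; names and schemas are illustrative.

# Leaky: one catch-all tool, vague description, untyped input.
vague = {
    "name": "db",
    "description": "Does database stuff.",
    "inputSchema": {"type": "object",
                    "properties": {"q": {"type": "string"}}},
}

# Narrow: one job, clear name, constrained inputs, stated limits.
narrow = {
    "name": "list_overdue_invoices",
    "description": ("Return invoices past their due date, oldest first. "
                    "Read-only; returns at most 100 rows."),
    "inputSchema": {
        "type": "object",
        "properties": {
            "min_days_overdue": {"type": "integer", "minimum": 1},
        },
        "required": ["min_days_overdue"],
    },
}
```

The second definition reads like a good API endpoint, which is exactly the design instinct that transfers.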
Example
You install the Postgres MCP server. Claude Desktop, Cursor, and your custom agent can now all query your database with the same `query_postgres` tool — no per-app integration code.
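For Claude Desktop, that installation is one config entry. The sketch below assumes a community Postgres server launched via `npx`; the exact package name, arguments, and connection string are illustrative and may differ for the server you pick:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres",
               "postgresql://localhost/mydb"]
    }
  }
}
```

Cursor and custom agents use their own config files, but the server process they launch is the same one.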
Why it matters
MCP is the closest thing AI has to a standard right now. It removes the N×M integration problem (N AI apps × M services) and replaces it with N+M: each service needs one server, each app needs one client. If you're choosing how to expose tools to LLMs in 2026, MCP is the safe default; it works across providers and isn't going away.