MCP stopped being Anthropic's protocol in November 2025. That's not a criticism — it's the point. The 2025-11-25 spec update is less about technical changes and more about a governance shift that turns MCP into the common language every major AI platform now speaks. OpenAI adopted it. Google adopted it. Anthropic handed the keys to the Linux Foundation. If you're building on MCP, this is the moment everything changed.

What Changed in MCP 2025-11-25

The November 2025 release is the fourth version of the MCP spec. Unlike the June 2025 update — which was technically dense with structured outputs and the controversial removal of JSON-RPC batching — the November release is primarily about ecosystem. The technical additions are real, but the governance story is what builders need to understand.

Here's what actually shipped in 2025-11-25:

  • MCP Apps — a new primitive for interactive UI components inside Claude
  • Linux Foundation governance — MCP donated to the newly created AAIF
  • SEP process — a formal community mechanism for proposing spec changes
  • OpenAI and Google adoption — both companies formally committed to the spec

MCP Apps: Interactive UI Inside Claude

The most visible technical addition in this release is MCP Apps. Up until now, MCP servers could return text, structured data, images, and other content — but everything was rendered by Claude's default UI. MCP Apps change that.

With MCP Apps, a server can return an interactive dashboard component that displays directly inside the Claude conversation window. Think: a live analytics chart from Amplitude, a project board from Asana, a design preview from Figma — all rendered inline, all interactive, without the user opening another tab.

How MCP Apps Work

Under the hood, MCP Apps are sandboxed UI components defined by the server. When a tool call returns an app response, Claude's client renders it in an isolated frame. The user can interact with the component — filter data, click buttons, drill into details — and the component can make further MCP tool calls to update its own state.

This is a significant departure from the stateless request-response model that defined earlier MCP versions. MCP Apps introduce a degree of persistence and interactivity that makes Claude a genuine workspace surface, not just a chat window.
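
The published spec defines the exact wire format for app responses; as a rough illustration only, a tool-call result that embeds an interactive component might look something like the sketch below. The `ui://` URI scheme, the field names, and the Amplitude-flavoured example are assumptions for illustration, not the normative MCP Apps format.

```python
import json

# Hypothetical sketch of a JSON-RPC tool-call result embedding an
# interactive UI resource. The "ui://" URI and "text/html" mimeType are
# illustrative assumptions, not the normative MCP Apps wire format.
app_result = {
    "jsonrpc": "2.0",
    "id": 7,
    "result": {
        "content": [
            {
                "type": "resource",
                "resource": {
                    "uri": "ui://amplitude/retention-chart",  # hypothetical
                    "mimeType": "text/html",
                    "text": "<div id='chart'><!-- interactive chart --></div>",
                },
            }
        ]
    },
}

# The client renders the HTML in a sandboxed frame; the component can
# issue further tool calls back over the same MCP connection to update
# its own state.
print(json.dumps(app_result, indent=2))
```

The key idea is that the payload is still an ordinary tool result flowing over the existing connection; only the client's rendering treatment changes.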

Launch Partners

Anthropic launched MCP Apps with a set of integration partners who shipped production implementations on day one:

  • Amplitude — analytics dashboards with live charts and funnel visualisations
  • Asana — project boards, task lists, and timeline views
  • Figma — design previews and component inspection panels
  • Several additional partners across data, CRM, and productivity categories

These aren't toy demos. Each partner shipped a full MCP server that surfaces its core product inside Claude. If you have an Amplitude MCP server connected, you can ask Claude to show you your retention curve for the past 30 days and get an interactive chart — not a screenshot, not a table, but a live chart you can filter.

For a deeper look at how MCP Apps work and how to build them, see our dedicated MCP Apps explainer.

Linux Foundation Governance: Why This Matters

In late 2025, Anthropic formally donated MCP to the Linux Foundation, which created a new entity to govern it: the AI Agents Interoperability Foundation (AAIF).

This is a big deal, and it's worth understanding why companies donate specs to foundations instead of just open-sourcing them.

The Problem with Proprietary Protocols

When a single company owns a protocol, enterprise adoption slows down. Legal teams worry about licensing changes. Competitors hesitate to build on a rival's infrastructure. Standards bodies won't endorse vendor-controlled specs. And developers get nervous about betting their architecture on something one company can pivot away from overnight.

Anthropic solved all of this in one move. By placing MCP under the Linux Foundation's umbrella — the same foundation that governs Kubernetes, Node.js, and dozens of other critical infrastructure projects — they made MCP politically safe for everyone to adopt.

What the AAIF Actually Does

The AI Agents Interoperability Foundation is the governance body that now owns the MCP spec. It handles:

  • Reviewing and ratifying SEPs (Specification Enhancement Proposals)
  • Publishing official spec releases
  • Managing the conformance test suite
  • Arbitrating disputes about spec interpretation

Anthropic has a seat on the board, as do other major adopters. But no single company controls the spec. That's the point.

The SEP Process: Community-Driven Spec Evolution

Alongside the Linux Foundation handover, the 2025-11-25 release introduced the SEP (Specification Enhancement Proposal) process.

If you've worked with Python, you know what a PEP is. If you've worked with JavaScript, you know TC39 proposals. SEPs are the MCP equivalent — a formal, public process for proposing, debating, and ratifying changes to the spec.

How a SEP Works

The process has five stages:

  1. Draft — anyone can write a SEP and submit it to the AAIF repository
  2. Review — the community and working group comment and request changes
  3. Accepted — the AAIF board accepts the SEP in principle
  4. Implemented — a reference implementation ships
  5. Final — the change is incorporated into the next spec release
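
The five stages above form a strictly ordered lifecycle, which a toy model makes concrete. The class and the proposal title below are hypothetical; only the stage names come from the process itself.

```python
# Toy model of the SEP lifecycle: five ordered stages, no skipping.
SEP_STAGES = ["Draft", "Review", "Accepted", "Implemented", "Final"]

class SEP:
    def __init__(self, title: str):
        self.title = title
        self._stage = 0  # every SEP starts life as a Draft

    @property
    def stage(self) -> str:
        return SEP_STAGES[self._stage]

    def advance(self) -> str:
        """Move to the next stage; a Final SEP cannot advance further."""
        if self._stage == len(SEP_STAGES) - 1:
            raise ValueError(f"{self.title!r} is already Final")
        self._stage += 1
        return self.stage

sep = SEP("Add streaming resource updates")  # hypothetical proposal
print(sep.stage)   # Draft
print(sep.advance())  # Review
```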

Before the SEP process existed, spec changes were made by Anthropic engineers in relative privacy. You'd find out about breaking changes when the spec dropped. The SEP process makes the roadmap visible months in advance and gives builders a voice in decisions that affect their servers.

This is one of the most underrated changes in the November release. The technical features are exciting, but the governance changes determine whether MCP remains trustworthy infrastructure five years from now.

OpenAI and Google: MCP Becomes a Real Standard

The headline that cut through the noise: OpenAI and Google both formally adopted MCP.

OpenAI announced that ChatGPT and its API tooling would support MCP servers. Google announced MCP support across Gemini and its Workspace AI integrations. Neither company invented MCP. Neither company was consulted on its early design. Both chose to adopt it anyway, because the alternative — maintaining proprietary tool-connection protocols — would put them at a competitive disadvantage as MCP became the de facto standard.

What This Means in Practice

If you build an MCP server today, you're building a tool that works with:

  • Claude (all versions)
  • ChatGPT (with MCP support enabled)
  • Google Gemini and Workspace AI
  • Any other MCP-compliant client that ships in the future

This is the USB-C moment for AI tool integration. Before MCP, connecting a tool to multiple AI systems meant building multiple integrations — one for Claude, one for ChatGPT, one for whatever Google shipped. Now you build once and it works everywhere.
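
The "build once" claim rests on every client speaking the same JSON-RPC 2.0 methods. Real servers should use an official SDK; the hand-rolled dispatcher below is only a sketch of the request/response shapes involved, and the `get_weather` tool is a made-up example.

```python
# Minimal sketch of the JSON-RPC 2.0 shapes every MCP-compliant client
# speaks, regardless of vendor. Illustrative only — use an official SDK
# for production servers.
TOOLS = [{
    "name": "get_weather",  # hypothetical example tool
    "description": "Look up current weather for a city",
    "inputSchema": {"type": "object",
                    "properties": {"city": {"type": "string"}}},
}]

def handle(request: dict) -> dict:
    """Dispatch one decoded JSON-RPC request to the matching handler."""
    method = request["method"]
    if method == "initialize":
        result = {"protocolVersion": "2025-11-25",
                  "capabilities": {"tools": {}},
                  "serverInfo": {"name": "demo-server", "version": "0.1.0"}}
    elif method == "tools/list":
        result = {"tools": TOOLS}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601,  # JSON-RPC "method not found"
                          "message": f"Unknown method: {method}"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}

# Any MCP client opens with an initialize request:
resp = handle({"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}})
print(resp["result"]["protocolVersion"])  # 2025-11-25
```

Whether the caller is Claude, ChatGPT, or Gemini, the same handler serves them all; that is the whole economic argument for the standard.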

The implications are significant for anyone maintaining an integration. If you've been hesitating on building an MCP server because "it only works with Claude" — that objection is now gone.

What Changes for Regular Users

If you use Claude through claude.ai or the Claude desktop app, here's what the 2025-11-25 update means for you:

  • Richer server interfaces. MCP servers you connect can now surface interactive dashboards inside your conversations, not just text responses.
  • More servers to choose from. The OpenAI and Google adoptions will accelerate the number of MCP servers that developers publish, because a single server now reaches a much larger audience.
  • More stable ecosystem. Linux Foundation governance means the spec won't have surprise breaking changes. The SEP process makes the roadmap public.

The governance changes are mostly invisible to end users, but they're what make the ecosystem trustworthy for the long haul.

What Changes for Developers Building MCP Servers

If you're building MCP servers, the November 2025 release opens up several things:

MCP Apps Are Your New Frontend

You no longer have to accept that your server's output will be rendered as plain text in a chat UI. MCP Apps let you ship a real interface — interactive, stateful, visually designed — that lives inside the AI client. This changes the product surface area dramatically.

Building an analytics server? Return an MCP App with charts. Building a project management server? Return a board view. The user never leaves Claude to get a rich experience from your product.

Target the Right Spec Version

MCP Apps require a client that implements 2025-11-25. If you're building a server that targets older clients, you can't use MCP Apps yet. Check which spec version your target client supports before building app-style responses.
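
Because MCP version identifiers are date strings, a plain string comparison is enough to gate app-style responses on the client's advertised version. The cutoff below reflects this article's claim that MCP Apps require 2025-11-25; the helper name is made up.

```python
# Gate app-style responses on the protocol version the client advertised
# during initialize. Date-stamped MCP versions sort lexicographically,
# so string comparison answers "is this client new enough?".
APPS_MIN_VERSION = "2025-11-25"

def client_supports_apps(client_protocol_version: str) -> bool:
    """True if the client's spec version is at least the MCP Apps cutoff."""
    return client_protocol_version >= APPS_MIN_VERSION

for version in ["2025-03-26", "2025-06-18", "2025-11-25"]:
    print(version, client_supports_apps(version))
# 2025-03-26 False
# 2025-06-18 False
# 2025-11-25 True
```

A server that branches on this check can fall back to plain text or structured content for older clients instead of failing outright.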

For context on how earlier spec versions differ, see our comparison of MCP March vs June 2025.

Participate in the SEP Process

If you've ever run into a limitation in the MCP spec — something you wish it did that it doesn't — you can now formally propose a change. Submit a SEP to the AAIF repository, engage with the working group, and potentially get your change into a future spec release. This is a meaningful shift for the developer community.

Multi-Platform Publishing

With OpenAI and Google now MCP-native, you can publish your MCP server to multiple platform directories and reach all three major AI ecosystems. If your server was previously Claude-only, that constraint is lifted.

The Bigger Picture: MCP as the USB-C of AI Tools

The USB-C analogy gets used a lot in tech, but it applies cleanly here. Before USB-C, every device used its own connector. Before MCP, every AI system used its own tool-connection format. The cost was fragmentation: developers maintained parallel integrations, users had inconsistent experiences, and the ecosystem moved slowly.

USB-C didn't win because it was technically perfect. It won because enough powerful players adopted it to make it the default assumption. MCP is now in that position. When OpenAI and Google adopt the same protocol as Anthropic, the ecosystem tips. The question stops being "should I build to MCP?" and becomes "what should I build?"

The Linux Foundation governance and the SEP process are the institutional foundations that keep MCP trustworthy as it scales. A spec that can be changed overnight by one company doesn't attract long-term investment. A spec governed by a neutral foundation, with a public change process, does.

If you want to understand the full arc of how MCP got here, the complete MCP version history covers every release from the 2024 launch through this update.

Frequently Asked Questions

What are MCP Apps?

MCP Apps are interactive UI dashboards that MCP servers can surface directly inside a Claude conversation. Instead of plain text responses, servers can return rich components — charts, tables, data visualisers — that users can interact with without leaving Claude. The feature launched with production implementations from Amplitude, Asana, Figma, and other partners.

Why did Anthropic donate MCP to the Linux Foundation?

Donating MCP to the Linux Foundation signals that MCP is an open industry standard, not a proprietary Anthropic protocol. Neutral governance encourages enterprise adoption, eliminates vendor lock-in concerns, and makes it easier for companies like OpenAI and Google to formally adopt the spec without being seen as endorsing a competitor's product. The Linux Foundation has a track record of governing critical open infrastructure (Kubernetes, Node.js, etc.) that enterprises trust.

Have OpenAI and Google really adopted MCP?

Yes. Both OpenAI and Google officially adopted MCP in 2025. This means their AI systems can connect to MCP servers, and any tool you build to the MCP spec can work across Claude, ChatGPT, and Google's AI products. You write the integration once; it reaches all three ecosystems.

What is a SEP?

SEP stands for Specification Enhancement Proposal. It's the formal community process for proposing changes to the MCP spec — similar to Python's PEP system or JavaScript's TC39 proposals. Any developer can submit a SEP to the AAIF repository. It goes through review, acceptance, implementation, and finally incorporation into a spec release. The process makes the MCP roadmap transparent and gives the developer community direct input into how the spec evolves.