Table of contents
- What is MCP?
- Core components
- Why MCP matters
- Graphite's use of MCP — GT MCP
- Example scenario
- Frequently asked questions
What is MCP?
"MCP" refers to the Model Context Protocol, an open-standard communication framework—created by Anthropic in November 2024—that allows AI systems (especially LLMs and agents) to connect in a consistent, tool-agnostic way to external data sources, services, or tools.
Think of MCP like a universal "USB-C port for AI": instead of building bespoke integrations for each tool or data source, AI systems can use MCP to discover, request, and interact with diverse resources via a standardized protocol.
Core components
- MCP client: Embedded in the AI model or its host (e.g., a desktop assistant or coding agent). It sends requests according to the MCP spec.
- MCP server: Bridges the AI system to a specific resource—e.g., a file system, API, database—and serves structured context or tooling to the AI client.
- The protocol is built atop JSON-RPC 2.0, supporting both stdio and HTTP transports, with bidirectional messaging and capability negotiation.
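Concretely, every MCP message is a JSON-RPC 2.0 envelope. Below is a minimal Python sketch of the `initialize` request a client sends to kick off capability negotiation; the field values are illustrative (a real handshake includes a fuller capabilities object), though `"2024-11-05"` is a published MCP protocol revision:

```python
import json

def make_initialize_request(request_id: int) -> str:
    """Build a JSON-RPC 2.0 'initialize' request, the first message
    an MCP client sends to negotiate capabilities with a server."""
    request = {
        "jsonrpc": "2.0",  # fixed JSON-RPC version tag
        "id": request_id,  # lets the client match the server's reply
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # an MCP spec revision
            "capabilities": {},               # what this client supports
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(request)

print(make_initialize_request(1))
```

The same envelope shape carries every subsequent request and notification, whether the transport underneath is stdio or HTTP.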
Why MCP matters
- Interoperability: With MCP, AI agents can leverage multiple tools or services without per-integration engineering.
- Context awareness: AI agents can dynamically fetch relevant context—like code, docs, or data—enhancing reasoning and output quality.
- Adoption momentum: Major players including OpenAI, Google DeepMind, Microsoft, Block, Replit, and Sourcegraph have integrated or announced support for MCP in 2025.
Graphite's use of MCP — GT MCP
Graphite has embedded MCP support into its CLI as GT MCP (Graphite CLI v1.6.7 and later), allowing AI agents like Claude Code to generate stacked PR workflows: breaking down large AI-generated diffs into smaller, coherent pull requests that are easier to review.
How it works
- The AI client sends its transformation intent to Graphite's MCP server over MCP.
- Graphite responds by organizing the change into a stack of sequential, stepwise PRs.
- Human reviewers validate each PR in order, mirroring how senior engineers break large changes into reviewable steps.
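In MCP terms, the first step above would typically arrive as a `tools/call` request (the standard MCP method for invoking a server-exposed tool). A hedged Python sketch of such a payload follows; the tool name `create_stack` and its arguments are hypothetical and do not reflect Graphite's actual tool schema:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 'tools/call' request, the MCP method an
    AI client uses to invoke a tool exposed by an MCP server."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Hypothetical example: ask a server to split a large diff into a
# stack of reviewable PRs (tool name and arguments are illustrative).
call = make_tool_call(2, "create_stack", {"branch": "refactor-auth", "max_prs": 3})
print(json.dumps(call, indent=2))
```

The server's reply would come back as a JSON-RPC result carrying the tool's output, which the agent can then act on.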
This workflow significantly improves reviewability—you'll get incremental, testable changes rather than one overwhelming diff.
If you'd like to get started, check out the official GT MCP documentation on graphite.dev. For general Graphite features that complement this workflow, see the key features guide.
Example scenario
Suppose you want an AI agent to refactor a legacy feature across multiple files. Using GT MCP, the agent:
- Receives context from your repo via MCP.
- Generates the first pull request: refactor core logic.
- Once approved, the next PR updates tests.
- Then, a final PR cleans up docs and comments.
Instead of reviewing a monolithic change, reviewers assess each step sequentially—clarity, safety, and maintainability all improved.
Frequently asked questions
Does Graphite's MCP server implement the Anthropic MCP spec?
Yes. Graphite's MCP server complies with the Model Context Protocol specification, allowing AI agents to connect via MCP and generate stacked PRs. It's an official MCP server, written in Go and released on July 21, 2025.
Are there known security risks with MCP?
Yes. Recent security research has identified risks like tool-poisoning and prompt injection. Mitigation strategies include strict authentication, tool filtering, logging, and governance frameworks like MCP Guardian.
How do I enable GT MCP in Graphite CLI?
Install or upgrade to Graphite CLI v1.6.7+, then register the server with:
claude mcp add graphite gt mcp
This configures the agent to use Graphite's MCP server.
Can I use MCP servers for other tools (e.g., databases)?
Yes. The MCP ecosystem includes servers for Google Drive, Slack, GitHub, PostgreSQL, Puppeteer, etc.—you can build custom MCP servers or use open-source ones as needed.
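To illustrate the server side, here is a toy, pure-Python dispatcher for a custom MCP-style server answering `tools/list`, the standard MCP method for advertising available tools. A real server would use an official MCP SDK and run over a stdio or HTTP transport; this sketch (with a made-up `query_db` tool) only shows the request/response shape:

```python
import json

# A made-up tool listing for illustration.
TOOLS = [
    {"name": "query_db", "description": "Run a read-only SQL query"},
]

def handle_request(raw: str) -> str:
    """Dispatch one JSON-RPC 2.0 request to a handler.
    Only 'tools/list' is implemented in this toy example."""
    req = json.loads(raw)
    if req.get("method") == "tools/list":
        resp = {"jsonrpc": "2.0", "id": req.get("id"),
                "result": {"tools": TOOLS}}
    else:
        # Standard JSON-RPC error for an unknown method.
        resp = {"jsonrpc": "2.0", "id": req.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return json.dumps(resp)

print(handle_request('{"jsonrpc": "2.0", "id": 7, "method": "tools/list"}'))
```

Wiring this loop to stdin/stdout (or an HTTP endpoint) and adding `initialize` and `tools/call` handlers is the core of what an MCP server does.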