# Use MCP Server

Trunk Flaky Tests includes a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/docs/getting-started/intro) server. AI applications like Claude Code or Cursor can use MCP servers to connect to data sources, tools, and workflows, enabling them to access key information and perform tasks.

### Supported AI applications

The following applications are currently supported: Cursor, Claude Code, Gemini CLI, and GitHub Copilot.

{% hint style="info" %}
Gemini Code Assist and Windsurf are not supported due to their limited support for MCP servers.
{% endhint %}

### API

The Trunk MCP server is available at `https://mcp.trunk.io/mcp` and exposes the following tools:

<table><thead><tr><th width="265.30859375">Tool</th><th>Capability</th></tr></thead><tbody><tr><td><a href="use-mcp-server/mcp-tool-reference/get-root-cause-analysis"><code>fix-flaky-test</code></a></td><td>Experimental: Retrieve insights around a failing/flaky test</td></tr><tr><td><a href="use-mcp-server/mcp-tool-reference/set-up-test-uploads"><code>setup-trunk-uploads</code></a></td><td>Experimental: Create a setup plan to upload test results</td></tr></tbody></table>

### Authorization

The Trunk MCP server supports OAuth 2.0 with OpenID Connect for MCP authorization.

### Get started

**To get started, configure your AI application to communicate with Trunk's MCP server:**

* [Cursor](https://docs.trunk.io/flaky-tests/use-mcp-server/configuration/cursor-ide)
* [GitHub Copilot](https://docs.trunk.io/flaky-tests/use-mcp-server/configuration/github-copilot-ide)
* [Claude Code CLI](https://docs.trunk.io/flaky-tests/use-mcp-server/configuration/claude-code-cli)
* [Gemini CLI](https://docs.trunk.io/flaky-tests/use-mcp-server/configuration/gemini-cli)
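
The exact file location and schema differ per client, but most follow the same pattern: register the server URL in an MCP configuration file. As an illustrative sketch only (the `trunk` server name is arbitrary, and the Cursor-style `mcpServers` layout shown here is an assumption — consult the guide for your client above for the exact format), a `.cursor/mcp.json` entry might look like:

```json
{
  "mcpServers": {
    "trunk": {
      "url": "https://mcp.trunk.io/mcp"
    }
  }
}
```

On first use, your client should prompt you to complete the OAuth authorization flow described above before the Trunk tools become available.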
