Tempo MCP Server Integration
Integrate Grafana Tempo tracing data with AI assistants using the Tempo MCP Server for seamless distributed system observability and real-time debugging within FlowHunt flows.

What does “Tempo” MCP Server do?
The Tempo MCP Server is a Go-based implementation of the Model Context Protocol (MCP) that integrates with Grafana Tempo, a distributed tracing backend. This server enables AI assistants to query and analyze distributed tracing data, allowing developers to gain insights into application performance and trace system behavior. By exposing tool definitions compatible with MCP, the Tempo MCP Server empowers AI clients (such as Claude Desktop) to perform tasks like querying trace data, streaming real-time events, and integrating tracing information into development workflows. Its support for both HTTP (with SSE for real-time updates) and standard input/output ensures flexible integration with a wide array of platforms and tools, enhancing observability and debugging capabilities for modern distributed systems.
List of Prompts
No prompt templates were found in the repository.
List of Resources
No explicit MCP resources were listed in the repository.
List of Tools
- Tempo Query Tool
- Allows AI clients to query and analyze distributed tracing data from Grafana Tempo. This tool provides programmatic access to trace data, enabling in-depth inspection of system performance and behavior via the MCP interface.
Use Cases of this MCP Server
- Distributed Tracing Analysis
- Developers can use AI assistants to query and visualize trace data from Grafana Tempo, helping them identify performance bottlenecks and debug distributed systems more effectively.
- Real-Time Event Streaming
- By leveraging the SSE endpoint, users can stream real-time trace events, making it easier to monitor system health and respond quickly to issues as they arise.
- Integration with AI Development Tools
- The MCP server can be integrated with AI clients like Claude Desktop, enabling contextual trace queries and automating observability tasks within developer workflows.
- Automated Debugging
- AI-powered tools can utilize Tempo’s trace data to suggest fixes, highlight anomalies, or provide summaries of system execution, thereby accelerating the debugging process.
How to set it up
Windsurf
- Ensure Go 1.21+ and Docker are installed.
- Build the server:
go build -o tempo-mcp-server ./cmd/server
- Add the MCP server configuration in Windsurf’s configuration file:
{
  "mcpServers": {
    "tempo": {
      "command": "./tempo-mcp-server",
      "args": []
    }
  }
}
- Save the configuration and restart Windsurf.
- Verify integration by connecting an AI client to the MCP server endpoint.
Securing API Keys
Use environment variables for sensitive data:
{
  "mcpServers": {
    "tempo": {
      "command": "./tempo-mcp-server",
      "env": {
        "SSE_PORT": "8080"
      },
      "inputs": {}
    }
  }
}
Claude
- Ensure Go 1.21+ and Docker are installed.
- Build and run the server as described above.
- Edit the Claude configuration to add:
{
  "mcpServers": {
    "tempo": {
      "command": "./tempo-mcp-server",
      "args": []
    }
  }
}
- Restart Claude and test the MCP connection.
Cursor
- Install prerequisites (Go, Docker).
- Build and run `tempo-mcp-server`.
- Update Cursor’s configuration with:
{
  "mcpServers": {
    "tempo": {
      "command": "./tempo-mcp-server",
      "args": []
    }
  }
}
- Restart Cursor and verify that the MCP server appears as a tool.
Cline
- Install Go 1.21+ and Docker.
- Build/run the server using `go build` or Docker.
- Add to Cline’s MCP servers config:
{
  "mcpServers": {
    "tempo": {
      "command": "./tempo-mcp-server",
      "env": {
        "SSE_PORT": "8080"
      }
    }
  }
}
- Save changes and restart Cline.
- Confirm connectivity with the MCP server on the specified port.
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "tempo": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “tempo” to the actual name of your MCP server and replace the URL with your own MCP server URL.
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Found in README.md |
| List of Prompts | ⛔ | No prompt templates found in the repository |
| List of Resources | ⛔ | No explicit MCP resources listed |
| List of Tools | ✅ | Tempo Query Tool |
| Securing API Keys | ✅ | Example environment variable usage in setup instructions |
| Sampling Support (less important in evaluation) | ⛔ | No evidence of sampling support in documentation or code |
Based on the above data, the Tempo MCP Server provides a practical integration for distributed tracing with Grafana Tempo, but lacks comprehensive MCP prompt templates and resource definitions, and does not explicitly support sampling or roots as per available documentation. The setup is straightforward for developers familiar with Go and Docker, but the overall MCP feature set is limited.
MCP Score
| Has a LICENSE | ⛔ (No LICENSE file found) |
| --- | --- |
| Has at least one tool | ✅ (Tempo Query Tool) |
| Number of Forks | 0 |
| Number of Stars | 2 |
Our opinion:
Given the limited set of MCP features (no prompts/resources, no explicit sampling/roots support, and no license), but with a working tool and clear setup, this MCP scores a 3/10 for overall protocol implementation and ecosystem readiness.
Frequently asked questions
- What is the Tempo MCP Server?
The Tempo MCP Server is a Go-based implementation of the Model Context Protocol that connects AI assistants with Grafana Tempo, enabling them to query and analyze distributed tracing data for improved observability and debugging.
- What can I do with the Tempo Query Tool?
The Tempo Query Tool allows AI clients to programmatically access and analyze trace data from Grafana Tempo, helping you inspect system performance, trace system behavior, and identify bottlenecks or anomalies in distributed applications.
- How do I integrate the Tempo MCP Server into my FlowHunt workflow?
Add the MCP component to your FlowHunt flow and configure it with your Tempo MCP server details using the provided JSON format. This enables your AI agent to use all supported tools and functions from the MCP server.
- Does the Tempo MCP Server support real-time event streaming?
Yes. By using the SSE (Server-Sent Events) endpoint, the Tempo MCP Server allows you to stream real-time trace events for live monitoring and quick response to system issues.
- Are there any prompt templates or resource definitions available?
No. This MCP server does not include prompt templates or explicit resource definitions. It currently provides core tracing query capabilities via the Tempo Query Tool.
- Is there a license for this MCP server?
No LICENSE file was found in the repository. Please contact the maintainer for information regarding usage and licensing.
Empower Your AI with Distributed Tracing
Connect your AI workflows to distributed tracing data using the Tempo MCP Server and gain actionable insights into your systems’ performance and behavior.