
Logfire MCP Server
Empower your AI agents with direct access to your app’s traces and metrics for rapid debugging, exception tracking, and telemetry insights using Logfire MCP Server in FlowHunt.
The Logfire MCP Server is a Model Context Protocol (MCP) server that allows AI assistants and LLMs to access, retrieve, and analyze telemetry data sent to Logfire via the OpenTelemetry standard. By connecting your Logfire project, this server lets AI-driven tools and agents query distributed traces, inspect exception patterns, and run custom SQL queries over your application’s metrics and tracing data using the Logfire APIs. This integration enables rapid troubleshooting, observability, and the automation of common telemetry analysis tasks, providing developers with enhanced workflows for debugging, monitoring, and insight generation directly from their development environments or AI-assisted agents.
No explicit prompt templates are documented in the repository.
No explicit resources (as MCP resources) are documented in the repository.
find_exceptions: Retrieves exception counts from traces, grouped by file, within a specified time window.
find_exceptions_in_file: Provides detailed trace information about exceptions occurring in a specific file over a given timeframe.
arbitrary_query: Executes custom SQL queries on OpenTelemetry traces and metrics, allowing flexible data exploration.
get_logfire_records_schema: Returns the OpenTelemetry schema, enabling users to craft more precise custom queries.
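For illustration, the arbitrary_query tool accepts a SQL string over Logfire's telemetry data. The sketch below shows how an agent might compose such a query in Python; the table and column names (records, file_path, is_exception, start_timestamp) are assumptions for illustration — use get_logfire_records_schema to discover the actual schema first.

```python
def build_exception_query(minutes: int, limit: int = 10) -> str:
    """Compose a SQL string for the arbitrary_query tool.

    Table/column names here (records, file_path, is_exception,
    start_timestamp) are illustrative assumptions; confirm them with
    get_logfire_records_schema before querying.
    """
    return (
        "SELECT file_path, count(*) AS exception_count "
        "FROM records "
        "WHERE is_exception "
        f"AND start_timestamp > now() - interval '{minutes} minutes' "
        "GROUP BY file_path "
        f"ORDER BY exception_count DESC LIMIT {limit}"
    )

print(build_exception_query(60))
```

The resulting string would be passed as the query argument of an arbitrary_query tool call.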
Exception Monitoring and Analysis: Developers can quickly surface which files are generating the most exceptions, identify trends, and focus debugging efforts.
Root Cause Analysis: By drilling down into exception details within a specific file, teams can accelerate the identification and resolution of critical issues.
Custom Telemetry Reporting: The ability to run arbitrary SQL queries empowers teams to generate bespoke metrics reports and dashboards tailored to their unique needs.
Schema Exploration: With access to the OpenTelemetry schema, developers can better understand the available data fields to optimize custom queries and integrations.
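To make schema exploration concrete: once an agent has retrieved the schema, it can filter column definitions while planning a query. A minimal sketch, assuming the schema arrives as JSON with name/type pairs (the actual get_logfire_records_schema response shape may differ):

```python
import json

# Hypothetical schema payload; the real response shape may differ.
schema_json = """
{"columns": [
  {"name": "start_timestamp", "type": "timestamp"},
  {"name": "message", "type": "text"},
  {"name": "is_exception", "type": "boolean"}
]}
"""

def columns_of_type(schema: str, sql_type: str) -> list[str]:
    """Return the names of all columns with the given SQL type."""
    cols = json.loads(schema)["columns"]
    return [c["name"] for c in cols if c["type"] == sql_type]

print(columns_of_type(schema_json, "boolean"))  # → ['is_exception']
```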
No setup instructions provided for Windsurf.
{
  "command": ["uvx"],
  "args": ["logfire-mcp"],
  "type": "stdio",
  "env": {
    "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
  }
}

Replace "YOUR_TOKEN" with your actual Logfire read token.

Securing API Keys: Store your token in the env section as above to keep it out of command-line arguments and source control.
Make sure you have uv installed, then create a .cursor/mcp.json file in your project root:

{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": ["logfire-mcp", "--read-token=YOUR-TOKEN"]
    }
  }
}

Replace "YOUR-TOKEN" with your actual Logfire read token.

Note: Cursor does not support the env field; pass the token via the --read-token argument instead.
Add the following to your cline_mcp_settings.json:

{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": ["logfire-mcp"],
      "env": {
        "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}

Replace "YOUR_TOKEN" with your Logfire read token.

Securing API Keys: Storing the token in the env field keeps it out of the command line and your source history.
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "logfire": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change "logfire" to the actual name of your MCP server and to replace the URL with your own MCP server's URL.
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates are documented. |
| List of Resources | ⛔ | No resources are documented. |
| List of Tools | ✅ | 4 tools documented: exceptions, queries, and schema access. |
| Securing API Keys | ✅ | Environment variable and config JSON examples provided. |
| Sampling Support (less important in evaluation) | ⛔ | No mention of sampling support. |
Based on the above, Logfire MCP Server is a focused, production-quality MCP server for observability, but lacks documentation for prompt templates, resources, roots, or sampling support. It excels at exposing a small set of high-value tools for telemetry and debugging. Final rating: 6/10 — excellent for its use case, but not a full-featured MCP reference implementation.
| Has a LICENSE | ⛔ (No LICENSE file found) |
| --- | --- |
| Has at least one tool | ✅ |
| Number of Forks | 9 |
| Number of Stars | 77 |
The Logfire MCP Server enables AI agents and LLMs to access and analyze telemetry data (traces, metrics, exceptions) collected via OpenTelemetry, using Logfire APIs for real-time observability and troubleshooting.
Logfire MCP exposes tools for exception counting and drilling down (find_exceptions, find_exceptions_in_file), custom SQL over telemetry (arbitrary_query), and schema discovery (get_logfire_records_schema).
Store your Logfire read token in environment variables (env fields in config) for Claude and Cline, and as a CLI argument for Cursor. Avoid hardcoding tokens in source-controlled files.
Typical use cases include exception monitoring, root cause analysis, custom telemetry reporting, and schema exploration—all accessible to AI agents in FlowHunt via the MCP integration.
Add the MCP component in your FlowHunt flow, configure it with your Logfire MCP server details, and your AI agent will be able to run queries and analyses on your application's telemetry data.
Integrate Logfire MCP Server with FlowHunt to unlock real-time telemetry queries, exception insights, and custom reporting for your AI-powered workflows.