
Simple Loki MCP Server
The Simple Loki MCP Server integrates Grafana Loki log querying into AI workflows via the Model Context Protocol. It enables AI agents to analyze, filter, and retrieve log data stored in Loki.
Integrate Grafana Loki log querying into your AI workflows with the Loki MCP Server for real-time insights, monitoring, and operational automation.
The Loki MCP Server is a Go-based implementation of the Model Context Protocol (MCP) designed to integrate with Grafana Loki, a log aggregation system. It serves as a bridge between AI assistants and external log data sources, enabling the AI to query and interact with log streams stored in Loki. By exposing Loki’s querying capabilities via the MCP protocol, developers and AI clients can enhance their workflows—such as searching, filtering, and analyzing logs—directly through standardized LLM-driven interfaces. This empowers tasks like real-time log investigation, troubleshooting, and dashboard creation, providing seamless access to operational data for improved observability and automation.
No prompt templates are documented in the repository.
No explicit MCP resources are described in the repository.
The server exposes a single tool, `loki_query`, for running LogQL queries against Loki. It accepts the following parameters:

query: LogQL query string
url: Loki server URL (default from LOKI_URL env or http://localhost:3100)
start: Start time for the query (default: 1 hour ago)
end: End time for the query (default: now)
limit: Max number of entries to return (default: 100)
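For illustration, here is a minimal sketch of the JSON-RPC tools/call request an MCP client might send to invoke this tool. The argument values (the LogQL selector, timestamps, and limit) are hypothetical placeholders; check the repository's README for the exact formats the server accepts.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "loki_query",
    "arguments": {
      "query": "{app=\"my-service\"} |= \"error\"",
      "url": "http://localhost:3100",
      "start": "2024-01-01T00:00:00Z",
      "end": "2024-01-01T01:00:00Z",
      "limit": 100
    }
  }
}

In practice the AI client builds this request for you; you supply a natural-language question, and the agent chooses the tool and fills in the arguments.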
Windsurf
Install Go 1.16 or higher.
Build the server:
go build -o loki-mcp-server ./cmd/server
Edit your Windsurf configuration to add the MCP server.
Add the Loki MCP server with a JSON snippet (adapt as needed):
{
  "mcpServers": {
    "loki-mcp": {
      "command": "./loki-mcp-server",
      "args": []
    }
  }
}
Save the configuration and restart Windsurf.
Verify the server is running and accessible.
Securing API Keys (Environment Variables Example):
{
  "mcpServers": {
    "loki-mcp": {
      "command": "./loki-mcp-server",
      "env": {
        "LOKI_URL": "https://your-loki-server.example"
      }
    }
  }
}
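Note that "./loki-mcp-server" is resolved relative to the client's working directory, which may not be where you built the binary. Below is a minimal sketch combining an absolute path (a placeholder to adjust for your system) with the environment variable; the same pattern applies to the Claude, Cursor, and Cline configurations that follow.

{
  "mcpServers": {
    "loki-mcp": {
      "command": "/absolute/path/to/loki-mcp-server",
      "args": [],
      "env": {
        "LOKI_URL": "https://your-loki-server.example"
      }
    }
  }
}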
Claude
Install Go 1.16 or higher.
Build the server as above.
Open Claude’s MCP configuration file.
Add the Loki MCP server:
{
  "mcpServers": {
    "loki-mcp": {
      "command": "./loki-mcp-server",
      "args": []
    }
  }
}
Save the configuration and restart Claude.
Confirm the setup is working.
Securing API Keys:
{
  "mcpServers": {
    "loki-mcp": {
      "command": "./loki-mcp-server",
      "env": {
        "LOKI_URL": "https://your-loki-server.example"
      }
    }
  }
}
Cursor
Ensure Go 1.16+ is installed.
Build the Loki MCP server as above.
Edit Cursor’s configuration.
Add Loki MCP server entry:
{
  "mcpServers": {
    "loki-mcp": {
      "command": "./loki-mcp-server",
      "args": []
    }
  }
}
Save and restart Cursor.
Verify integration.
Using Environment Variables:
{
  "mcpServers": {
    "loki-mcp": {
      "command": "./loki-mcp-server",
      "env": {
        "LOKI_URL": "https://your-loki-server.example"
      }
    }
  }
}
Cline
Install Go 1.16 or higher.
Build with:
go build -o loki-mcp-server ./cmd/server
Locate Cline’s MCP server config.
Add Loki MCP server:
{
  "mcpServers": {
    "loki-mcp": {
      "command": "./loki-mcp-server",
      "args": []
    }
  }
}
Save and restart Cline.
Test the setup.
Secure API Keys via env:
{
  "mcpServers": {
    "loki-mcp": {
      "command": "./loki-mcp-server",
      "env": {
        "LOKI_URL": "https://your-loki-server.example"
      }
    }
  }
}
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "loki-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP server as a tool, with access to all its functions and capabilities. Remember to change "loki-mcp" to the actual name of your MCP server and to replace the URL with your own MCP server URL.
Section | Availability | Details/Notes |
---|---|---|
Overview | ✅ | Summary available in README.md |
List of Prompts | ⛔ | No prompt templates documented |
List of Resources | ⛔ | No explicit MCP resources listed |
List of Tools | ✅ | loki_query tool described in README.md |
Securing API Keys | ✅ | Uses LOKI_URL env variable |
Sampling Support (less important in evaluation) | ⛔ | No mention of sampling support |
Based on the above tables, Loki MCP Server offers a clear overview and a functional tool for querying logs, but lacks documented prompts, resources, and advanced MCP features like sampling or roots. The documentation is minimal, and setup is developer-oriented.
The Loki MCP Server is focused and functional for integrating LLMs with Grafana Loki log querying, but it is minimalistic and lacks breadth in MCP features and documentation. Overall, it scores 4/10: it works for its main purpose, but it is not a feature-complete, polished, or thoroughly documented MCP server.
Has a LICENSE | ⛔ |
---|---|
Has at least one tool | ✅ |
Number of Forks | 1 |
Number of Stars | 5 |
The Loki MCP Server is a Go-based service that connects AI assistants to Grafana Loki, allowing log data queries and analysis through the Model Context Protocol (MCP). It enables advanced log monitoring, troubleshooting, and dashboard automation within AI workflows.
It provides the `loki_query` tool, letting users query logs in Grafana Loki using LogQL, with support for parameters like query string, time range, and result limit.
Key use cases include log data exploration, automated log monitoring, AI-powered operational dashboards, and root cause analysis—all directly from your AI workflows.
Set sensitive information such as the Loki server URL via environment variables, for example: `LOKI_URL=https://your-loki-server.example` in your MCP server configuration.
No, it does not currently support prompt templates, sampling, or advanced MCP features—its functionality is focused on querying and analyzing logs through a single tool.
Add the MCP component to your FlowHunt flow, provide your Loki MCP connection details in JSON, and connect it to your AI agent. This enables direct log querying and analysis from your AI workflows.
Bridge the gap between AI and log data. Deploy Loki MCP Server to power advanced log analysis and monitoring in your FlowHunt workflows.