
Integrate your AI workflows with Kibela for real-time knowledge access, automated document retrieval, and enhanced team collaboration using the Kibela MCP Server.
The Kibela MCP Server is an implementation of the Model Context Protocol (MCP) designed to integrate with the Kibela API. By acting as a bridge between AI assistants and Kibela, it enables seamless access to external data, content, and services housed within Kibela workspaces. This integration allows AI agents to query, retrieve, and interact with documents and knowledge bases stored in Kibela, enhancing development workflows by automating tasks such as document search, information extraction, and collaboration. The Kibela MCP Server empowers developers and teams to leverage Large Language Models (LLMs) with up-to-date organizational knowledge, enabling efficient codebase exploration, knowledge management, and workflow automation through standardized MCP tools and resources.
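Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages over a transport such as stdio or HTTP. As a rough illustration of what "standardized MCP tools" means on the wire, a client asking a server to enumerate its tools sends a `tools/list` request. The tool entry below is hypothetical; the Kibela server's actual tool names are not publicly documented:

```javascript
// Illustrative only: MCP messages are JSON-RPC 2.0 envelopes.
// "tools/list" is the method name from the MCP specification;
// the tool shown in the response is a made-up example.
const listToolsRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
  params: {},
};

// A server reply echoes the request id and carries a `result` payload:
const exampleResponse = {
  jsonrpc: "2.0",
  id: 1,
  result: {
    tools: [
      // Hypothetical tool entry for illustration:
      { name: "searchNotes", description: "Search Kibela notes" },
    ],
  },
};

console.log(JSON.stringify(listToolsRequest));
```

Because every MCP server speaks this same envelope format, an AI assistant can discover and call Kibela-backed capabilities without Kibela-specific client code.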
No prompt templates are mentioned or defined in the available documentation or repository files.
No explicit resources are listed in the available documentation or repository files.
No explicit tools are listed in the available documentation or repository files. The repository is implemented in TypeScript/Node.js, so there is no server.py with tool definitions to map from.
Ensure Node.js is installed on your system.
Locate the Windsurf configuration file (typically windsurf.config.json).
Add the Kibela MCP Server package: @kiwamizamurai/mcp-kibela-server@latest
Insert the MCP server configuration under the mcpServers object:
```json
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["@kiwamizamurai/mcp-kibela-server@latest"]
    }
  }
}
```
Save and restart Windsurf.
Verify the server appears in the MCP server list.
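A quick way to catch typos before restarting the editor is to parse the configuration and check the entry programmatically. A small Node.js sketch, assuming the JSON shown in the steps above (swap in the contents of your actual config file):

```javascript
// Validate an mcpServers config entry before restarting the editor.
// `raw` stands in for the contents of windsurf.config.json.
const raw = `{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["@kiwamizamurai/mcp-kibela-server@latest"]
    }
  }
}`;

const config = JSON.parse(raw); // throws on malformed JSON
const entry = config.mcpServers?.kibela;

if (!entry || entry.command !== "npx" || !Array.isArray(entry.args)) {
  throw new Error("kibela MCP server entry is missing or malformed");
}
console.log("kibela entry OK:", entry.args[0]);
```

Since the same `mcpServers` snippet is used for Claude, Cursor, and Cline below, the same check works for those config files as well.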
Install Node.js if not already present.
Find and open Claude’s configuration file.
Add Kibela MCP Server as follows:
```json
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["@kiwamizamurai/mcp-kibela-server@latest"]
    }
  }
}
```
Restart Claude.
Confirm the integration by checking available MCP endpoints.
Install Node.js.
Edit cursor.config.json or the relevant MCP config file.
Add the following snippet:
```json
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["@kiwamizamurai/mcp-kibela-server@latest"]
    }
  }
}
```
Save and restart Cursor.
Test by initiating a Kibela-related query.
Make sure Node.js is installed.
Access the Cline MCP configuration file.
Add the Kibela server entry:
```json
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["@kiwamizamurai/mcp-kibela-server@latest"]
    }
  }
}
```
Save your changes and restart Cline.
Check that the Kibela MCP Server is running.
To secure your Kibela API keys, use environment variables. Here’s an example configuration:
```json
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["@kiwamizamurai/mcp-kibela-server@latest"],
      "env": {
        "KIBELA_API_KEY": "${KIBELA_API_KEY}"
      },
      "inputs": {
        "workspace": "your_workspace_name"
      }
    }
  }
}
```
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "kibela": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP server as a tool with access to all its functions and capabilities. Remember to change “kibela” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
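Before pasting the snippet into FlowHunt, it is worth checking that the entry is well-formed: the transport must be `streamable_http` and the URL must parse. A minimal Node.js sketch, using the placeholder URL from the snippet above:

```javascript
// Sanity-check a FlowHunt-style MCP configuration object.
const flowhuntConfig = {
  kibela: {
    transport: "streamable_http",
    url: "https://yourmcpserver.example/pathtothemcp/url",
  },
};

for (const [name, server] of Object.entries(flowhuntConfig)) {
  if (server.transport !== "streamable_http") {
    throw new Error(`${name}: unexpected transport "${server.transport}"`);
  }
  new URL(server.url); // throws if the URL is not parseable
  console.log(`${name}: config looks valid`);
}
```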
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | |
| List of Prompts | ⛔ | None found |
| List of Resources | ⛔ | None found |
| List of Tools | ⛔ | None found |
| Securing API Keys | ✅ | Environment variable example provided |
| Sampling Support (less important in evaluation) | ⛔ | Not specified |
The Kibela MCP Server provides basic documentation, a clear license, and setup instructions for major platforms. However, it lacks explicit lists of tools, resources, and prompt templates in the public documentation, which limits its out-of-the-box agentic utility. If these were added, its value would increase. As it stands, it’s suitable for basic Kibela integration but not for advanced or highly configurable MCP workflows.
| Has a LICENSE | ✅ (MIT) |
|---|---|
| Has at least one tool | ⛔ |
| Number of Forks | 5 |
| Number of Stars | 6 |
The Kibela MCP Server acts as a bridge between AI assistants and Kibela, allowing seamless access to documents and knowledge bases within your Kibela workspace for advanced workflow automation.
It can automate document search, retrieval, summarization, updating records, generating reports, and AI-powered collaboration tasks like tagging documents or notifying team members.
Use environment variables in your MCP server configuration to securely store your API keys. Refer to the documentation's example for how to set this up in your platform’s config file.
The public documentation does not list explicit prompt templates or tools. The integration focuses on connecting Kibela’s knowledge base to AI workflows.
Setup instructions are provided for Windsurf, Claude, Cursor, and Cline. Node.js is a prerequisite for all platforms.
Unlock seamless AI-powered access to your organizational knowledge base. Automate search, retrieval, and workflow tasks with the Kibela MCP Server.