
Databricks Genie MCP Server
Bridge your AI assistant with Databricks using the Genie MCP Server to unlock natural language querying, workspace metadata access, and multi-turn conversation management for streamlined data-driven workflows.
The Databricks Genie MCP Server is a Model Context Protocol (MCP) server designed to bridge AI assistants and the Databricks Genie API. This integration empowers large language models (LLMs) to interact with Databricks environments using natural language. Through the server, LLMs can perform actions such as listing Genie spaces, retrieving workspace metadata, initiating and managing Genie conversations, and running SQL queries—all via standardized MCP tools. By acting as a connector, the Databricks Genie MCP Server enables developers to enhance their workflows with conversational data exploration, direct SQL querying, and seamless interaction with Databricks conversational agents, streamlining data-driven development and analysis.
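Under the hood, tool calls like "start a conversation" map to REST requests against the Genie API. As a minimal sketch (assuming the Databricks endpoint path `/api/2.0/genie/spaces/{space_id}/start-conversation`; the helper name is illustrative, not from the repository), preparing such a request could look like:

```python
import json

def start_conversation_request(host: str, token: str, space_id: str, question: str):
    """Build the URL, headers, and JSON body for starting a Genie conversation.

    Assumes the Genie REST path /api/2.0/genie/spaces/{space_id}/start-conversation;
    verify against the official Databricks API docs before use.
    """
    url = f"https://{host}/api/2.0/genie/spaces/{space_id}/start-conversation"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"content": question})
    return url, headers, body

# Example: prepare a natural-language question for a Genie space.
url, headers, body = start_conversation_request(
    "your-databricks-instance.cloud.databricks.com",
    "your-personal-access-token",
    "1234",
    "What were last month's total sales?",
)
```

The MCP server wraps requests like this one in standardized tools, so the LLM never handles tokens or URLs directly.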
No explicit prompt templates are documented in the repository.
No explicit resources are described in the repository.
Create a `.env` file with your Databricks credentials (`DATABRICKS_HOST` and `DATABRICKS_TOKEN`), then register the server in your MCP client configuration:

```json
{
  "mcpServers": {
    "databricks-genie": {
      "command": "python",
      "args": ["main.py"]
    }
  }
}
```
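At startup, `main.py` needs to pick up those two variables. A minimal sketch using only the standard library (the function name and error message are illustrative, not taken from the repository; a library such as python-dotenv would typically load `.env` into the environment first):

```python
import os

def load_databricks_config() -> dict:
    """Read DATABRICKS_HOST and DATABRICKS_TOKEN from the environment.

    Assumes the variables were already loaded into the process environment
    (e.g. by python-dotenv reading the .env file) and fails fast otherwise.
    """
    host = os.environ.get("DATABRICKS_HOST")
    token = os.environ.get("DATABRICKS_TOKEN")
    missing = [name for name, value in
               [("DATABRICKS_HOST", host), ("DATABRICKS_TOKEN", token)]
               if not value]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {missing}")
    return {"host": host, "token": token}
```

Failing fast with a clear message here avoids confusing authentication errors later, once the LLM starts issuing tool calls.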
Credentials can also be supplied through the configuration's `env` block, with the placeholders replaced by your actual values:

```json
{
  "env": {
    "DATABRICKS_HOST": "your-databricks-instance.cloud.databricks.com",
    "DATABRICKS_TOKEN": "your-personal-access-token"
  },
  "inputs": {}
}
```
Populate `.env` with your Databricks host and token, then install the server with `mcp install main.py`.
Make sure `.env` is configured; the MCP client entry uses the same JSON shown above.
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "databricks-genie": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP server as a tool with access to all its functions and capabilities. Remember to change "databricks-genie" to the actual name of your MCP server and replace the URL with your own MCP server URL.
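With the `streamable_http` transport, the client talks to the server over JSON-RPC 2.0, the wire format the MCP specification uses. As a sketch, this is the request body a client would POST to the configured URL to discover the server's tools (no network call is made here, and the helper name is illustrative):

```python
import json

def jsonrpc_request(method, params=None, req_id=1):
    """Serialize an MCP JSON-RPC 2.0 request, e.g. for the `tools/list` method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params or {},
    })

# Body a client would POST to https://yourmcpserver.example/pathtothemcp/url
list_tools = jsonrpc_request("tools/list")
```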
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates described in the repository |
| List of Resources | ⛔ | No explicit MCP resources documented |
| List of Tools | ✅ | 4 tools: see section above |
| Securing API Keys | ✅ | Described via `.env` and JSON example |
| Sampling Support (less important in evaluation) | ⛔ | No mention |
The Databricks Genie MCP Server provides a practical bridge between Databricks and LLMs, with clear setup instructions and tooling. However, it lacks prompt templates, explicit resources, and documentation on advanced MCP features like sampling or roots. The core tools are well-defined and useful for Databricks users. Overall, it scores above average but would benefit from richer MCP feature utilization.
| Has a LICENSE | Yes (MIT) |
| --- | --- |
| Has at least one tool | Yes |
| Number of Forks | 1 |
| Number of Stars | 3 |
It is a Model Context Protocol server that connects large language models to Databricks Genie, enabling natural language interaction, SQL query generation, and workspace metadata retrieval directly from AI assistants.
You can list Genie spaces, retrieve space metadata, initiate and manage Genie conversations with natural language, and run or follow up on SQL queries.
It streamlines data exploration by allowing conversational, multi-turn queries and automated SQL generation, making data analysis more accessible and reducing manual SQL writing.
Credentials such as Databricks host and token are managed via environment variables, never hardcoded, to ensure sensitive information remains secure.
No, the repository does not include explicit prompt templates or additional MCP resources, but the core tools for conversation and SQL querying are fully supported.
Unlock conversational data analysis and direct SQL querying inside FlowHunt by connecting your Databricks workspace with the Genie MCP Server.