
Databricks MCP Server
Seamlessly connect AI agents to Databricks for autonomous metadata exploration, SQL query execution, and advanced data automation using the Databricks MCP Server.
The Databricks MCP Server acts as a Model Context Protocol (MCP) server that connects AI assistants directly to Databricks environments, with a specific focus on leveraging Unity Catalog (UC) metadata. Its primary function is to enable AI agents to autonomously access, understand, and interact with Databricks data assets. The server provides tools that allow agents to explore UC metadata, comprehend data structures, and execute SQL queries. This empowers AI agents to answer data-related questions, perform database queries, and fulfill complex data requests independently, without requiring manual intervention at each step. By making detailed metadata accessible and actionable, the Databricks MCP Server enhances AI-driven development workflows and supports intelligent data exploration and management on Databricks.
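The pattern described above — discover a table through Unity Catalog metadata, then query it — can be illustrated with a minimal, dependency-free Python sketch. The function name here is hypothetical and is not part of the server's actual tool API; it only shows how UC's three-level namespace turns into an executable statement:

```python
def build_preview_query(catalog: str, schema: str, table: str, limit: int = 10) -> str:
    """Compose a preview query for a Unity Catalog table.

    Unity Catalog addresses every table with a three-level namespace:
    catalog.schema.table. An agent that has discovered these names via
    metadata exploration can assemble them into a SQL statement.
    """
    fully_qualified_name = f"{catalog}.{schema}.{table}"
    return f"SELECT * FROM {fully_qualified_name} LIMIT {limit}"


# Example: an agent that found main.sales.orders in the metadata could
# pass this statement to the server's SQL execution tool.
print(build_preview_query("main", "sales", "orders"))
# → SELECT * FROM main.sales.orders LIMIT 10
```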
No specific prompt templates are mentioned in the repository or documentation.
No explicit list of MCP resources is provided in the repository or documentation.
The following tools and features are described in the documentation as being available: Unity Catalog metadata exploration, data structure comprehension, SQL query execution, and autonomous agent modes for multi-step data tasks.
No Windsurf-specific setup instructions or JSON snippets are provided.
No Claude-specific setup instructions or JSON snippets are provided.
The repository mentions integration with Cursor. Install the dependencies listed in requirements.txt, then add the server to the mcpServers object of your Cursor configuration:
{
  "mcpServers": {
    "databricks-mcp": {
      "command": "python",
      "args": ["main.py"]
    }
  }
}
Securing API Keys using Environment Variables (example):
{
  "databricks-mcp": {
    "command": "python",
    "args": ["main.py"],
    "env": {
      "DATABRICKS_TOKEN": "YOUR_API_KEY"
    }
  }
}
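On the server side, a value set under "env" is just an ordinary environment variable of the launched process. Here is a minimal sketch of reading it safely — the helper function is hypothetical, not code from the repository's main.py:

```python
import os


def load_databricks_credentials() -> dict:
    # Read the token injected via the "env" block of the MCP config,
    # instead of hardcoding it in the JSON file or in source code.
    token = os.environ.get("DATABRICKS_TOKEN")
    if not token:
        raise RuntimeError(
            "DATABRICKS_TOKEN is not set; configure it in the 'env' "
            "section of your MCP server entry."
        )
    return {"token": token}
```

Failing fast with a clear message when the variable is missing makes misconfiguration obvious at startup rather than at the first API call.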
No Cline-specific setup instructions or JSON snippets are provided.
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "databricks-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “databricks-mcp” to your actual MCP server name and replace the URL with your own MCP server URL.
Section | Availability | Details/Notes |
---|---|---|
Overview | ✅ | Good summary and motivation available |
List of Prompts | ⛔ | No prompt templates found |
List of Resources | ⛔ | No explicit MCP resources listed |
List of Tools | ✅ | High-level tools described in documentation |
Securing API Keys | ✅ | Example with "env" provided in Cursor section |
Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on the available documentation, the Databricks MCP Server is well-scoped for Databricks/UC integration and agentic AI workflows, but is missing explicit prompt templates, resource lists, and mentions of roots or sampling features. Its setup and tool descriptions are clear for Cursor, but less so for other platforms.
The MCP server is focused and useful for Databricks + AI automation, but would benefit from more explicit documentation around prompts, resources, and multi-platform setup. For those seeking Databricks/UC integration, it is a solid and practical solution.
Repository fact | Status |
---|---|
Has a LICENSE | ✅ (MIT) |
Has at least one tool | ✅ |
Number of Forks | 5 |
Number of Stars | 11 |
The Databricks MCP Server is a Model Context Protocol server that connects AI agents to Databricks environments, enabling them to autonomously access Unity Catalog metadata, understand data structures, and perform SQL queries for advanced data exploration and automation.
It allows AI agents to explore Unity Catalog metadata, comprehend data structures, execute SQL queries, and operate in autonomous agent modes for multi-step data tasks.
Typical use cases include metadata discovery, automated SQL query building, data documentation assistance, intelligent data exploration, and complex task automation within Databricks.
You should use environment variables for sensitive information. In your MCP server configuration, set the `DATABRICKS_TOKEN` as an environment variable rather than hardcoding it.
Add the MCP component to your FlowHunt flow, configure it with your server details, and connect it to your AI agent. Use the provided JSON format in the system MCP configuration section to specify your Databricks MCP server connection.
Enable your AI workflows to interact directly with Databricks Unity Catalog metadata and automate data tasks. Try it with FlowHunt today.