Databricks MCP Server
Seamlessly connect AI agents to Databricks for autonomous metadata exploration, SQL query execution, and advanced data automation using the Databricks MCP Server.

What does the “Databricks” MCP Server do?
The Databricks MCP Server acts as a Model Context Protocol (MCP) server that connects AI assistants directly to Databricks environments, with a specific focus on leveraging Unity Catalog (UC) metadata. Its primary function is to enable AI agents to autonomously access, understand, and interact with Databricks data assets. The server provides tools that allow agents to explore UC metadata, comprehend data structures, and execute SQL queries. This empowers AI agents to answer data-related questions, perform database queries, and fulfill complex data requests independently, without requiring manual intervention at each step. By making detailed metadata accessible and actionable, the Databricks MCP Server enhances AI-driven development workflows and supports intelligent data exploration and management on Databricks.
List of Prompts
No specific prompt templates are mentioned in the repository or documentation.
List of Resources
No explicit list of MCP resources is provided in the repository or documentation.
List of Tools
The following tools and features are described in the documentation as being available:
- Explore Unity Catalog Metadata
  Allows AI agents to explore Databricks Unity Catalog metadata, including catalogs, schemas, tables, and columns.
- Understand Data Structures
  Enables agents to understand the structure of Databricks datasets, facilitating more accurate SQL query construction.
- Execute SQL Queries
  Provides the ability for AI agents to run SQL queries on Databricks, supporting various data requests and analysis.
- Autonomous Agent Actions
  Supports agent modes where the AI can iterate through requests and perform complex, multi-step data tasks independently.
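In practice, metadata-exploration tools like these typically resolve to SQL queries against Unity Catalog's `information_schema`. A minimal sketch of how such queries might be composed — the function names are illustrative, not the server's actual API:

```python
# Hypothetical helpers for building Unity Catalog metadata queries.
# Each returns a SQL string an agent could execute via the server's query tool.

def list_schemas_query(catalog: str) -> str:
    """SQL to list all schemas in a Unity Catalog catalog."""
    return f"SELECT schema_name FROM {catalog}.information_schema.schemata"

def list_tables_query(catalog: str, schema: str) -> str:
    """SQL to list tables (and their types) in a given schema."""
    return (
        f"SELECT table_name, table_type "
        f"FROM {catalog}.information_schema.tables "
        f"WHERE table_schema = '{schema}'"
    )

def describe_columns_query(catalog: str, schema: str, table: str) -> str:
    """SQL to inspect column names and data types for a table."""
    return (
        f"SELECT column_name, data_type "
        f"FROM {catalog}.information_schema.columns "
        f"WHERE table_schema = '{schema}' AND table_name = '{table}'"
    )
```

With structure information in hand from `describe_columns_query`, an agent can construct better-targeted SQL for the actual data request.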
Use Cases of this MCP Server
- Database Metadata Discovery
  AI agents can autonomously explore Databricks Unity Catalog metadata to understand data assets and relationships without manual lookup.
- Automated SQL Query Building
  Agents use metadata to automatically construct and execute SQL queries tailored to user needs or analytical tasks.
- Data Documentation Assistance
  By leveraging UC metadata, AI can assist in documenting data assets or verifying documentation completeness and accuracy.
- Intelligent Data Exploration
  Developers can use the MCP server to have AI agents answer ad hoc data questions or perform exploratory data analysis.
- Complex Task Automation
  The server’s agent mode allows AI to chain together multiple steps, such as discovering data, running queries, and returning results, all without human intervention.
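The multi-step chaining in that last use case can be pictured as a loop over the server's tools. A hypothetical sketch, with `list_tables` and `run_sql` standing in for the MCP tool calls (this is not the server's actual implementation):

```python
# Illustrative agent step chain: discover data, match a table, run a query.
# `list_tables` and `run_sql` are stand-ins for MCP tool invocations.

def answer_data_question(question, list_tables, run_sql):
    """Find the table mentioned in the question and count its rows."""
    tables = list_tables()                                        # step 1: discover assets
    target = next(t for t in tables if t["name"] in question)     # step 2: match a table
    return run_sql(f"SELECT COUNT(*) AS n FROM {target['full_name']}")  # step 3: query
```

A real agent would add error handling and iterate when the first match is wrong, but the shape — metadata lookup feeding query construction — is the core pattern.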
How to set it up
Windsurf
No Windsurf-specific setup instructions or JSON snippets are provided.
Claude
No Claude-specific setup instructions or JSON snippets are provided.
Cursor
The repository mentions integration with Cursor:
- Ensure you have Python and required dependencies installed.
- Clone the repository and install requirements from `requirements.txt`.
- Locate the configuration files for MCP servers in Cursor.
- Add the Databricks MCP Server to the `mcpServers` object:

  {
    "databricks-mcp": {
      "command": "python",
      "args": ["main.py"]
    }
  }

- Save your configuration and restart Cursor if required.
Securing API Keys using Environment Variables (example):
{
  "databricks-mcp": {
    "command": "python",
    "args": ["main.py"],
    "env": {
      "DATABRICKS_TOKEN": "YOUR_API_KEY"
    }
  }
}
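On the server side, a `main.py` entry point could pick up the injected variable with a standard `os.environ` lookup. A minimal sketch — the helper name is hypothetical, not part of the server's documented code — that fails fast when the token is missing:

```python
import os

def get_databricks_token(env=None) -> str:
    """Read the token injected via the "env" block; raise early if it is absent."""
    env = os.environ if env is None else env
    token = env.get("DATABRICKS_TOKEN")
    if not token:
        raise RuntimeError("DATABRICKS_TOKEN is not set; add it to the MCP config")
    return token
```

Failing at startup like this is preferable to letting an unauthenticated request surface later as an opaque Databricks API error.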
Cline
No Cline-specific setup instructions or JSON snippets are provided.
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "databricks-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “databricks-mcp” to your actual MCP server name and to replace the URL with your own MCP server URL.
Overview
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | Good summary and motivation available |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | No explicit MCP resources listed |
| List of Tools | ✅ | High-level tools described in documentation |
| Securing API Keys | ✅ | Example with "env" provided in Cursor section |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on the available documentation, the Databricks MCP Server is well-scoped for Databricks/UC integration and agentic AI workflows, but is missing explicit prompt templates, resource lists, and mentions of roots or sampling features. Its setup and tool descriptions are clear for Cursor, but less so for other platforms.
Our opinion
The MCP server is focused and useful for Databricks + AI automation, but would benefit from more explicit documentation around prompts, resources, and multi-platform setup. For those seeking Databricks/UC integration, it is a solid and practical solution.
MCP Score
| Has a LICENSE | ✅ (MIT) |
|---|---|
| Has at least one tool | ✅ |
| Number of Forks | 5 |
| Number of Stars | 11 |
Frequently asked questions
- What is the Databricks MCP Server?
The Databricks MCP Server is a Model Context Protocol server that connects AI agents to Databricks environments, enabling them to autonomously access Unity Catalog metadata, understand data structures, and perform SQL queries for advanced data exploration and automation.
- What tools and features does it provide?
It allows AI agents to explore Unity Catalog metadata, comprehend data structures, execute SQL queries, and operate in autonomous agent modes for multi-step data tasks.
- What are the main use cases?
Typical use cases include metadata discovery, automated SQL query building, data documentation assistance, intelligent data exploration, and complex task automation within Databricks.
- How do I secure my Databricks API key?
You should use environment variables for sensitive information. In your MCP server configuration, set the `DATABRICKS_TOKEN` as an environment variable rather than hardcoding it.
- How do I integrate the Databricks MCP Server in FlowHunt?
Add the MCP component to your FlowHunt flow, configure it with your server details, and connect it to your AI agent. Use the provided JSON format in the system MCP configuration section to specify your Databricks MCP server connection.
Empower Your AI with Databricks MCP Server
Enable your AI workflows to interact directly with Databricks Unity Catalog metadata and automate data tasks. Try it with FlowHunt today.