
Databricks MCP Server
Connect your AI agents to Databricks for automated SQL, job monitoring, and workflow management using the Databricks MCP Server in FlowHunt.
The Databricks MCP (Model Context Protocol) Server is a specialized tool that connects AI assistants to the Databricks platform, enabling seamless interaction with Databricks resources through natural language interfaces. This server acts as a bridge between large language models (LLMs) and Databricks APIs, allowing LLMs to execute SQL queries, list jobs, retrieve job statuses, and obtain detailed job information. By exposing these capabilities via the MCP protocol, the Databricks MCP Server empowers developers and AI agents to automate data workflows, manage Databricks jobs, and streamline database operations, thus enhancing productivity in data-driven development environments.
No prompt templates are described in the repository.
No explicit resources are listed in the repository.
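To make the tool surface concrete, here is a minimal sketch of what the four tools (run_sql_query, list_jobs, get_job_status, get_job_details) could look like as plain calls to the public Databricks REST API. This is an illustration, not the repository's actual implementation: the endpoint paths follow the documented Jobs API 2.1 and SQL Statement Execution API, and the helper names simply mirror the tool names.

```python
# Hypothetical sketch of the server's four tools as plain Databricks REST calls.
# Assumes DATABRICKS_HOST (e.g. https://<workspace>.cloud.databricks.com) and
# DATABRICKS_TOKEN are set; the real server may use the Databricks SDK or
# different endpoints.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

def run_sql_query(statement: str, warehouse_id: str) -> dict:
    """Execute a SQL statement via the SQL Statement Execution API."""
    resp = requests.post(
        f"{HOST}/api/2.0/sql/statements",
        headers=HEADERS,
        json={"statement": statement, "warehouse_id": warehouse_id},
    )
    resp.raise_for_status()
    return resp.json()

def list_jobs() -> dict:
    """List the jobs defined in the workspace via the Jobs API 2.1."""
    resp = requests.get(f"{HOST}/api/2.1/jobs/list", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def get_job_status(run_id: int) -> dict:
    """Fetch the state of a single job run."""
    resp = requests.get(
        f"{HOST}/api/2.1/jobs/runs/get",
        headers=HEADERS,
        params={"run_id": run_id},
    )
    resp.raise_for_status()
    return resp.json().get("state", {})

def get_job_details(job_id: int) -> dict:
    """Fetch the full settings and metadata for a job."""
    resp = requests.get(
        f"{HOST}/api/2.1/jobs/get",
        headers=HEADERS,
        params={"job_id": job_id},
    )
    resp.raise_for_status()
    return resp.json()
```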
Install the server's dependencies:

```bash
pip install -r requirements.txt
```

Create a `.env` file with your Databricks credentials, then register the server in your MCP client configuration:

```json
{
  "mcpServers": {
    "databricks": {
      "command": "python",
      "args": ["main.py"]
    }
  }
}
```
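For reference, a hypothetical `.env` might define the three variables used in the securing example below; the variable names come from that example, but the values here are placeholders:

```
DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=dapiXXXXXXXXXXXXXXXX
DATABRICKS_HTTP_PATH=/sql/1.0/warehouses/your-warehouse-id
```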
Securing API Keys Example:

```json
{
  "mcpServers": {
    "databricks": {
      "command": "python",
      "args": ["main.py"],
      "env": {
        "DATABRICKS_HOST": "${DATABRICKS_HOST}",
        "DATABRICKS_TOKEN": "${DATABRICKS_TOKEN}",
        "DATABRICKS_HTTP_PATH": "${DATABRICKS_HTTP_PATH}"
      }
    }
  }
}
```
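On the server side, one minimal way to pick these variables up (a sketch, assuming the project uses python-dotenv, which the repository does not confirm) would be:

```python
# Hypothetical credential loading for main.py; assumes python-dotenv is
# listed in requirements.txt, which is not confirmed by the repository.
import os
from dotenv import load_dotenv

load_dotenv()  # reads the DATABRICKS_* values from a local .env file

host = os.environ["DATABRICKS_HOST"]            # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]          # personal access token; never hardcode this
http_path = os.environ["DATABRICKS_HTTP_PATH"]  # SQL warehouse HTTP path
```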
Note: Always secure your API keys and secrets by using environment variables as shown in the configuration examples above.
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "databricks": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP server as a tool with access to all of its functions and capabilities. Remember to change "databricks" to the actual name of your MCP server and replace the URL with your own MCP server URL.
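If you want to sanity-check the endpoint before wiring it into FlowHunt, a small client built on the official MCP Python SDK (the `mcp` package) can list the tools the server exposes; the URL below is a placeholder:

```python
# Minimal connectivity check against a streamable-HTTP MCP endpoint.
# Assumes the official MCP Python SDK is installed (pip install mcp);
# replace the URL with your own server's address.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    url = "https://yourmcpserver.example/pathtothemcp/url"
    async with streamablehttp_client(url) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```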
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates specified in repo |
| List of Resources | ⛔ | No explicit resources defined |
| List of Tools | ✅ | 4 tools: run_sql_query, list_jobs, get_job_status, get_job_details |
| Securing API Keys | ✅ | Via environment variables in .env and config JSON |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
| Roots Support | ⛔ | Not mentioned |
Based on the availability of core features (tools, setup and security guidance, but no resources or prompt templates), the Databricks MCP Server is effective for Databricks API integration but lacks some advanced MCP primitives. I would rate this MCP server a 6 out of 10 for overall completeness and utility in the context of the MCP ecosystem.
| Has a LICENSE | ⛔ (not found) |
|---|---|
| Has at least one tool | ✅ |
| Number of Forks | 13 |
| Number of Stars | 33 |
The Databricks MCP Server is a bridge between AI assistants and Databricks, exposing Databricks capabilities like SQL execution and job management via the MCP protocol for automated workflows.
It supports executing SQL queries, listing all jobs, retrieving job statuses, and obtaining detailed information about specific Databricks jobs.
Always use environment variables, for example by placing them in a `.env` file or configuring them in your MCP server setup, instead of hardcoding sensitive information.
Yes, simply add the MCP component to your flow, configure it with your Databricks MCP server details, and your AI agents will be able to access all supported Databricks functions.
Based on available tools, setup guidance, and security support, but lacking resources and prompt templates, this MCP Server rates a 6 out of 10 for completeness in the MCP ecosystem.
Automate SQL queries, monitor jobs, and manage Databricks resources directly from conversational AI interfaces. Integrate Databricks MCP Server into your FlowHunt flows for next-level productivity.