Databricks Genie MCP Server
Bridge your AI assistant with Databricks using the Genie MCP Server to unlock natural language querying, workspace metadata access, and multi-turn conversation management for streamlined data-driven workflows.

What does the “Databricks Genie” MCP Server do?
The Databricks Genie MCP Server is a Model Context Protocol (MCP) server designed to bridge AI assistants and the Databricks Genie API. This integration empowers large language models (LLMs) to interact with Databricks environments using natural language. Through the server, LLMs can perform actions such as listing Genie spaces, retrieving workspace metadata, initiating and managing Genie conversations, and running SQL queries—all via standardized MCP tools. By acting as a connector, the Databricks Genie MCP Server enables developers to enhance their workflows with conversational data exploration, direct SQL querying, and seamless interaction with Databricks conversational agents, streamlining data-driven development and analysis.
List of Prompts
No explicit prompt templates are documented in the repository.
List of Resources
No explicit resources are described in the repository.
List of Tools
- get_genie_space_id()
Lists available Genie space IDs and titles in your Databricks workspace.
- get_space_info(space_id: str)
Retrieves the title and description metadata of a specified Genie space.
- ask_genie(space_id: str, question: str)
Starts a new Genie conversation by posing a natural language question and returns the generated SQL and result tables.
- follow_up(space_id: str, conversation_id: str, question: str)
Continues an existing Genie conversation with a follow-up question.
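To make the tool surface concrete, below is a minimal sketch of how tools like these are typically registered with the MCP Python SDK's FastMCP helper. The tool names and signatures mirror the list above, but the FastMCP wiring and the Genie REST endpoint shown are assumptions for illustration, not code taken from the repository.

import os

import requests  # assumption: the server calls the Databricks Genie REST API over HTTPS
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("databricks-genie")

HOST = os.environ["DATABRICKS_HOST"]    # e.g. your-instance.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

@mcp.tool()
def get_space_info(space_id: str) -> str:
    """Return the title and description of a Genie space."""
    # The endpoint path is an assumption based on the public Genie Conversation API docs.
    resp = requests.get(f"https://{HOST}/api/2.0/genie/spaces/{space_id}", headers=HEADERS)
    resp.raise_for_status()
    data = resp.json()
    return f"{data.get('title', '')}: {data.get('description', '')}"

if __name__ == "__main__":
    mcp.run()  # exposes the registered tools to MCP clients over stdio

Whatever the actual implementation looks like, the key point is that each Genie operation is exposed as a typed MCP tool that any MCP-capable assistant can discover and call.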
Use Cases of this MCP Server
- Conversational Data Exploration
Developers and analysts can use natural language to interactively query Databricks data via Genie, making data analysis more accessible and intuitive.
- Automated SQL Query Generation
The server converts natural language questions into SQL statements, executes them against Genie spaces, and returns structured results, saving time and reducing errors.
- Workspace Metadata Retrieval
Easily fetch metadata (titles, descriptions) about Genie spaces to understand and document available data resources.
- Conversation Management
Maintain context over multi-turn conversations, allowing complex analytical workflows where questions build on previous answers (see the sketch after this list).
- Integration with AI Assistants
Seamlessly add Databricks Genie capabilities to AI-powered IDEs or chat interfaces, streamlining data science workflows within familiar tools.
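The conversation-management case is the easiest to picture in code. The sketch below shows what a multi-turn exchange could look like from an MCP client, assuming the server is launched locally with python main.py and the official MCP Python SDK is used on the client side; the space ID, questions, and conversation ID are illustrative placeholders.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the Genie MCP server as a local stdio subprocess.
params = StdioServerParameters(command="python", args=["main.py"])

async def main():
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # The first question opens a new Genie conversation in the given space.
            first = await session.call_tool(
                "ask_genie",
                {"space_id": "<your-space-id>", "question": "Total sales by region last quarter?"},
            )
            print(first.content)
            # The follow-up reuses the conversation ID so Genie keeps the context.
            follow = await session.call_tool(
                "follow_up",
                {
                    "space_id": "<your-space-id>",
                    "conversation_id": "<conversation-id-from-first-answer>",
                    "question": "Now break that down by product line.",
                },
            )
            print(follow.content)

asyncio.run(main())

In practice an AI assistant issues these tool calls for you; the Databricks credentials must be present in the environment so the server subprocess can authenticate.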
How to set it up
Windsurf
- Ensure Python 3.7+ is installed on your system.
- Clone the Databricks Genie MCP repository and install dependencies.
- Create a .env file with your Databricks credentials (DATABRICKS_HOST and DATABRICKS_TOKEN); see the example .env below.
- In your Windsurf configuration, add the MCP server using the following JSON snippet:
{
  "mcpServers": {
    "databricks-genie": {
      "command": "python",
      "args": ["main.py"]
    }
  }
}
- Restart Windsurf and verify the server appears in your available MCP servers.
- Securing API Keys:
Use environment variables to keep credentials safe. Example:
{
  "env": {
    "DATABRICKS_HOST": "your-databricks-instance.cloud.databricks.com",
    "DATABRICKS_TOKEN": "your-personal-access-token"
  },
  "inputs": {}
}
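For reference, a minimal .env file using the variable names above could look like the following (placeholder values shown); the same file is reused in the Claude, Cursor, and Cline setups below:
DATABRICKS_HOST=your-databricks-instance.cloud.databricks.com
DATABRICKS_TOKEN=your-personal-access-token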
Claude
- Install Python 3.7+ and dependencies from the repo.
- Configure .env with your Databricks host and token.
- From your project directory, run:
mcp install main.py
- Open Claude Desktop, navigate to Resources → Add Resource, and select your Genie MCP Server.
- Start chatting with your Databricks data.
Cursor
- Ensure all prerequisites and dependencies are met and .env is configured.
- Add the following to your Cursor configuration:
{
  "mcpServers": {
    "databricks-genie": {
      "command": "python",
      "args": ["main.py"]
    }
  }
}
- Save configuration and restart Cursor.
- Verify the server connection and ensure environment variables are set as shown above.
Cline
- Install Python 3.7+, clone the repo, and set up your .env.
- Add the MCP server in your Cline config:
{
  "mcpServers": {
    "databricks-genie": {
      "command": "python",
      "args": ["main.py"]
    }
  }
}
- Restart Cline and verify the MCP server is active.
- Use environment variables to protect your credentials.
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
"databricks-genie": {
"transport": "streamable_http",
"url": "https://yourmcpserver.example/pathtothemcp/url"
}
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “databricks-genie” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
Overview
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates described in the repository |
| List of Resources | ⛔ | No explicit MCP resources documented |
| List of Tools | ✅ | 4 tools: see section above |
| Securing API Keys | ✅ | Described via .env and JSON example |
| Sampling Support (less important in evaluation) | ⛔ | No mention |
Our opinion
The Databricks Genie MCP Server provides a practical bridge between Databricks and LLMs, with clear setup instructions and tooling. However, it lacks prompt templates, explicit resources, and documentation on advanced MCP features like sampling or roots. The core tools are well-defined and useful for Databricks users. Overall, it scores above average but would benefit from richer MCP feature utilization.
MCP Score
| Has a LICENSE | Yes (MIT) |
|---|---|
| Has at least one tool | Yes |
| Number of Forks | 1 |
| Number of Stars | 3 |
Frequently asked questions
- What is the Databricks Genie MCP Server?
It is a Model Context Protocol server that connects large language models to Databricks Genie, enabling natural language interaction, SQL query generation, and workspace metadata retrieval directly from AI assistants.
- What tasks can be performed via the Genie MCP Server?
You can list Genie spaces, retrieve space metadata, initiate and manage Genie conversations with natural language, and run or follow up on SQL queries.
- How does the Genie MCP Server improve data workflows?
It streamlines data exploration by allowing conversational, multi-turn queries and automated SQL generation, making data analysis more accessible and reducing manual SQL writing.
- How are credentials secured?
Credentials such as Databricks host and token are managed via environment variables, never hardcoded, to ensure sensitive information remains secure.
- Does this server provide prompt templates or explicit resources?
No, the repository does not include explicit prompt templates or additional MCP resources, but the core tools for conversation and SQL querying are fully supported.
Supercharge Databricks with Genie MCP
Unlock conversational data analysis and direct SQL querying inside FlowHunt by connecting your Databricks workspace with the Genie MCP Server.