Microsoft Fabric MCP Server
Leverage the Microsoft Fabric MCP Server to supercharge your AI workflows with advanced data engineering, analytics, and intelligent PySpark development—all accessible via natural language and FlowHunt integrations.

What does the “Microsoft Fabric” MCP Server do?
The Microsoft Fabric MCP Server is a Python-based Model Context Protocol (MCP) server designed for seamless interaction with Microsoft Fabric APIs. It empowers AI assistants to connect with external Microsoft Fabric resources, enabling a robust development workflow for data engineering and analytics. The server facilitates advanced operations such as workspace, lakehouse, warehouse, and table management, delta table schema retrieval, SQL query execution, and more. Additionally, it offers intelligent PySpark notebook development and optimization through LLM integration, providing context-aware code generation, validation, performance analysis, and real-time monitoring. This integration significantly boosts developer productivity by allowing natural language interaction, automated code assistance, and streamlined deployment within the Microsoft Fabric ecosystem.
List of Prompts
No explicit prompt templates are mentioned in the repository files or documentation.
List of Resources
No explicit MCP resources are listed in the repository files or documentation.
List of Tools
No explicit tool definitions found in server.py or the repository files. The README mentions:
- PySpark Tools: For notebook creation, code generation, validation, analysis, and deployment.
- PySpark Helpers: For auxiliary Spark-related operations.
- Template Manager: For managing notebook/code templates.
- Code Validators: For checking code syntax and best practices.
- Code Generators: For automated code production.
Actual MCP tool interface details are not available; a runtime discovery sketch follows below.
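Because the tool schemas are not documented, one practical option is to discover them at runtime with an MCP client. The sketch below is a minimal, hedged example using the official mcp Python SDK, assuming the server starts with python -m fabric_mcp as shown in the setup sections later on this page; the module name comes from that configuration, not from verified documentation.

# Minimal sketch: discover which tools the Fabric MCP server actually exposes.
# Assumes the official `mcp` Python SDK and that the server starts with
# `python -m fabric_mcp` (as in the setup sections below).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def list_fabric_tools() -> None:
    server = StdioServerParameters(command="python", args=["-m", "fabric_mcp"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

if __name__ == "__main__":
    asyncio.run(list_fabric_tools())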
Use Cases of this MCP Server
- Workspace and Lakehouse Management: Simplifies the creation and management of workspaces, lakehouses, warehouses, and tables in Microsoft Fabric, making it easier for developers to organize and manipulate data environments.
- Delta Table Schema and Metadata Retrieval: Enables AI-powered querying and exploration of delta table schemas and metadata, supporting advanced data engineering tasks.
- SQL Query Execution: Facilitates running SQL queries and loading data into Fabric resources programmatically, streamlining analytics pipelines (a hedged call sketch follows after this list).
- Advanced PySpark Notebook Development: Offers intelligent notebook creation, validation, and optimization with LLM integration, accelerating the development of performant Spark jobs.
- Performance Analysis and Real-Time Monitoring: Provides tools for analyzing and optimizing notebook performance, with real-time execution insights to support continuous improvement.
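To illustrate the SQL use case above, the following sketch calls a tool through the same SDK session pattern. The tool name run_sql_query and its arguments are hypothetical placeholders, not documented names; substitute whatever the discovery sketch above reports for your server.

# Hypothetical sketch: execute a SQL query through the Fabric MCP server.
# `run_sql_query` and its argument names are placeholders, not documented tool names.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def run_query() -> None:
    server = StdioServerParameters(command="python", args=["-m", "fabric_mcp"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Replace the tool name and arguments with the ones your server reports.
            result = await session.call_tool(
                "run_sql_query",
                arguments={"workspace": "Sales", "query": "SELECT TOP 10 * FROM dbo.orders"},
            )
            print(result.content)

if __name__ == "__main__":
    asyncio.run(run_query())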
How to set it up
Windsurf
- Ensure Python and Node.js are installed.
- Locate your Windsurf configuration file (e.g., ~/.windsurf/config.json).
- Add the Microsoft Fabric MCP Server to the mcpServers section:
{
  "mcpServers": {
    "fabric-mcp": {
      "command": "python",
      "args": ["-m", "fabric_mcp"]
    }
  }
}
- Save the configuration and restart Windsurf.
- Verify setup by accessing the MCP server from Windsurf’s interface.
Securing API Keys
Use environment variables for sensitive API keys:
{
  "mcpServers": {
    "fabric-mcp": {
      "command": "python",
      "args": ["-m", "fabric_mcp"],
      "env": {
        "FABRIC_API_KEY": "${FABRIC_API_KEY}"
      },
      "inputs": {
        "api_key": "${FABRIC_API_KEY}"
      }
    }
  }
}
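How the server consumes this variable is not documented; as a hedged illustration, a Python process launched with the configuration above would typically read it from its environment, for example:

# Illustrative only: reading the API key passed via the `env` block above.
# The variable name FABRIC_API_KEY comes from that configuration; how the
# fabric_mcp server actually authenticates is not documented here.
import os

api_key = os.environ.get("FABRIC_API_KEY")
if not api_key:
    raise RuntimeError("FABRIC_API_KEY is not set; export it before starting the MCP server.")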
Claude
- Ensure Python is installed and accessible.
- Open Claude’s configuration file (e.g., claude.config.json).
- Add the MCP server:
{
  "mcpServers": {
    "fabric-mcp": {
      "command": "python",
      "args": ["-m", "fabric_mcp"]
    }
  }
}
- Save changes and restart Claude.
- Confirm the MCP server is listed in Claude’s MCP integration panel.
Cursor
- Install Python and Node.js if not already present.
- Edit Cursor’s settings file (e.g., cursor.config.json).
- Register the MCP server:
{
  "mcpServers": {
    "fabric-mcp": {
      "command": "python",
      "args": ["-m", "fabric_mcp"]
    }
  }
}
- Save the file and relaunch Cursor.
- Check connectivity to the MCP server through Cursor’s interface.
Cline
- Make sure Python is set up on your system.
- Open Cline’s configuration (e.g., cline.json).
- Add the server entry:
{
  "mcpServers": {
    "fabric-mcp": {
      "command": "python",
      "args": ["-m", "fabric_mcp"]
    }
  }
}
- Save and restart Cline.
- Test MCP server availability from Cline’s command palette.
For all platforms:
- Use environment variables in the env section of your JSON configuration for API keys or secrets.
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "fabric-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change “fabric-mcp” to the actual name of your MCP server and replace the URL with your own MCP server URL.
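To sanity-check the URL you configure here, you can connect to it with the mcp Python SDK’s streamable HTTP client. The sketch below assumes that SDK and reuses the placeholder URL from the example above.

# Hedged sketch: verify a streamable HTTP MCP endpoint before wiring it into a flow.
# Assumes the official `mcp` Python SDK; the URL is the placeholder from the example above.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def check_endpoint() -> None:
    url = "https://yourmcpserver.example/pathtothemcp/url"
    async with streamablehttp_client(url) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(f"Endpoint OK, {len(tools.tools)} tools exposed.")

if __name__ == "__main__":
    asyncio.run(check_endpoint())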
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | No explicit MCP resources listed |
| List of Tools | ⛔ | Only general tool categories mentioned |
| Securing API Keys | ✅ | Example JSON config with env included |
| Sampling Support (less important in evaluation) | ⛔ | No evidence of sampling support |
Based on the available documentation, the Microsoft Fabric MCP server offers a strong overview and setup guidance, but lacks detailed, explicit listings for prompts, resources, and tools in its public files. It provides good security practices but does not document sampling support.
Our opinion
This MCP server is promising for Fabric development workflows thanks to its focus on advanced PySpark and LLM integration. However, the absence of explicit prompts, resources, and tool schemas in documentation limits its immediate plug-and-play utility. It scores well for architecture and setup clarity, but would benefit from richer developer-facing documentation and feature exposure.
MCP Score
| Has a LICENSE | ⛔ |
| --- | --- |
| Has at least one tool | ✅ |
| Number of Forks | 1 |
| Number of Stars | 3 |
Frequently asked questions
- What is the Microsoft Fabric MCP Server?
The Microsoft Fabric MCP Server is a Python-based Model Context Protocol (MCP) server for interacting with Microsoft Fabric APIs. It enables AI assistants to manage workspaces, lakehouses, warehouses, tables, run SQL queries, retrieve delta table schemas, and develop PySpark notebooks with LLM-powered code generation, validation, and optimization.
- How do I set up the Fabric MCP Server in FlowHunt or my dev environment?
You configure your development tool (Windsurf, Claude, Cursor, or Cline) by adding the MCP server to its configuration file, specifying the command and arguments for the Fabric MCP Server. Secure API keys via environment variables as shown in the setup instructions.
- What can I do with the Microsoft Fabric MCP integration?
You can manage Microsoft Fabric resources, run advanced data engineering and analytics tasks, develop and optimize PySpark notebooks, query delta table schemas, and automate workflows using AI agents in FlowHunt.
- Does the server have ready-made prompts, tools, or resources?
No explicit prompt templates, resources, or tool schemas are provided in the repository documentation. General categories like PySpark tools, code generators, and code validators are mentioned, but not detailed.
- How are API keys and sensitive data secured?
API keys should be stored using environment variables in your configuration file, ensuring sensitive credentials are not exposed in code or config files directly.
Connect to Microsoft Fabric with FlowHunt
Empower your AI agents to automate and optimize Microsoft Fabric workflows. Try the Fabric MCP server integration for advanced data engineering, analytics, and AI-powered code assistance.