ZenML MCP Server Integration


Contact us to host your MCP Server in FlowHunt

FlowHunt provides an additional security layer between your internal systems and AI tools, giving you granular control over which tools are accessible from your MCP servers. MCP servers hosted in our infrastructure can be seamlessly integrated with FlowHunt's chatbot as well as popular AI platforms like ChatGPT, Claude, and various AI editors.

What does “ZenML” MCP Server do?

The ZenML MCP Server is an implementation of the Model Context Protocol (MCP) that acts as a bridge between AI assistants (such as Cursor, Claude Desktop, and others) and your ZenML MLOps and LLMOps pipelines. By exposing ZenML’s API via the MCP standard, it enables AI clients to access live information about users, pipelines, pipeline runs, steps, services, and more from a ZenML server. This integration empowers developers and AI workflows to query metadata, trigger new pipeline runs, and interact with ZenML’s orchestration features directly through supported AI tools. The ZenML MCP Server is especially useful in enhancing productivity by connecting LLM-powered assistants to robust MLOps infrastructure, facilitating tasks across the ML lifecycle.
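Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. A minimal sketch of the message shapes involved is below; the `tools/list` and `tools/call` method names come from the MCP specification, but the `trigger_pipeline` tool name and its arguments are illustrative, not ZenML's actual identifiers:

```python
import json

def mcp_request(request_id, method, params=None):
    """Build a JSON-RPC 2.0 message of the kind MCP clients send to servers."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# Ask the server which tools it exposes (standard MCP method).
list_tools = mcp_request(1, "tools/list")

# Call a tool; the tool name and arguments here are hypothetical.
run_pipeline = mcp_request(2, "tools/call", {
    "name": "trigger_pipeline",
    "arguments": {"template_id": "example-template-id"},
})
```

The AI client sends messages like these over stdio or HTTP; the server answers with matching JSON-RPC responses carrying the requested ZenML metadata.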

List of Prompts

No information found about prompt templates in the repository.

List of Resources

  • Users – Access information about ZenML users.
  • Stacks – Retrieve details on available stack configurations.
  • Pipelines – Query metadata about pipelines managed in ZenML.
  • Pipeline Runs – Get information and status about pipeline executions.
  • Pipeline Steps – Explore details of steps within pipelines.
  • Services – Information about services managed by ZenML.
  • Stack Components – Metadata on different components in the ZenML stack.
  • Flavors – Retrieve information about different stack component flavors.
  • Pipeline Run Templates – Templates for launching new pipeline runs.
  • Schedules – Data about scheduled pipeline executions.
  • Artifacts – Metadata about data artifacts (not the data itself).
  • Service Connectors – Information about connectors to external services.
  • Step Code – Access code related to pipeline steps.
  • Step Logs – Retrieve logs for steps (when run on cloud-based stacks).

List of Tools

  • Trigger New Pipeline Run – Triggers a new pipeline run when a run template is available.
  • Read Resources – Tools to read metadata and status from ZenML server objects (users, stacks, pipelines, etc.).

Use Cases of this MCP Server

  • Pipeline Monitoring and Management: Developers can use AI assistants to query the status of pipeline runs, retrieve logs, and monitor progress directly from ZenML.
  • Triggering Pipeline Executions: AI assistants can initiate new pipeline runs through the MCP server, streamlining experiment iterations and deployment cycles.
  • Resource and Artifact Exploration: Instantly retrieve metadata about datasets, models, and other artifacts managed by ZenML, enabling fast context retrieval for experiments.
  • Stack and Service Inspection: Quickly review stack configurations and service details, simplifying troubleshooting and optimization.
  • Automated Reporting: Use AI assistants to generate reports about ML experiments, pipeline history, and artifact lineage by querying the MCP server.
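For the automated-reporting use case, the assistant would fetch run metadata through the MCP server and roll it up into a summary. A rough sketch, using hypothetical field names (the actual shape of ZenML's run metadata may differ):

```python
from collections import Counter

# Hypothetical run metadata of the shape an assistant might receive
# from the ZenML MCP server; field names are illustrative.
runs = [
    {"pipeline": "training", "status": "completed", "duration_s": 310},
    {"pipeline": "training", "status": "failed", "duration_s": 45},
    {"pipeline": "etl", "status": "completed", "duration_s": 120},
]

def summarize(runs):
    """Roll run metadata up into a short status report."""
    by_status = Counter(r["status"] for r in runs)
    total = sum(r["duration_s"] for r in runs)
    lines = [f"{status}: {count}" for status, count in sorted(by_status.items())]
    lines.append(f"total runtime: {total}s")
    return "\n".join(lines)

print(summarize(runs))
```

In practice the LLM would generate prose from such a summary, but the counting and aggregation is plain data wrangling like this.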

How to set it up

Windsurf

No explicit instructions for Windsurf found; use generic MCP configuration:

  1. Ensure Node.js and uv are installed.
  2. Clone the repository.
  3. Obtain your ZenML server URL and API key.
  4. Edit your Windsurf MCP configuration file to add the ZenML MCP server.
  5. Save and restart Windsurf.
{
  "mcpServers": {
    "zenml": {
      "command": "/usr/local/bin/uv",
      "args": ["run", "/path/to/zenml_server.py"],
      "env": {
        "LOGLEVEL": "INFO",
        "NO_COLOR": "1",
        "PYTHONUNBUFFERED": "1",
        "PYTHONIOENCODING": "UTF-8",
        "ZENML_STORE_URL": "https://your-zenml-server-goes-here.com",
        "ZENML_STORE_API_KEY": "your-api-key-here"
      }
    }
  }
}

Note: Secure your API keys by setting them in the env section as shown above.
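Before restarting the editor, it can help to confirm the placeholder values in the snippet above were actually replaced. A small sanity-check sketch (the config is abbreviated to the fields that matter here):

```python
import json

CONFIG = """
{
  "mcpServers": {
    "zenml": {
      "command": "/usr/local/bin/uv",
      "args": ["run", "/path/to/zenml_server.py"],
      "env": {
        "ZENML_STORE_URL": "https://your-zenml-server-goes-here.com",
        "ZENML_STORE_API_KEY": "your-api-key-here"
      }
    }
  }
}
"""

PLACEHOLDERS = ("your-zenml-server-goes-here", "your-api-key-here")

def unfilled_placeholders(config_text):
    """Return env keys whose values still contain template placeholders."""
    env = json.loads(config_text)["mcpServers"]["zenml"]["env"]
    return [k for k, v in env.items() if any(p in v for p in PLACEHOLDERS)]

print(unfilled_placeholders(CONFIG))  # both values still need replacing
```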

Claude

  1. Install Claude Desktop.
  2. Open ‘Settings’ > ‘Developer’ > ‘Edit Config’.
  3. Add the MCP server as shown below.
  4. Replace paths and credentials with your own.
  5. Save and restart Claude Desktop.
{
  "mcpServers": {
    "zenml": {
      "command": "/usr/local/bin/uv",
      "args": ["run", "/path/to/zenml_server.py"],
      "env": {
        "LOGLEVEL": "INFO",
        "NO_COLOR": "1",
        "PYTHONUNBUFFERED": "1",
        "PYTHONIOENCODING": "UTF-8",
        "ZENML_STORE_URL": "https://your-zenml-server-goes-here.com",
        "ZENML_STORE_API_KEY": "your-api-key-here"
      }
    }
  }
}

Note: Always store your API keys securely in the environment variables, as above.

Cursor

  1. Install Cursor.
  2. Locate Cursor’s MCP configuration file.
  3. Add the ZenML MCP server section as shown.
  4. Fill in the correct paths and credentials.
  5. Save and restart Cursor.
{
  "mcpServers": {
    "zenml": {
      "command": "/usr/local/bin/uv",
      "args": ["run", "/path/to/zenml_server.py"],
      "env": {
        "LOGLEVEL": "INFO",
        "NO_COLOR": "1",
        "PYTHONUNBUFFERED": "1",
        "PYTHONIOENCODING": "UTF-8",
        "ZENML_STORE_URL": "https://your-zenml-server-goes-here.com",
        "ZENML_STORE_API_KEY": "your-api-key-here"
      }
    }
  }
}

Note: API keys should be set using environment variables in the env section for security.

Cline

No explicit instructions for Cline found; use generic MCP configuration:

  1. Install any prerequisites required by Cline.
  2. Clone the MCP-ZenML repository.
  3. Obtain your ZenML server credentials.
  4. Edit the Cline MCP configuration file to include the ZenML MCP server.
  5. Save and restart Cline.
{
  "mcpServers": {
    "zenml": {
      "command": "/usr/local/bin/uv",
      "args": ["run", "/path/to/zenml_server.py"],
      "env": {
        "LOGLEVEL": "INFO",
        "NO_COLOR": "1",
        "PYTHONUNBUFFERED": "1",
        "PYTHONIOENCODING": "UTF-8",
        "ZENML_STORE_URL": "https://your-zenml-server-goes-here.com",
        "ZENML_STORE_API_KEY": "your-api-key-here"
      }
    }
  }
}

Note: Secure API keys in the env section as above.

Securing API Keys:
Set your ZenML API key and server URL securely using environment variables in the env section of the config, as in the JSON examples above.
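The pattern this implies on the server side is fail-fast: read the credentials from the environment and refuse to start without them, rather than hardcoding secrets in source. A minimal sketch (the demo values set below are placeholders, not real endpoints):

```python
import os

def require_env(name):
    """Fail fast if a required secret is not set, instead of hardcoding it."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before starting the server")
    return value

# Demo only: in practice these are exported in your shell or injected by
# the MCP client from the env section of its config.
os.environ.setdefault("ZENML_STORE_URL", "https://zenml.example.com")
os.environ.setdefault("ZENML_STORE_API_KEY", "demo-key")

url = require_env("ZENML_STORE_URL")
key = require_env("ZENML_STORE_API_KEY")
```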

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

FlowHunt MCP flow

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "zenml": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent is able to use this MCP server as a tool with access to all its functions and capabilities. Remember to change “zenml” to the actual name of your MCP server and replace the URL with your own MCP server URL.
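A quick way to catch typos in that snippet is to check its shape before pasting it into FlowHunt. The sketch below validates only the two fields shown above (`transport` and `url`); the example URL is a placeholder from the snippet, not a real endpoint:

```python
from urllib.parse import urlparse

config = {
    "zenml": {
        "transport": "streamable_http",
        "url": "https://yourmcpserver.example/pathtothemcp/url",
    }
}

def check_flow_config(cfg):
    """Basic shape check for the FlowHunt MCP configuration snippet."""
    problems = []
    for name, server in cfg.items():
        if server.get("transport") != "streamable_http":
            problems.append(f"{name}: unexpected transport")
        parsed = urlparse(server.get("url", ""))
        if parsed.scheme not in ("http", "https") or not parsed.netloc:
            problems.append(f"{name}: url is not a valid http(s) URL")
    return problems

print(check_flow_config(config))  # [] when the snippet is well-formed
```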


Overview

| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | Not found in repo |
| List of Resources | ✅ | Covers resources exposed by ZenML’s API |
| List of Tools | ✅ | Trigger pipeline, read metadata, etc. |
| Securing API Keys | ✅ | Example config provided |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |

Based on the tables above, the ZenML MCP server provides thorough documentation, clear setup guidance, and exposes a wide range of resources and tools. However, it lacks documentation for prompt templates and makes no explicit mention of sampling or roots support. The repository is active, with a modest number of stars and forks, but some advanced MCP features are not covered.


MCP Score

| Criterion | Result |
| --- | --- |
| Has a LICENSE | ⛔ (not shown in available files) |
| Has at least one tool | ✅ |
| Number of Forks | 8 |
| Number of Stars | 18 |

Boost Your AI Workflows with ZenML MCP

Enable your AI assistants to orchestrate, monitor, and manage ML pipelines instantly by connecting FlowHunt to ZenML’s MCP Server.
