Replicate MCP Server Integration

Integrate Replicate’s extensive AI model catalog into your FlowHunt projects. Search, browse, and run models with ease using the Replicate MCP Server connector.

What does “Replicate” MCP Server do?

The Replicate MCP Server is a Model Context Protocol (MCP) server that gives AI assistants and clients seamless access to Replicate’s API. By bridging AI assistants and Replicate’s extensive model hub, it lets users search, browse, and interact with machine learning models directly from their development workflows. The server supports semantic model search, retrieving model details, running predictions, and managing collections. This lets developers quickly experiment with and deploy AI capabilities such as image generation and text analysis, while keeping access secure through API tokens and standardized tool interfaces.
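Under the Model Context Protocol, clients talk to a server like this one over JSON-RPC 2.0. For instance, a client discovers the available tools with a standard `tools/list` request (a generic MCP sketch, not output captured from this server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```

The server replies with each tool’s name, description, and a JSON Schema describing its expected inputs, which is how clients know what arguments tools such as `search_models` or `create_prediction` accept.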

List of Prompts

No prompt templates are explicitly mentioned in the repository documentation or code.

List of Resources

No explicit MCP resources are described in the available documentation or code.

List of Tools

  • search_models: Find models using semantic search.
  • list_models: Browse available models on Replicate.
  • get_model: Get detailed information about a specific model.
  • list_collections: Browse collections of models.
  • get_collection: Get details about a specific model collection.
  • create_prediction: Run a selected model with user-provided inputs.
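Invoking one of these tools follows MCP’s standard `tools/call` shape. The sketch below assumes `create_prediction` takes a model version and an input object, mirroring Replicate’s predictions API; the argument names and values here are illustrative, so consult the schema returned by `tools/list` for the actual contract:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "create_prediction",
    "arguments": {
      "version": "<model-version-id>",
      "input": { "prompt": "a watercolor painting of a lighthouse" }
    }
  }
}
```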

Use Cases of this MCP Server

  • Discovering AI Models: Developers can use semantic search and browsing features to quickly find models suitable for their tasks, accelerating experimentation and prototyping.
  • Model Information Retrieval: Easily fetch details and version history for models, supporting informed decision-making when integrating or deploying models.
  • Running Predictions: Execute models directly via the MCP tool interface, enabling tasks like image generation, text transformation, and more from within compatible AI platforms.
  • Managing Collections: Access and organize model collections, making it easier to manage and curate relevant models for teams or projects.
  • Workflow Automation: Integrate Replicate’s capabilities into automated development workflows, reducing manual overhead and streamlining repeated AI tasks.

How to set it up

Windsurf

  1. Ensure Node.js is installed.
  2. Obtain your Replicate API token from Replicate’s API tokens page.
  3. Add the MCP server configuration to your Windsurf settings file:
    {
      "mcpServers": {
        "replicate": {
          "command": "mcp-replicate",
          "env": {
            "REPLICATE_API_TOKEN": "your_token_here"
          }
        }
      }
    }
    
  4. Save the settings and restart Windsurf.
  5. Verify the Replicate MCP server is available in your interface.

Claude

  1. Install the server globally:
    npm install -g mcp-replicate
  2. Obtain your Replicate API token.
  3. Open Claude Desktop Settings and navigate to the “Developer” section.
  4. Click “Edit Config” and add:
    {
      "mcpServers": {
        "replicate": {
          "command": "mcp-replicate",
          "env": {
            "REPLICATE_API_TOKEN": "your_token_here"
          }
        }
      }
    }
    
  5. Save and restart Claude Desktop. Look for the hammer tool icon to confirm.

Cursor

  1. Install Node.js and obtain your Replicate API token.
  2. In Cursor’s configuration, add:
    {
      "mcpServers": {
        "replicate": {
          "command": "mcp-replicate",
          "env": {
            "REPLICATE_API_TOKEN": "your_token_here"
          }
        }
      }
    }
    
  3. Save and restart Cursor to activate the server.

Cline

  1. Ensure Node.js is installed and your Replicate API token is ready.
  2. Update the Cline configuration file:
    {
      "mcpServers": {
        "replicate": {
          "command": "mcp-replicate",
          "env": {
            "REPLICATE_API_TOKEN": "your_token_here"
          }
        }
      }
    }
    
  3. Save changes and restart Cline.

Note:
Always secure your API keys using environment variables as shown in the configuration examples above. Avoid hardcoding sensitive data in publicly accessible files.
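For local development, one way to keep the token out of checked-in files is to export it in your shell before launching the MCP client. A minimal sketch (`your_token_here` is a placeholder; real Replicate tokens begin with `r8_`):

```shell
# Keep the token in the environment rather than hardcoded in a config file.
export REPLICATE_API_TOKEN="your_token_here"

# Verify the variable is visible to child processes such as mcp-replicate.
if [ -n "$REPLICATE_API_TOKEN" ]; then
  echo "REPLICATE_API_TOKEN is set"
fi
```

Client configs that use an `env` block, like the examples above, can also read the value from the surrounding environment depending on the client, so the token never needs to appear in the file itself.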

Example with env and inputs

{
  "mcpServers": {
    "replicate": {
      "command": "mcp-replicate",
      "env": {
        "REPLICATE_API_TOKEN": "your_token_here"
      },
      "inputs": {}
    }
  }
}

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

FlowHunt MCP flow

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "replicate": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “replicate” to the actual name of your MCP server and replace the URL with your own MCP server’s URL.


Overview

| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates mentioned in repo. |
| List of Resources | ⛔ | No explicit resources described. |
| List of Tools | ✅ | 6 tools for models and predictions. |
| Securing API Keys | ✅ | Configuration via env vars, examples provided. |
| Sampling Support (less important in evaluation) | ⛔ | No mention of sampling or roots in documentation. |

Roots support: Not specified in available documentation.


Based on the tables above, the Replicate MCP Server is well-documented for installation and tool usage, but lacks prompt templates and explicit MCP resources. Sampling and roots support are not mentioned. For developers seeking Replicate API access through MCP, it is a strong choice if your focus is on model discovery and prediction tools, but it is less feature-complete on advanced MCP primitives.

MCP Score

| Criterion | Value |
| --- | --- |
| Has a LICENSE | ✅ (MIT) |
| Has at least one tool | ✅ |
| Number of Forks | 16 |
| Number of Stars | 72 |

Rating: 7/10
A solid, practical MCP server for Replicate with robust tooling and clear setup, but missing some advanced MCP features and documentation on prompt/resources.

Frequently asked questions

What is the Replicate MCP Server?

The Replicate MCP Server bridges FlowHunt and Replicate's API, allowing you to search, browse, and run predictions on thousands of AI models directly from your automated workflows.

Which tools does the Replicate MCP Server provide?

It offers semantic model search, browsing, detailed info retrieval, prediction execution, and collection management tools—making it easy to experiment with and deploy AI models.

How do I secure my API keys?

Always use environment variables (as shown in setup examples) to store your Replicate API token. Avoid hardcoding sensitive information in public files.

What are common use cases for this integration?

Typical use cases include rapid model discovery, running AI predictions (like image or text generation), retrieving model details, and automating workflows that leverage Replicate's model hub.

Does the Replicate MCP Server support prompt templates or custom resources?

No, the current documentation and code do not mention prompt templates or custom MCP resources. The focus is on tooling for model access and predictions.

Connect with Replicate MCP in FlowHunt

Supercharge your development workflows by integrating Replicate's powerful AI models with FlowHunt. Set up in minutes and unlock advanced machine learning capabilities for your projects.

Learn more