Any OpenAPI MCP Server

Enable AI assistants to semantically discover, read, and interact with any OpenAPI-compatible API. Perfect for dynamic private API integration in FlowHunt.

What does “Any OpenAPI” MCP Server do?

The “Any OpenAPI” MCP Server connects AI assistants, such as Claude, to any external API that exposes an OpenAPI (Swagger) specification. It performs semantic search over large OpenAPI documents, chunking them by endpoint for rapid discovery and interaction. This lets AI clients find relevant API endpoints using natural language queries (e.g., “list products”), retrieve complete endpoint documentation on demand, and execute API requests directly through the server. It is well suited to integrating private or large APIs into AI-powered workflows, streamlining operations such as database queries or custom API integrations without frequent manual updates.

List of Prompts

No specific prompt templates are mentioned in the available documentation or code.

List of Resources

No explicit MCP resources are listed or described in the available documentation or code.

List of Tools

  • custom_api_request_schema
    Discover relevant API endpoints by performing semantic search over the OpenAPI specification. This tool exposes endpoint documentation chunks based on natural language queries.
  • custom_make_request
    Execute an API request against the selected endpoint, letting the AI assistant submit requests and retrieve responses directly (see the example calls after this list).
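
The exact input schemas for these tools are defined by the server itself, so the sketch below is only illustrative: it shows how an MCP client (here, the Python mcp SDK) might launch the server and call both tools. The argument names (“query” for discovery, and the request fields for execution) are assumptions; check the schemas returned by list_tools for the authoritative shapes.

Example (Python sketch):

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server over stdio, the same way the client configs below do.
server = StdioServerParameters(
    command="npx",
    args=["@any-openapi/mcp-server@latest"],
    env={**os.environ, "OPENAPI_JSON_DOCS_URL": "https://yourapi.com/openapi.json"},
)

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Semantic discovery: find endpoint documentation chunks that match
            #    a natural language query. The "query" argument name is an assumption.
            schema = await session.call_tool(
                "custom_api_request_schema",
                arguments={"query": "list products"},
            )
            print(schema)

            # 2. Execution: call the endpoint found in step 1. The request fields
            #    below are illustrative; use whatever the tool schema specifies.
            response = await session.call_tool(
                "custom_make_request",
                arguments={"method": "GET", "url": "https://yourapi.com/products"},
            )
            print(response)

asyncio.run(main())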

Use Cases of this MCP Server

  • API Integration for Private Services
    Seamlessly connect Claude or other assistants to private APIs by providing the OpenAPI JSON URL, enabling secure and dynamic interaction with internal systems.
  • Rapid Endpoint Discovery
    Use in-memory semantic search (powered by FAISS and MiniLM-L3) to quickly find relevant API endpoints, even in large and complex OpenAPI documents (a minimal sketch of this approach follows this list).
  • Automated API Request Execution
    Allow AI clients to not only discover but execute API requests, enabling workflows like product listing, order management, or user lookup without additional tooling.
  • Contextual API Documentation Access
    Retrieve endpoint-specific documentation instantly, supporting detailed parameter discovery and usage for AI-driven automation.
  • Integration with Claude Desktop or Similar Clients
    Designed to work with Claude’s MCP client, overcoming document size limits and enabling practical use of large APIs for desktop AI applications.
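
The documentation describes endpoint discovery as in-memory semantic search with FAISS and MiniLM-L3. The sketch below is not the server’s actual implementation; it only illustrates the general approach under stated assumptions (sentence-transformers with the “paraphrase-MiniLM-L3-v2” checkpoint, faiss-cpu, and hand-written endpoint chunks).

Example (Python sketch):

import faiss
from sentence_transformers import SentenceTransformer

# One text chunk per endpoint, as might be extracted from an OpenAPI document.
endpoint_chunks = [
    "GET /products - list all products, supports pagination",
    "POST /orders - create a new order for a customer",
    "GET /users/{id} - retrieve a single user by id",
]

# Embed the chunks with a small MiniLM model (checkpoint name is an assumption).
model = SentenceTransformer("paraphrase-MiniLM-L3-v2")
embeddings = model.encode(endpoint_chunks, convert_to_numpy=True).astype("float32")

# Build an in-memory FAISS index over the endpoint embeddings.
index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)

# A natural language query returns the nearest endpoint chunks.
query = model.encode(["list products"], convert_to_numpy=True).astype("float32")
distances, ids = index.search(query, 2)
for rank, i in enumerate(ids[0]):
    print(rank + 1, endpoint_chunks[i])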

How to set it up

Windsurf

  1. Ensure you have Node.js and Windsurf installed.
  2. Locate your Windsurf configuration file.
  3. Add an entry for @any-openapi/mcp-server@latest to the mcpServers object.
  4. Provide the command and any necessary environment variables (like OPENAPI_JSON_DOCS_URL).
  5. Save and restart Windsurf, then verify the server appears as expected.

Example JSON:

{
  "mcpServers": {
    "any-openapi": {
      "command": "npx",
      "args": ["@any-openapi/mcp-server@latest"],
      "env": {
        "OPENAPI_JSON_DOCS_URL": "https://yourapi.com/openapi.json"
      }
    }
  }
}

Securing API Keys:

Reference sensitive values through environment variable interpolation rather than hard-coding them in the configuration file, for example:

{
  "env": {
    "API_KEY": "${ANY_OPENAPI_API_KEY}"
  },
  "inputs": {
    "apiKey": "${ANY_OPENAPI_API_KEY}"
  }
}

Claude

  1. Ensure Claude supports MCP server integration.
  2. Open the settings or configuration panel for MCP servers.
  3. Insert the server details with the command and environment variables.
  4. Save changes and restart Claude if necessary.
  5. Confirm the server is discoverable and active.

Example JSON:

{
  "mcpServers": {
    "any-openapi": {
      "command": "npx",
      "args": ["@any-openapi/mcp-server@latest"],
      "env": {
        "OPENAPI_JSON_DOCS_URL": "https://yourapi.com/openapi.json"
      }
    }
  }
}

Cursor

  1. Install Cursor and navigate to the MCP server configuration section.
  2. Add a new MCP server entry using the @any-openapi/mcp-server@latest package.
  3. Set the environment variables as needed for your API.
  4. Save the configuration and restart Cursor.
  5. Verify integration by listing available tools.

Example JSON:

{
  "mcpServers": {
    "any-openapi": {
      "command": "npx",
      "args": ["@any-openapi/mcp-server@latest"],
      "env": {
        "OPENAPI_JSON_DOCS_URL": "https://yourapi.com/openapi.json"
      }
    }
  }
}

Cline

  1. Open your Cline configuration file.
  2. Add the MCP server configuration for @any-openapi/mcp-server@latest.
  3. Set required environment variables.
  4. Save and restart Cline.
  5. Ensure the server is listed among active MCP servers.

Example JSON:

{
  "mcpServers": {
    "any-openapi": {
      "command": "npx",
      "args": ["@any-openapi/mcp-server@latest"],
      "env": {
        "OPENAPI_JSON_DOCS_URL": "https://yourapi.com/openapi.json"
      }
    }
  }
}

Securing API Keys:
Use environment variables as shown above to avoid exposing sensitive credentials.

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

(Image: FlowHunt MCP flow)

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "any-openapi": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “any-openapi” to the actual name of your MCP server and replace the URL with the correct endpoint.


Overview

Section | Availability | Details/Notes
Overview | ✅ |
List of Prompts | ⛔ | None found
List of Resources | ⛔ | None found
List of Tools | ✅ | Two tools: discovery & execution
Securing API Keys | ✅ | Via environment variables
Sampling Support (less important in evaluation) | ⛔ | Not mentioned

Roots support: Not specified in available documentation or code.


Based on the provided documentation and the above breakdown, this MCP server is focused, functional, and well-suited for API integration and dynamic endpoint discovery, but lacks explicit sample prompts/resources and documentation on sampling or roots. Its setup and security practices are clear and standard.

Our opinion

This MCP server is highly practical for developers needing to integrate large or private APIs with AI assistants, particularly Claude. Its utility is strong for semantic endpoint discovery and direct API execution, though more documentation and resource definition would be beneficial. Overall, it is a solid, focused server, but not a fully comprehensive example of all MCP features.

Rating: 7/10

MCP Score

Has a LICENSE | ✅ (MIT)
Has at least one tool | ✅
Number of Forks | 12
Number of Stars | 52

Frequently asked questions

What does the Any OpenAPI MCP Server do?

It lets AI assistants like Claude connect to any external API exposing an OpenAPI (Swagger) specification. It enables smart, semantic endpoint discovery and direct API request execution, making private or large API integration seamless.

Which AI assistants or clients are supported?

The server is designed for Claude but works with any AI client that supports MCP servers, including Windsurf, Cursor, and Cline.

How does endpoint discovery work?

It uses in-memory semantic search (FAISS with MiniLM-L3) to find relevant endpoints from OpenAPI documents based on natural language queries.

Is it secure to use API keys with this server?

Yes. Always use environment variables for API keys and other sensitive data, as shown in the configuration examples.

Can this server execute live API requests?

Yes. Once a relevant endpoint is discovered, the server enables the AI to execute API requests, retrieving live data or performing actions via the API.

What are typical use cases?

Integrating private APIs, automating workflows like product listing or user management, and rapidly discovering and using endpoints in large APIs.

Integrate Any API with FlowHunt's Any OpenAPI MCP Server

Supercharge your AI workflows by connecting Claude or other assistants to any OpenAPI-based API. Experience seamless, secure, and dynamic API integrations!

Learn more