OpenAPI MCP Server

Bridge the gap between AI agents and OpenAPI specs with the OpenAPI MCP Server—enabling API discovery, documentation, and code generation support for your workflows.

What does the “OpenAPI” MCP Server do?

The OpenAPI MCP Server is a Model Context Protocol (MCP) server that lets AI assistants (such as Claude and Cursor) search and explore OpenAPI specifications through oapis.org. By acting as a bridge, it gives AI models a working understanding of complex APIs in plain language. The server follows a three-step process: identifying the required OpenAPI specification, summarizing it in accessible terms, and detailing the endpoints and their usage. While it does not execute API endpoints directly (due to authentication limitations), it excels at providing API overviews, facilitating code generation, and supporting development workflows where understanding and documenting API structure is essential.
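
As a rough illustration of how a host application consumes this server, the sketch below uses the TypeScript client from the official @modelcontextprotocol/sdk package to launch the server over stdio (the same npx command used in the configurations later on this page) and list what it exposes. This is a minimal sketch, not part of the server's documentation; the OAS_API_KEY variable simply mirrors the example configurations below and may not be required in every setup.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the OpenAPI MCP Server over stdio, the same way Claude or Cursor would.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["@janwilmake/openapi-mcp-server@latest"],
    env: { OAS_API_KEY: process.env.OAS_API_KEY ?? "" }, // assumption: key passed as in the configs below
  });

  const client = new Client({ name: "openapi-mcp-demo", version: "1.0.0" });
  await client.connect(transport);

  // The server advertises prompts and resources for API context; per this page,
  // no executable tools are exposed in v2.
  const capabilities = client.getServerCapabilities();
  console.log("Declares tools capability:", Boolean(capabilities?.tools));

  const { prompts } = await client.listPrompts();
  console.log("Prompts:", prompts.map((p) => p.name));

  await client.close();
}

main().catch(console.error);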

List of Prompts

  • Overview Prompt: Requests a summary and understanding of an OpenAPI specification.
  • Operation Details Prompt: Retrieves detailed descriptions of specific API operations.
  • Endpoint Identification Prompt: Determines which endpoints are relevant based on a query.
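
Continuing the connected client from the sketch above, a host could request one of these prompts roughly as follows. The prompt name "overview" and the argument key "apiId" are illustrative placeholders, not documented identifiers; the real names come from the server's listPrompts() response.

// Placeholder prompt name and argument; check listPrompts() for the actual identifiers.
const prompt = await client.getPrompt({
  name: "overview",
  arguments: { apiId: "stripe" },
});

// The result is a set of messages that can be handed to the LLM as context.
for (const message of prompt.messages) {
  console.log(message.role, message.content);
}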

List of Resources

  • OpenAPI Specification Overview: Provides summaries of entire API specifications.
  • API Operation Details: Supplies contextual details about specific endpoints and their parameters.
  • Format Flexibility: Supports both JSON and YAML formatted API specifications.
  • Compatibility: Resources are tested with Claude Desktop and Cursor for seamless context delivery.
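
These resources are read through the standard MCP resource calls. The snippet below, again assuming the connected client from the first sketch, lists the advertised resources and reads one of them; the URIs come from the server rather than being hard-coded.

// Discover the resources the server exposes (spec overviews, operation details, ...).
const { resources } = await client.listResources();
for (const resource of resources) {
  console.log(resource.uri, "-", resource.name);
}

// Read the first advertised resource and pass its text to the model as context.
if (resources.length > 0) {
  const result = await client.readResource({ uri: resources[0].uri });
  console.log(result.contents[0]);
}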

List of Tools

  • No executable tools are exposed in v2; the server focuses on exploration and providing context about APIs but does not allow direct execution of endpoints as tools.

Use Cases of this MCP Server

  • API Documentation Generation: Automatically generate human-readable documentation from complex OpenAPI specs, making APIs easier to understand for developers.
  • API Code Generation Support: Assist developers in generating client code by providing clear endpoint descriptions and usage details.
  • API Discovery and Exploration: Quickly identify and summarize available endpoints, helping teams or AI models discover API capabilities.
  • Context Provision for AI Agents: Supply relevant API context to LLMs or agents, improving their ability to answer questions or write code involving external APIs.
  • Onboarding and Training: Help new team members or AI agents learn about unfamiliar APIs through simplified summaries and operation breakdowns.

How to set it up

Windsurf

  1. Ensure Node.js is installed on your system.
  2. Open your Windsurf configuration file.
  3. Add the OpenAPI MCP Server to the mcpServers section using the provided JSON snippet.
  4. Save the configuration and restart Windsurf.
  5. Verify connection to the MCP server.

Example configuration:

{
  "mcpServers": {
    "openapi-mcp": {
      "command": "npx",
      "args": ["@janwilmake/openapi-mcp-server@latest"],
      "env": {
        "OAS_API_KEY": "${OAS_API_KEY}"
      }
    }
  }
}

Note: Secure your API keys using environment variables as shown above.

Claude

  1. Install Node.js.
  2. Access Claude’s MCP integration settings.
  3. Add the OpenAPI MCP server with the following configuration.
  4. Save settings and restart Claude.
  5. Confirm that the server is available as an MCP resource.

Example configuration:

{
  "mcpServers": {
    "openapi-mcp": {
      "command": "npx",
      "args": ["@janwilmake/openapi-mcp-server@latest"],
      "env": {
        "OAS_API_KEY": "${OAS_API_KEY}"
      }
    }
  }
}

Cursor

  1. Make sure Node.js is installed.
  2. Locate Cursor’s configuration file.
  3. Insert the OpenAPI MCP server under mcpServers.
  4. Save and restart Cursor.
  5. Test with a sample OpenAPI query.

Example configuration:

{
  "mcpServers": {
    "openapi-mcp": {
      "command": "npx",
      "args": ["@janwilmake/openapi-mcp-server@latest"],
      "env": {
        "OAS_API_KEY": "${OAS_API_KEY}"
      }
    }
  }
}

Cline

  1. Install Node.js if not yet installed.
  2. Edit the Cline configuration file to include the OpenAPI MCP.
  3. Add the following JSON block.
  4. Save changes and restart Cline.
  5. Confirm MCP server is active.

Example configuration:

{
  "mcpServers": {
    "openapi-mcp": {
      "command": "npx",
      "args": ["@janwilmake/openapi-mcp-server@latest"],
      "env": {
        "OAS_API_KEY": "${OAS_API_KEY}"
      }
    }
  }
}

Securing API Keys:
Store sensitive keys in environment variables and reference them in your configuration as shown in the env property.

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

FlowHunt MCP flow

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "openapi-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “openapi-mcp” to the actual name of your MCP server and replace the URL with your own MCP server URL.


Overview

Section | Availability | Details/Notes
Overview | ✅ | –
List of Prompts | ✅ | –
List of Resources | ✅ | –
List of Tools | ✅ | No endpoint execution, context/exploration only
Securing API Keys | ✅ | Uses env variables in setup
Sampling Support (less important in evaluation) | ❌ | Not mentioned

Our opinion

The OpenAPI MCP Server is a focused and useful MCP that excels at providing context and exploration tools for OpenAPI specifications. Its lack of endpoint execution is a limitation for some advanced use cases, and sampling/roots support is not documented. However, its clear setup instructions, strong codebase, and active usage in the community make it a strong offering for developers needing API context and code generation support.

MCP Score

Has a LICENSE | ✅ (MIT)
Has at least one tool | ✅ (context tools)
Number of Forks | 76
Number of Stars | 691

Frequently asked questions

What is the OpenAPI MCP Server?

The OpenAPI MCP Server is a Model Context Protocol server that allows AI agents and developers to explore, summarize, and understand OpenAPI specifications via oapis.org. It provides API context and endpoint details but does not execute API endpoints directly.

What can I use the OpenAPI MCP Server for?

You can auto-generate API documentation, assist in code generation, explore available endpoints, provide API context to LLMs, and onboard team members with summarized API overviews.

Can the OpenAPI MCP Server execute API calls?

No, it does not execute API endpoints due to authentication and security considerations. It focuses on exploration, context, and documentation.

Is the OpenAPI MCP Server compatible with FlowHunt and other AI tools?

Yes, it's compatible with FlowHunt, Claude, Cursor, Cline, and other tools that support MCP servers, allowing seamless context delivery for AI agents.

How do I secure my API keys?

Always use environment variables to store sensitive keys, and reference them in the configuration under the 'env' property as shown in the setup instructions.

Try the OpenAPI MCP Server on FlowHunt

Supercharge your AI workflows with advanced API context, automatic documentation, and seamless integration into FlowHunt and popular AI agents.
