
OpenAPI MCP Server
Bridge the gap between AI agents and OpenAPI specs with the OpenAPI MCP Server—enabling API discovery, documentation, and code generation support for your workflows.
The OpenAPI MCP Server is a Model Context Protocol (MCP) server designed to connect AI assistants (such as Claude and Cursor) with the ability to search and explore OpenAPI specifications through oapis.org. By acting as a bridge, it enables AI models to gain a comprehensive understanding of complex APIs using simple language. The server follows a three-step process: identifying the required OpenAPI specification, summarizing it in accessible terms, and detailing the endpoints and their usage. While it does not execute API endpoints directly (due to authentication limitations), it excels at providing API overviews, facilitating code generation, and supporting development workflows where understanding and documenting API structure is essential.
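To make the three-step process concrete, here is a deliberately small, hypothetical OpenAPI document of the kind the server identifies and summarizes; this Pet Store spec is an illustrative placeholder rather than an example of the server's actual output:
{
  "openapi": "3.0.3",
  "info": {
    "title": "Pet Store API (hypothetical example)",
    "version": "1.0.0"
  },
  "paths": {
    "/pets": {
      "get": {
        "summary": "List all pets",
        "parameters": [
          {
            "name": "limit",
            "in": "query",
            "schema": { "type": "integer" }
          }
        ],
        "responses": {
          "200": { "description": "A paged array of pets" }
        }
      }
    }
  }
}
From a document like this, the server can explain in plain language that the API exposes a GET /pets endpoint accepting an optional limit query parameter, without ever calling that endpoint.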
To configure the server, add it under your MCP client's mcpServers section using the provided JSON snippet. Example configuration:
{
  "mcpServers": {
    "openapi-mcp": {
      "command": "npx",
      "args": ["@janwilmake/openapi-mcp-server@latest"],
      "env": {
        "OAS_API_KEY": "${OAS_API_KEY}"
      }
    }
  }
}
Note: Secure your API keys using environment variables as shown above.
Securing API Keys: Store sensitive keys in environment variables and reference them in your configuration as shown in the env property.
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "openapi-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change "openapi-mcp" to the actual name of your MCP server and to replace the URL with the URL of your own MCP server.
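For instance, if you had named your server my-openapi-server and deployed it at the placeholder address https://mcp.example.com/openapi-mcp (both the name and the URL here are hypothetical), the entry would look like this:
{
  "my-openapi-server": {
    "transport": "streamable_http",
    "url": "https://mcp.example.com/openapi-mcp"
  }
}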
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | |
| List of Prompts | ✅ | |
| List of Resources | ✅ | |
| List of Tools | ✅ | No endpoint execution, context/exploration only |
| Securing API Keys | ✅ | Uses env variables in setup |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
The OpenAPI MCP Server is a focused and useful MCP that excels at providing context and exploration tools for OpenAPI specifications. Its lack of endpoint execution is a limitation for some advanced use cases, and sampling/roots support is not documented. However, its clear setup instructions, strong codebase, and active usage in the community make it a strong offering for developers needing API context and code generation support.
| Has a LICENSE | ✅ (MIT) |
|---|---|
| Has at least one tool | ✅ (context tools) |
| Number of Forks | 76 |
| Number of Stars | 691 |
What is the OpenAPI MCP Server?
The OpenAPI MCP Server is a Model Context Protocol server that allows AI agents and developers to explore, summarize, and understand OpenAPI specifications via oapis.org. It provides API context and endpoint details but does not execute API endpoints directly.
What can I use it for?
You can auto-generate API documentation, assist in code generation, explore available endpoints, provide API context to LLMs, and onboard team members with summarized API overviews.
Does it execute API endpoints?
No, it does not execute API endpoints due to authentication and security considerations. It focuses on exploration, context, and documentation.
Is it compatible with FlowHunt and other AI tools?
Yes, it's compatible with FlowHunt, Claude, Cursor, Cline, and other tools that support MCP servers, allowing seamless context delivery for AI agents.
How should I secure my API keys?
Always use environment variables to store sensitive keys, and reference them in the configuration under the 'env' property as shown in the setup instructions.
Supercharge your AI workflows with advanced API context, automatic documentation, and seamless integration into FlowHunt and popular AI agents.