Tyk Dashboard MCP Server
Expose any OpenAPI-compatible API as AI-accessible tools for your agents. The Tyk Dashboard MCP Server makes it easy to automate, test, and manage APIs with LLM-driven workflows.

What does the Tyk Dashboard MCP Server do?
The Tyk Dashboard MCP Server transforms OpenAPI or Swagger specifications into MCP (Model Context Protocol) servers, allowing AI assistants to interact directly with REST APIs by exposing API endpoints as tools. It provides automated support for API requests, authentication, and parameter handling, so external APIs integrate seamlessly with AI-powered clients. Dynamic spec loading, support for overlays, and customizable mappings make it suitable for exposing any RESTful API to LLM-powered agents. Developers benefit by making their APIs immediately accessible for querying, file management, and other automated actions, streamlining integration and reducing manual overhead.
List of Prompts
No information about reusable prompt templates provided in the repository or documentation.
List of Resources
No explicit resources (as MCP resources) are listed in the available documentation or codebase.
List of Tools
- Dynamic OpenAPI Operations as Tools
The server automatically exposes each operation defined in a loaded OpenAPI specification as an MCP tool. Each REST API endpoint (e.g., GET, POST, PUT, DELETE routes) becomes an AI-accessible function, with full support for parameters, authentication, and operation metadata.
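As a concrete illustration of this mapping, the sketch below (hypothetical code, not the server's actual implementation) enumerates the operations in a small OpenAPI document into tool descriptors. The `operationId`-based naming convention is an assumption for the example:

```python
# Illustrative sketch: how each OpenAPI operation can surface as a named tool.
# Spec fields follow the OpenAPI 3.x structure; the naming scheme is assumed.

MINI_SPEC = {
    "paths": {
        "/pet/{petId}": {
            "get": {"operationId": "getPetById", "summary": "Find pet by ID"},
            "delete": {"operationId": "deletePet", "summary": "Delete a pet"},
        },
        "/pet": {
            "post": {"operationId": "addPet", "summary": "Add a new pet"},
        },
    }
}

def operations_as_tools(spec):
    """Yield one tool descriptor per (path, method) operation."""
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            yield {
                "name": op.get("operationId", f"{method}_{path}"),
                "description": op.get("summary", ""),
                "method": method.upper(),
                "path": path,
            }

tools = list(operations_as_tools(MINI_SPEC))
print([t["name"] for t in tools])
# → ['getPetById', 'deletePet', 'addPet']
```

Each descriptor carries the metadata an agent needs to call the endpoint: the HTTP method, path template, and a human-readable description.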
Use Cases of this MCP Server
- API Integration for AI Assistants
Instantly expose any OpenAPI-compatible API to LLM-powered agents for querying, updating, or managing external data sources.
- Rapid Prototyping of API-Driven Workflows
Enable developers to quickly test and iterate on workflows involving external APIs by making endpoints available as configurable tools within AI environments.
- Automated API Testing
Use LLMs to automate and validate API requests, responses, and authentication flows through the MCP server.
- Custom API Tooling for Internal Teams
Provide internal users or teams with branded, AI-accessible versions of corporate APIs for automation, reporting, or management.
- Standardized AI-API Interfaces
Transform API endpoints into standardized, discoverable tools that can be reused across multiple LLM agents or development projects.
How to set it up
Windsurf
No setup instructions for Windsurf provided.
Claude
- Ensure you have Node.js installed on your computer.
- Open Claude Desktop and navigate to Settings > Developer.
- Edit or create the configuration file:
- macOS:
~/Library/Application Support/Claude/claude_desktop_config.json
- Windows:
%APPDATA%\Claude\claude_desktop_config.json
- Add this configuration (customize as needed):
```json
{
  "mcpServers": {
    "api-tools": {
      "command": "npx",
      "args": [
        "-y",
        "@tyktechnologies/api-to-mcp",
        "--spec",
        "https://petstore3.swagger.io/api/v3/openapi.json"
      ],
      "enabled": true
    }
  }
}
```
- Restart Claude Desktop.
- You should now see a hammer icon in the chat input for API tools.
Cursor
No setup instructions for Cursor provided.
Cline
No setup instructions for Cline provided.
Securing API Keys
While the server supports passing custom HTTP headers via environment variables and the CLI, the documentation does not include an explicit example of securing API keys in the configuration. Load sensitive keys from environment variables in your system or deployment configuration rather than hardcoding them.
Example (conceptual):
```json
{
  "mcpServers": {
    "api-tools": {
      "env": {
        "API_KEY": "your_api_key"
      },
      "inputs": {
        "header": "Authorization: Bearer ${API_KEY}"
      }
    }
  }
}
```
Note: Adapt this according to your environment and security policies.
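The env-variable pattern above can be sketched in a few lines: the secret lives only in the process environment, and the configuration carries a `${VAR}` placeholder that is expanded at runtime. This is a hypothetical illustration, not the server's code; `DEMO_API_KEY` is a made-up variable name:

```python
# Hypothetical sketch of env-based secret handling: only a ${VAR}
# placeholder appears in config; the value comes from the environment.
import os

os.environ["DEMO_API_KEY"] = "demo-key"  # normally exported by your shell or CI

def build_auth_header(template: str) -> dict:
    """Expand ${VAR} references from the environment into a header dict."""
    expanded = os.path.expandvars(template)
    name, _, value = expanded.partition(": ")
    return {name: value}

header = build_auth_header("Authorization: Bearer ${DEMO_API_KEY}")
print(header)
# → {'Authorization': 'Bearer demo-key'}
```

Because the expansion happens at runtime, the same config file can move between environments (dev, CI, production) without ever containing a real credential.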
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "MCP-name": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “MCP-name” to the actual name of your MCP server (e.g., “github-mcp”, “weather-api”) and to replace the URL with your own MCP server's URL.
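Under the `streamable_http` transport, MCP messages are JSON-RPC 2.0 requests POSTed to the configured URL. The hypothetical sketch below builds the standard `tools/list` request an MCP client would send to enumerate the server's tools (it only constructs the payload; it does not send it):

```python
# Hypothetical sketch: the JSON-RPC 2.0 payload an MCP client POSTs over
# the streamable_http transport to enumerate a server's tools.
import json

def list_tools_request(request_id: int = 1) -> str:
    """Build the standard MCP `tools/list` JSON-RPC request."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    })

payload = list_tools_request()
print(payload)
```

FlowHunt handles this exchange for you; the sketch is only meant to show what the transport setting refers to.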
Overview
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | Found in README.md and project description |
| List of Prompts | ⛔ | No prompt templates mentioned |
| List of Resources | ⛔ | No explicit MCP resources listed |
| List of Tools | ✅ | OpenAPI operations as tools |
| Securing API Keys | ✅ | Supported via env variables and custom headers; not fully documented |
| Sampling Support (less important in evaluation) | ⛔ | No evidence of sampling support found |
Roots support: the presence of a `.roo` directory suggests root boundaries may be supported, but this is not explicitly documented.
Based on the two tables, the Tyk Dashboard MCP Server provides a robust way to turn OpenAPI endpoints into AI-usable tools. However, it lacks documentation/examples for prompt templates, explicit MCP resources, and details for some platforms. Sampling support and roots are not clearly addressed. Overall, this MCP server scores well for tool coverage and licensing, but could improve in documentation and feature breadth.
MCP Score
| Has a LICENSE | ✅ (MIT) |
|---|---|
| Has at least one tool | ✅ |
| Number of Forks | 9 |
| Number of Stars | 1 |
RATING: 6/10
Frequently asked questions
- What does the Tyk Dashboard MCP Server do?
It transforms OpenAPI or Swagger specifications into MCP servers, making REST API endpoints directly accessible as tools for AI-powered agents. This enables LLMs to interact with, automate, and manage APIs.
- Which platforms are supported for setup?
Explicit setup instructions are provided for Claude Desktop. Other platforms (Windsurf, Cursor, Cline) are not explicitly documented but may be supported with custom configuration.
- How does the MCP server expose API endpoints?
Each operation (GET, POST, PUT, DELETE, etc.) in your OpenAPI specification is made available as an MCP tool for your AI agent, with support for parameters, authentication, and operation metadata.
- How are API keys and credentials secured?
API keys should be passed using environment variables and custom headers in your configuration. Sensitive information must not be hardcoded and should follow your security best practices.
- What are the main use cases?
Integrate APIs for AI assistants, automate API testing, enable rapid prototyping of workflows, provide internal API tooling, and create standardized AI-API interfaces with minimal effort.
Integrate APIs with FlowHunt's Tyk Dashboard MCP Server
Instantly turn your OpenAPI endpoints into AI-usable tools. Accelerate automation, testing, and prototyping by connecting your APIs to FlowHunt-powered AI assistants.