MCP Proxy Server
Aggregate multiple MCP servers into a single, unified endpoint for streamlined AI workflows, with real-time streaming and centralized configuration.

What does the “MCP Proxy” MCP Server do?
The MCP Proxy Server is a tool that aggregates and serves multiple MCP (Model Context Protocol) resource servers through a single HTTP server. By acting as a proxy, it allows AI assistants and clients to connect to several different MCP servers at once, combining their tools, resources, and capabilities into a unified interface. This setup simplifies integration, as developers and AI workflows can access a variety of external data sources, APIs, or services through a single endpoint. The MCP Proxy Server supports real-time updates via SSE (Server-Sent Events) or HTTP streaming and is highly configurable, making it easier to perform complex tasks such as database queries, file management, or API interactions by routing them through the appropriate underlying MCP servers.
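For orientation, the sketch below shows roughly how such an aggregating configuration might look: the proxy listens on one address and launches (or connects to) several backend MCP servers, each keyed by name. The field names (mcpProxy, addr, baseURL, authTokens, and the per-server command/args/env or url entries) are modeled on the project's example configuration and should be treated as illustrative placeholders; consult the upstream README for the authoritative schema.
{
  "mcpProxy": {
    "addr": ":9090",
    "baseURL": "https://your-proxy.example.com",
    "options": {
      "logEnabled": true,
      "authTokens": ["<YOUR_PROXY_TOKEN>"]
    }
  },
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>" }
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "amap": {
      "url": "https://<REMOTE_MCP_SERVER>/sse"
    }
  }
}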
List of Prompts
No information about prompt templates is provided in the repository or documentation.
List of Resources
No explicit resources are documented in the repository or example configuration. The server aggregates resources from connected MCP servers, but none are listed directly.
List of Tools
No tools are directly provided by the MCP Proxy Server itself; it acts as a proxy to tools from other configured MCP servers (such as github, fetch, or amap, as seen in the configuration example).
Use Cases of this MCP Server
- Aggregating Multiple MCP Servers: Developers can connect several different MCP servers (e.g., for GitHub, Fetch, or Amap) through one proxy endpoint, simplifying setup and management.
- Unified API Gateway: Acts as a unified gateway for AI assistants to access various external APIs and data sources via the MCP protocol, reducing integration complexity (a client-side sketch follows this list).
- Real-Time Data Streaming: Supports SSE/HTTP streaming, enabling real-time updates from underlying MCP resource servers.
- Flexible Client Support: Can interface with different types of clients (stdio, sse, streamable-http), making it adaptable for diverse workflow requirements.
- Centralized Authentication & Logging: Offers centralized configuration for authentication tokens and logging, improving security and traceability when accessing multiple MCP resources.
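From the client's perspective, the gateway use case means a single configuration entry pointing at the proxy instead of one entry per backend server. Below is a minimal sketch using the same JSON shape shown in the FlowHunt integration section later in this guide; the URL is a placeholder, since the exact endpoint path exposed by the proxy depends on how it is configured.
{
  "mcp-proxy": {
    "transport": "streamable_http",
    "url": "https://your-proxy.example.com/path-to-mcp-endpoint"
  }
}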
How to set it up
Windsurf
- Ensure you have Node.js and access to Windsurf configuration files.
- Open your Windsurf configuration and locate the mcpServers section.
- Add the MCP Proxy Server using the following JSON snippet:
"mcpServers": {
  "mcp-proxy": {
    "command": "npx",
    "args": ["@TBXark/mcp-proxy@latest"],
    "env": {
      "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
    }
  }
}
- Save your configuration and restart Windsurf.
- Verify the MCP Proxy Server appears in the Windsurf UI.
Note: Secure your API keys using environment variables as shown above.
Claude
- Locate Claude’s configuration interface or file.
- Add the MCP Proxy Server to the mcpServers section:
"mcpServers": {
  "mcp-proxy": {
    "command": "npx",
    "args": ["@TBXark/mcp-proxy@latest"],
    "env": {
      "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
    }
  }
}
- Save the configuration and restart Claude.
- Confirm the MCP Proxy Server is recognized by Claude.
Note: Use environment variables for secret tokens.
Cursor
- Make sure you have Node.js installed and access to Cursor configuration.
- Edit the Cursor configuration and add the following:
"mcpServers": { "mcp-proxy": { "command": "npx", "args": ["@TBXark/mcp-proxy@latest"], "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>" } } }
- Save changes and restart Cursor.
- Check that the MCP Proxy Server is available.
Note: Use environment variables for sensitive credentials.
Cline
- Open the Cline configuration file.
- Insert the MCP Proxy Server details:
"mcpServers": { "mcp-proxy": { "command": "npx", "args": ["@TBXark/mcp-proxy@latest"], "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>" } } }
- Save and restart Cline.
- Confirm functionality in the Cline interface.
Note: Secure API keys using the env property, as shown in the example.
Example: Securing API Keys
"mcpServers": {
"github": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-github"],
"env": {
"GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
}
}
}
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "mcp-proxy": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool, with access to all of its functions and capabilities. Remember to change “mcp-proxy” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates documented in repo. |
| List of Resources | ⛔ | No explicit resource definitions; aggregates from other MCP servers. |
| List of Tools | ⛔ | No direct tools; proxies tools from configured servers. |
| Securing API Keys | ✅ | Configuration supports env for secrets. |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned in available documentation. |
Based on the above, the MCP Proxy is a useful aggregation layer for MCP resources but lacks direct tools, resources, or prompt templates; it’s mainly a configuration and routing solution.
Our opinion
This MCP server is best rated as a backend utility, not suited for standalone use but excellent for aggregating and managing multiple MCP servers in a unified workflow. Its documentation is clear for configuration and security, but lacks details on prompts, tools, and resources. Overall, it is a solid infrastructure piece for advanced users. Score: 5/10.
MCP Score
| Has a LICENSE | ✅ (MIT) |
| --- | --- |
| Has at least one tool | ⛔ (Proxy only, no tools) |
| Number of Forks | 43 |
| Number of Stars | 315 |
Frequently asked questions
- What is the MCP Proxy Server?
The MCP Proxy Server is a backend utility that aggregates multiple MCP (Model Context Protocol) resource servers into a single HTTP server. It enables AI assistants and developers to access tools, APIs, and data sources from several MCP servers via a unified endpoint, simplifying integration and management.
- What are the main use cases for the MCP Proxy Server?
Key use cases include: aggregating multiple MCP servers for simplified access, acting as a unified API gateway for diverse data sources, supporting real-time data streaming via SSE/HTTP, enabling flexible client integration, and centralizing authentication and logging for security.
- Does the MCP Proxy Server provide its own tools or resources?
No, the MCP Proxy Server does not directly provide tools or resources. Instead, it proxies and aggregates tools and resources from the underlying MCP servers configured in your environment.
- How do I secure sensitive API keys when configuring the MCP Proxy Server?
Always use environment variables (the `env` property in your configuration) to store secrets like API tokens, as shown in the example setup for each client. This helps ensure your credentials remain secure and are not exposed in configuration files.
- How do I use the MCP Proxy Server within FlowHunt?
Add an MCP component to your flow, and in the system MCP configuration, insert your MCP Proxy Server details in JSON format. This lets your AI agent access all aggregated tools and resources via a single endpoint. Make sure to update the server name and URL for your setup.
Integrate MCP Proxy Server with FlowHunt
Unify your AI and automation workflows by connecting multiple MCP servers through the powerful MCP Proxy. Simplify your integration today.