
Easily connect to any OpenAI-compatible chat API via a single MCP server, streamlining multi-provider LLM workflows in FlowHunt and beyond.
The any-chat-completions-mcp MCP Server acts as a bridge between AI assistants and any OpenAI SDK-compatible Chat Completion API, such as OpenAI, Perplexity, Groq, xAI, and PyroPrompts. By adhering to the Model Context Protocol (MCP), it enables seamless integration of external LLM providers into development workflows. Its primary function is to relay chat-based questions to a configured AI chat provider, allowing developers to utilize various LLMs as tools within their preferred environments. This makes tasks like switching between providers or scaling LLM usage straightforward, fostering flexibility and efficiency in AI-powered applications.
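To illustrate the relay pattern described above (this is not the server's actual implementation, which is a Node.js package), here is a minimal Python sketch assuming the same `AI_CHAT_BASE_URL`, `AI_CHAT_KEY`, and `AI_CHAT_MODEL` environment variables the server uses:

```python
import json
import os
import urllib.request

# Hedged sketch of what an OpenAI-compatible chat relay does: read the
# provider's base URL, key, and model from environment variables and
# forward a single user question to the /chat/completions endpoint.

def build_chat_request(question: str) -> urllib.request.Request:
    base_url = os.environ["AI_CHAT_BASE_URL"].rstrip("/")
    payload = {
        "model": os.environ["AI_CHAT_MODEL"],
        "messages": [{"role": "user", "content": question}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['AI_CHAT_KEY']}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def relay(question: str) -> str:
    # Send the request and return the assistant's reply text.
    with urllib.request.urlopen(build_chat_request(question)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because every OpenAI SDK-compatible provider accepts the same request shape, pointing `AI_CHAT_BASE_URL` at a different host is all it takes to swap providers.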
No prompt templates are mentioned in the repository or documentation.
No explicit MCP resources are documented in the repository or README.
No platform-specific instructions are provided for Windsurf in the repository or documentation.
For Claude Desktop:
1. Ensure Node.js and npx are installed.
2. Open claude_desktop_config.json (on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json; on Windows: %APPDATA%/Claude/claude_desktop_config.json).
3. Add the server entry to the mcpServers object.
4. Set your provider credentials in the env object.

JSON example:
```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": ["@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
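Because every provider shares the same variable names, running several providers side by side, or switching between them, only requires additional entries under mcpServers. A hedged example with a second, hypothetical Groq entry (verify the model name and base URL against Groq's documentation):

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": ["@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    },
    "chat-groq": {
      "command": "npx",
      "args": ["@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "GROQ_KEY",
        "AI_CHAT_NAME": "Groq",
        "AI_CHAT_MODEL": "llama-3.1-8b-instant",
        "AI_CHAT_BASE_URL": "https://api.groq.com/openai/v1"
      }
    }
  }
}
```

Each entry appears to the client as a separate chat tool, so an assistant can address either provider by name.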
Securing API Keys (using environment variables):

```json
"env": {
  "AI_CHAT_KEY": "YOUR_PROVIDER_KEY"
}
```
No platform-specific instructions are provided for Cursor in the repository or documentation.
No platform-specific instructions are provided for Cline in the repository or documentation.
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "MCP-name": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “MCP-name” to the actual name of your MCP server (e.g., “github-mcp”, “weather-api”) and to replace the URL with your own MCP server URL.
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | Covers purpose and features in README |
| List of Prompts | ⛔ | No prompt templates mentioned |
| List of Resources | ⛔ | No explicit MCP resources documented |
| List of Tools | ✅ | “chat” tool described in README |
| Securing API Keys | ✅ | Uses “env” in JSON for key management |
| Sampling Support (less important in evaluation) | ⛔ | No mention of sampling features |
Based on the above, any-chat-completions-mcp is a focused, streamlined MCP server ideal for adding generic OpenAI-compatible chat APIs as tools. Its main strength is simplicity and broad compatibility, though it lacks resource and prompt abstractions. For routine LLM integration, it’s robust, but power users may want more features. Overall, I would rate this MCP at 6/10 for general-purpose usage.
| Has a LICENSE | ✅ (MIT) |
|---|---|
| Has at least one tool | ✅ |
| Number of Forks | 17 |
| Number of Stars | 129 |
**What is the any-chat-completions-mcp server?**
It's an MCP server that bridges FlowHunt, or any MCP-compatible client, with any OpenAI SDK-compatible Chat Completion API, including providers like OpenAI, Perplexity, Groq, xAI, and PyroPrompts. It routes chat-based queries via a single, simple tool and configuration.

**What are its main use cases?**
Unified LLM integration, rapid provider switching, powering desktop AI agents, benchmarking LLMs, and acting as a secure API gateway for chat-based queries.

**How do I switch providers?**
Switching is as simple as updating environment variables (e.g., API key, base URL, model name) in your MCP server configuration. No code changes are needed; just restart your client after updating your config.

**Are API keys kept secure?**
Yes, API keys are managed via environment variables in the configuration, keeping credentials out of your codebase for better security.

**What tools does the server provide?**
A single 'chat' tool that relays chat-based messages to any configured OpenAI-compatible API endpoint.

**Does it support prompt templates or resources?**
No, the server is focused and streamlined for chat completions. It does not provide prompt templates or additional resource layers.
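Under the hood, an MCP client invokes that single tool with a standard JSON-RPC tools/call request. A sketch of such a message, assuming the tool is named "chat" and its argument is named "content" (the argument name is an assumption; consult the schema returned by tools/list for the real field):

```python
import json

# Hedged sketch of the JSON-RPC message an MCP client would send to invoke
# the server's chat tool. The argument key "content" is an assumption, not
# taken from the server's documented schema.

def make_chat_tool_call(question: str, request_id: int = 1) -> str:
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "chat",
            "arguments": {"content": question},
        },
    }
    return json.dumps(message)
```

The client serializes this over the configured transport (stdio for Claude Desktop, streamable HTTP for FlowHunt) and reads the tool result from the matching JSON-RPC response.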
Unify your AI chat API connections and switch providers effortlessly with the any-chat-completions-mcp MCP Server. Perfect for developers seeking flexibility and simplicity.