any-chat-completions-mcp MCP Server
Easily connect to any OpenAI-compatible chat API via a single MCP server, streamlining multi-provider LLM workflows in FlowHunt and beyond.

What does “any-chat-completions-mcp” MCP Server do?
The any-chat-completions-mcp MCP Server acts as a bridge between AI assistants and any OpenAI SDK-compatible Chat Completion API, such as OpenAI, Perplexity, Groq, xAI, and PyroPrompts. By adhering to the Model Context Protocol (MCP), it enables seamless integration of external LLM providers into development workflows. Its primary function is to relay chat-based questions to a configured AI chat provider, allowing developers to utilize various LLMs as tools within their preferred environments. This makes tasks like switching between providers or scaling LLM usage straightforward, fostering flexibility and efficiency in AI-powered applications.
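Under the hood, the relay is just an OpenAI-style chat-completions request built from the configured environment variables. A minimal sketch of the payload shape the server would produce (illustrative only — the actual server is written in TypeScript and uses the OpenAI SDK, but the wire format is equivalent):

```python
def build_chat_request(question: str, model: str, base_url: str) -> dict:
    """Build an OpenAI-compatible chat-completions request.

    Mirrors what the server relays: the configured base URL plus the
    standard /chat/completions path, with the question as a user message.
    """
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": question}],
        },
    }
```

Because every supported provider accepts this same shape, switching providers only changes `model` and `base_url`, never the request structure.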
List of Prompts
No prompt templates are mentioned in the repository or documentation.
List of Resources
No explicit MCP resources are documented in the repository or README.
List of Tools
- chat: Relays a question to a configured AI Chat Provider. This is the main (and only) tool exposed by the server, allowing LLMs or clients to send chat-based queries to any OpenAI-compatible API endpoint.
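An MCP client invokes the tool with a plain-text question. A hypothetical tool-call payload is sketched below; the argument name is an assumption, since the exact input schema is defined by the server itself:

```json
{
  "name": "chat",
  "arguments": {
    "content": "Summarize the Model Context Protocol in one sentence."
  }
}
```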
Use Cases of this MCP Server
- Unified LLM Integration: Developers can use a single MCP server to access multiple LLM providers without changing their client code, simplifying provider management.
- Provider Switching: Easily switch between OpenAI, PyroPrompts, Perplexity, and others by updating environment variables, useful for cost optimization or fallback strategies.
- Custom Desktop AI Agents: Integrate advanced chat-based LLMs into desktop applications (e.g., Claude Desktop) to power enhanced assistant features.
- Experimentation and Benchmarking: Rapidly compare outputs from different LLMs in a standardized way for research, QA, or product development.
- API Gateway for LLMs: Acts as a lightweight gateway for securely routing chat messages to various LLM APIs, centralizing API key and endpoint management.
How to set it up
Windsurf
No platform-specific instructions are provided for Windsurf in the repository or documentation.
Claude
- Prerequisite: Ensure Node.js and `npx` are installed.
- Locate Config File: Edit `claude_desktop_config.json` (on macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`; on Windows: `%APPDATA%/Claude/claude_desktop_config.json`).
- Add MCP Server: Add the MCP server configuration under the `mcpServers` object.
- Set Environment Variables: Place provider API keys and other settings in the `env` object.
- Save and Restart: Save the file and restart Claude Desktop to apply the changes.
JSON Example:

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": ["@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
Securing API Keys (using environment variables):

```json
"env": {
  "AI_CHAT_KEY": "YOUR_PROVIDER_KEY"
}
```
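To target a different provider, only the `env` block changes. A hypothetical second entry for Perplexity is sketched below (the model name is an assumption — check the provider's documentation for current model IDs):

```json
"chat-perplexity": {
  "command": "npx",
  "args": ["@pyroprompts/any-chat-completions-mcp"],
  "env": {
    "AI_CHAT_KEY": "PERPLEXITY_KEY",
    "AI_CHAT_NAME": "Perplexity",
    "AI_CHAT_MODEL": "sonar",
    "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
  }
}
```

Multiple entries can coexist under `mcpServers`, exposing each provider as a separately named tool.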
Cursor
No platform-specific instructions are provided for Cursor in the repository or documentation.
Cline
No platform-specific instructions are provided for Cline in the repository or documentation.
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "MCP-name": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “MCP-name” to the actual name of your MCP server (e.g., “github-mcp”, “weather-api”) and replace the URL with your own MCP server's URL.
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Covers purpose and features in README |
| List of Prompts | ⛔ | No prompt templates mentioned |
| List of Resources | ⛔ | No explicit MCP resources documented |
| List of Tools | ✅ | “chat” tool described in README |
| Securing API Keys | ✅ | Uses “env” in JSON for key management |
| Sampling Support (less important in evaluation) | ⛔ | No mention of sampling features |
Based on the above, any-chat-completions-mcp is a focused, streamlined MCP server ideal for adding generic OpenAI-compatible chat APIs as tools. Its main strength is simplicity and broad compatibility, though it lacks resource and prompt abstractions. For routine LLM integration, it’s robust, but power users may want more features. Overall, I would rate this MCP at 6/10 for general-purpose usage.
MCP Score
| Criterion | Value |
| --- | --- |
| Has a LICENSE | ✅ (MIT) |
| Has at least one tool | ✅ |
| Number of Forks | 17 |
| Number of Stars | 129 |
Frequently asked questions
- What is any-chat-completions-mcp?
It's an MCP Server that bridges FlowHunt or any MCP-compatible client with any OpenAI SDK-compatible Chat Completion API, including providers like OpenAI, Perplexity, Groq, xAI, and PyroPrompts. It routes chat-based queries via a single, simple tool and configuration.
- What are the main use cases for this MCP Server?
Unified LLM integration, rapid provider switching, powering desktop AI agents, benchmarking LLMs, and acting as a secure API gateway for chat-based queries.
- How do I switch between LLM providers?
Switching is as simple as updating environment variables (e.g., API key, base URL, model name) in your MCP server configuration. No code changes are needed—just restart your client after updating your config.
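For instance, pointing the same server at Groq's OpenAI-compatible endpoint would only require swapping the `env` values (the model name shown is an assumption):

```json
"env": {
  "AI_CHAT_KEY": "GROQ_KEY",
  "AI_CHAT_NAME": "Groq",
  "AI_CHAT_MODEL": "llama-3.1-8b-instant",
  "AI_CHAT_BASE_URL": "https://api.groq.com/openai/v1"
}
```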
- Is this server secure for managing API keys?
Yes, API keys are managed via environment variables in the configuration, keeping credentials out of your codebase for better security.
- What is the main tool provided by this MCP Server?
A single 'chat' tool that relays chat-based messages to any configured OpenAI-compatible API endpoint.
- Does it support prompt templates or resource abstractions?
No, the server is focused and streamlined for chat completions. It does not provide prompt templates or additional resource layers.
Integrate any-chat-completions-mcp in FlowHunt
Unify your AI chat API connections and switch providers effortlessly with the any-chat-completions-mcp MCP Server. Perfect for developers seeking flexibility and simplicity.