
Eunomia MCP Server
Eunomia MCP Server brings powerful data policy orchestration (PII, access control) to LLM pipelines, enabling secure and compliant AI workflows through seamless integration with FlowHunt’s MCP ecosystem.
Eunomia MCP Server is an extension of the Eunomia framework that connects Eunomia instruments with Model Context Protocol (MCP) servers. Its primary purpose is to orchestrate data governance policies, such as Personally Identifiable Information (PII) detection or user access control, across text streams handled by Large Language Model (LLM) applications. By integrating with the MCP ecosystem, Eunomia MCP Server lets developers enforce data policies on top of LLM or other text-based workflows and orchestrate communication between multiple servers using the MCP standard. This adds a robust layer of security and compliance, making it easier to govern data flows in AI-driven environments.
No prompt templates are mentioned in the repository or documentation.
No explicit MCP resources are detailed in the repository or documentation.
No clear list of MCP tools is provided in the repository or documentation.
Data Governance in LLM Pipelines
Eunomia MCP Server can enforce data governance policies such as PII detection and redaction in LLM-based text pipelines.
Multi-Server Orchestration
Developers can orchestrate multiple external MCP servers, ensuring coordinated policy enforcement across distributed systems.
Integration with External Tools
The server can connect with other MCP-based services (e.g., a web-browser-mcp-server) to expand the range of data governance and processing capabilities.
Automated Policy Enforcement
By defining instruments (such as PiiInstrument), organizations can ensure sensitive information is consistently handled according to policy.
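The snippet below is a minimal, self-contained sketch of the kind of policy an instrument such as PiiInstrument enforces. It is a conceptual stand-in written for illustration only, not the Eunomia API: the Instrument protocol, EmailRedactionInstrument class, and apply_policies helper are hypothetical names introduced here, and a real deployment would use the instruments shipped with the Eunomia framework.

```python
import re
from typing import Protocol


class Instrument(Protocol):
    """Hypothetical interface: one policy step applied to a text stream."""

    def process(self, text: str) -> str: ...


class EmailRedactionInstrument:
    """Toy PII policy: replaces email addresses with a placeholder."""

    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

    def process(self, text: str) -> str:
        return self.EMAIL_RE.sub("[REDACTED_EMAIL]", text)


def apply_policies(text: str, instruments: list[Instrument]) -> str:
    """Run each instrument in order, mimicking orchestrated policy enforcement."""
    for instrument in instruments:
        text = instrument.process(text)
    return text


if __name__ == "__main__":
    pipeline = [EmailRedactionInstrument()]
    print(apply_policies("Contact alice@example.com for access.", pipeline))
    # -> Contact [REDACTED_EMAIL] for access.
```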
To set up Eunomia MCP Server, make sure you have uv installed, then clone the repository:

```
git clone https://github.com/whataboutyou-ai/eunomia-mcp-server.git
```

Add the server to your MCP client configuration:

```json
{
  "mcpServers": {
    "eunomia-mcp-server": {
      "command": "uv",
      "args": ["tool", "run", "orchestra_server"],
      "env": {
        "REQUEST_TIMEOUT": "30"
      }
    }
  }
}
```
The same configuration block works in any other MCP-compatible client; install uv first if it is not already present.
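If you want to sanity-check the server before wiring it into a client, you can run the same command and arguments given in the configuration. The snippet below is a minimal sketch and assumes uv and the orchestra_server tool are already installed locally:

```python
import os
import subprocess

# Run the same command the MCP client would execute, taken from the
# configuration above. REQUEST_TIMEOUT is passed through the environment;
# stop the process with Ctrl+C once you have confirmed it starts.
env = {**os.environ, "REQUEST_TIMEOUT": "30"}
subprocess.run(["uv", "tool", "run", "orchestra_server"], env=env, check=True)
```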
Securing API keys:
Use environment variables in your configuration:
```json
{
  "mcpServers": {
    "eunomia-mcp-server": {
      "command": "uv",
      "args": ["tool", "run", "orchestra_server"],
      "env": {
        "API_KEY": "${EUNOMIA_API_KEY}",
        "REQUEST_TIMEOUT": "30"
      }
    }
  }
}
```

Replace ${EUNOMIA_API_KEY} with your environment variable.
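As a quick illustration of why this keeps credentials out of the file, the snippet below shows how a server process might read the key injected through the env block. It is a generic sketch, not code from the Eunomia repository, and assumes the server looks for a variable named API_KEY as in the example configuration:

```python
import os

# The MCP client substitutes ${EUNOMIA_API_KEY} from its own environment and
# passes the result to the server process as API_KEY, so the secret never
# needs to appear in the configuration file itself.
api_key = os.environ.get("API_KEY")
if not api_key:
    raise RuntimeError(
        "API_KEY is not set; export EUNOMIA_API_KEY before starting your MCP client."
    )
```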
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "eunomia-mcp-server": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “eunomia-mcp-server” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | None provided |
| List of Resources | ⛔ | None provided |
| List of Tools | ⛔ | None provided |
| Securing API Keys | ✅ | Example in setup instructions |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Taken together, the two tables show that this MCP provides a basic but important data governance orchestration layer for LLM applications, but lacks detailed documentation on prompts, resources, and tools. Given its deprecation notice and limited explicit features, the score is moderate for production use.
| Has a LICENSE | ✅ Apache-2.0 |
| --- | --- |
| Has at least one tool | ⛔ |
| Number of Forks | 2 |
| Number of Stars | 5 |
What is Eunomia MCP Server?
Eunomia MCP Server is an extension for orchestrating data governance policies (like PII detection and access control) across LLM-based applications, enabling secure, compliant, and automated text data handling through the MCP standard.
What use cases does it support?
It supports data governance in LLM pipelines (PII detection/redaction), orchestrating policy enforcement across multiple servers, integrating with other MCP-based tools, and automating sensitive data policy enforcement.
How do I connect it to FlowHunt?
Add the MCP server details in your flow’s system MCP configuration using the provided JSON snippet. Connect it to your AI agent to enable policy enforcement in your flows.
How do I secure API keys?
Use environment variables (e.g., API_KEY) in your MCP server configuration to securely store sensitive credentials, following the setup examples provided.
Is Eunomia MCP Server open source?
Yes, it is released under the Apache-2.0 license.
Enhance data compliance and automate policy enforcement in your LLM workflows with Eunomia MCP Server, fully integrated with FlowHunt.