Axiom MCP Server

Connect your AI agents to Axiom for real-time data querying and automated analytics. The Axiom MCP Server bridges FlowHunt with powerful data-driven insights, enabling interactive and informed AI conversations.


What does the “Axiom” MCP Server do?

The Axiom MCP (Model Context Protocol) Server lets AI assistants interface directly with the Axiom data platform over the Model Context Protocol. It enables AI agents to execute Axiom Processing Language (APL) queries and list available datasets, bridging conversational AI with real-time data analytics. Through this integration, developers and AI systems can query structured data, retrieve analytics, and automate insights from Axiom datasets within AI-driven environments. Tasks such as database querying and data exploration become accessible to AI clients, leading to more informed, context-aware AI interactions.

List of Prompts

This server does not currently support MCP prompts.

List of Resources

This server does not currently support MCP resources.

List of Tools

  • queryApl: Execute APL (Axiom Processing Language) queries against Axiom datasets. This tool enables AI agents to perform powerful analytic queries on your data stored in Axiom.
  • listDatasets: List available Axiom datasets. This allows AI agents to discover which datasets are accessible for querying within the connected Axiom account.
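To make the tool interface concrete, here is a sketch of the JSON-RPC 2.0 message an MCP client would send to invoke `queryApl`. The argument name (`query`), the dataset name (`http-logs`), and the APL expression are illustrative assumptions; check the server's actual tool schema via `tools/list` for the exact parameter names.

```python
import json

# Hypothetical tools/call request for the queryApl tool. The "query"
# argument name and the "http-logs" dataset are assumptions for
# illustration -- consult the server's tool schema for the real names.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "queryApl",
        "arguments": {
            "query": "['http-logs'] | where status >= 500 | count"
        },
    },
}

print(json.dumps(request, indent=2))
```

A `listDatasets` call follows the same shape, with `"name": "listDatasets"` and an empty `arguments` object.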

Use Cases of this MCP Server

  • Real-time Data Querying: Enables AI assistants to perform real-time APL queries on Axiom datasets, supporting data-driven conversations and insights.
  • Dataset Discovery: Allows AI agents to list and explore available datasets, simplifying data navigation and selection for further analysis.
  • Automated Analytics: Facilitates the automation of custom analytics by letting AI agents programmatically execute queries without manual intervention.
  • Enhanced AI-driven Decision Making: By integrating with Axiom, AI systems can ground their outputs in up-to-date data, improving the quality of recommendations or analyses.
  • Conversational Data Exploration: Developers can build workflows where users interactively explore datasets and run queries via natural language interfaces backed by this MCP server.

How to set it up

Windsurf

  1. Prerequisites: Ensure you have the latest Axiom MCP binary or install via Go (go install github.com/axiomhq/axiom-mcp@latest).
  2. Create a config file (e.g., config.txt) with your Axiom credentials.
  3. Edit Windsurf configuration to add the Axiom MCP server:
  4. Insert the following JSON into the mcpServers object:
    {
      "axiom": {
        "command": "/path/to/your/axiom-mcp-binary",
        "args": ["--config", "/path/to/your/config.txt"],
        "env": {
          "AXIOM_TOKEN": "xaat-your-token",
          "AXIOM_URL": "https://api.axiom.co",
          "AXIOM_ORG_ID": "your-org-id"
        }
      }
    }
    
  5. Save and restart Windsurf, then verify the server is active.

Claude

  1. Download or install the Axiom MCP binary.
  2. Create a configuration file (config.txt) with your Axiom API token and other parameters.
  3. Edit the Claude desktop app configuration:
    Open ~/Library/Application Support/Claude/claude_desktop_config.json.
  4. Add the MCP server entry:
    {
      "mcpServers": {
        "axiom": {
          "command": "/path/to/your/axiom-mcp-binary",
          "args": ["--config", "/path/to/your/config.txt"],
          "env": {
            "AXIOM_TOKEN": "xaat-your-token",
            "AXIOM_URL": "https://api.axiom.co",
            "AXIOM_ORG_ID": "your-org-id"
          }
        }
      }
    }
    
  5. Restart Claude and check connectivity.

Cursor

  1. Install the Axiom MCP binary.
  2. Prepare your configuration file as described above.
  3. Locate Cursor’s configuration file for MCP servers.
  4. Add the following JSON to configure the Axiom MCP:
    {
      "mcpServers": {
        "axiom": {
          "command": "/path/to/your/axiom-mcp-binary",
          "args": ["--config", "/path/to/your/config.txt"],
          "env": {
            "AXIOM_TOKEN": "xaat-your-token",
            "AXIOM_URL": "https://api.axiom.co",
            "AXIOM_ORG_ID": "your-org-id"
          }
        }
      }
    }
    
  5. Restart Cursor and verify the setup.

Cline

  1. Obtain and install the Axiom MCP server binary.
  2. Create and fill in your config.txt with the necessary settings.
  3. Open Cline’s MCP configuration file.
  4. Register the Axiom MCP server:
    {
      "mcpServers": {
        "axiom": {
          "command": "/path/to/your/axiom-mcp-binary",
          "args": ["--config", "/path/to/your/config.txt"],
          "env": {
            "AXIOM_TOKEN": "xaat-your-token",
            "AXIOM_URL": "https://api.axiom.co",
            "AXIOM_ORG_ID": "your-org-id"
          }
        }
      }
    }
    
  5. Save and relaunch Cline to activate the server.

Securing API Keys
Never commit API tokens or other secrets to version control, and avoid hard-coding them in shared configuration files. Keep them in environment variables and pass them to the server through the `env` block, replacing the placeholders with values sourced from your environment:

"env": {
  "AXIOM_TOKEN": "xaat-your-token",
  "AXIOM_URL": "https://api.axiom.co",
  "AXIOM_ORG_ID": "your-org-id"
}
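One way to honor this in practice is to generate the server entry at deploy time, pulling secrets from the process environment rather than writing them into the file by hand. This is an illustrative sketch, not part of the Axiom MCP tooling; the binary and config paths are placeholders.

```python
import json
import os

# Build the mcpServers entry programmatically, reading secrets from the
# environment so they never appear in a committed file. Missing variables
# fall back to empty strings (or the public API URL) for clarity.
entry = {
    "axiom": {
        "command": "/path/to/your/axiom-mcp-binary",
        "args": ["--config", "/path/to/your/config.txt"],
        "env": {
            "AXIOM_TOKEN": os.environ.get("AXIOM_TOKEN", ""),
            "AXIOM_URL": os.environ.get("AXIOM_URL", "https://api.axiom.co"),
            "AXIOM_ORG_ID": os.environ.get("AXIOM_ORG_ID", ""),
        },
    }
}

print(json.dumps(entry, indent=2))
```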

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

FlowHunt MCP flow

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "axiom": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP server as a tool with access to all of its functions and capabilities. Remember to change “axiom” to the actual name of your MCP server and replace the URL with your own MCP server URL.
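Before pasting the snippet into FlowHunt, it can be worth sanity-checking it. The check below is a minimal, illustrative sketch: it assumes each top-level key is a server entry that needs a `transport` and an HTTPS `url`, which matches the format shown above.

```python
import json

# Example FlowHunt-style MCP configuration (placeholder URL, as above).
config_text = """
{
  "axiom": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
"""

config = json.loads(config_text)
for name, server in config.items():
    # Each entry must declare a transport and an HTTPS endpoint.
    assert "transport" in server, f"{name}: missing transport"
    assert server.get("url", "").startswith("https://"), f"{name}: bad url"

print("config OK")
```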


Overview

Section | Availability | Details/Notes
Overview | ✅ | Overview and functionality explained
List of Prompts | ⛔ | No prompt support
List of Resources | ⛔ | No resource support
List of Tools | ✅ | queryApl, listDatasets
Securing API Keys | ✅ | Via env variables in config
Sampling Support (less important in evaluation) | ⛔ | Not mentioned
Roots Support | ⛔ | Not mentioned


Between the two tables, I would rate this MCP as a 5/10. It provides essential tools and clear setup instructions, but lacks advanced MCP features like resources, prompts, and sampling support, which limits its extensibility and integration depth.


MCP Score

Has a LICENSE | ✅ (MIT)
Has at least one tool | ✅
Number of Forks | 8
Number of Stars | 49

Frequently asked questions

What does the Axiom MCP Server do?

The Axiom MCP Server allows AI agents to connect directly to the Axiom data platform, execute Axiom Processing Language (APL) queries, and list datasets. This empowers AI-driven workflows with up-to-date analytics and data exploration capabilities.

Which tools are available in the Axiom MCP Server?

The server provides two main tools: `queryApl` for executing analytic queries using APL, and `listDatasets` to discover available datasets in your Axiom account.

What are common use cases for this integration?

Typical use cases include real-time data querying for conversational AI, automated analytics, dataset discovery, and building workflows where AI agents interactively analyze and explore data.

How do I secure my Axiom API keys when setting up?

Always store sensitive values such as AXIOM_TOKEN, AXIOM_URL, and AXIOM_ORG_ID as environment variables in your configuration, not directly in your flow or code.

How do I connect the Axiom MCP Server to my FlowHunt flow?

Add the MCP component to your flow, open its configuration, and insert the MCP server details in JSON format, specifying transport and URL. Replace the default placeholders with your actual MCP server information.

Integrate Axiom Analytics into Your AI Workflows

Empower your AI agents with direct access to Axiom datasets and real-time analytics. Try the Axiom MCP Server on FlowHunt today.

Learn more