Glean MCP Server Integration
The Glean MCP Server brings enterprise-grade search and conversational Q&A to your FlowHunt AI agents via seamless integration with the Glean API.

What does the “Glean” MCP Server do?
The Glean MCP Server is an implementation of the Model Context Protocol (MCP) that integrates with the Glean API, giving AI assistants access to Glean's search and chat capabilities. By connecting to external data sources through the Glean platform, the server lets AI agents retrieve search results or hold Q&A conversations, extending what development workflows can do. With the Glean MCP Server, users can query enterprise knowledge bases and chat with a Glean-powered assistant directly from their AI tools, making it easier to surface relevant information and interact with organizational data. The integration is especially valuable for teams and organizations that rely on Glean for internal knowledge management and want to extend those capabilities to AI-powered applications.
List of Prompts
No prompt templates are mentioned in the repository.
List of Resources
No resources are explicitly documented in the repository.
List of Tools
- Search: Returns a list of search results for a given query using the Glean API.
- Chat: Provides a Q&A interface with the Glean-powered chatbot.
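To make these tool interfaces concrete, here is a hedged sketch of the JSON-RPC tools/call requests an MCP client could send to this server. The tools/call method itself is standard MCP, but the tool names (search, chat) and argument fields (query, message) are illustrative assumptions based on the tool descriptions above, not schemas confirmed from the repository:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "employee onboarding checklist" }
  }
}
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "chat",
    "arguments": { "message": "Where is the VPN setup guide?" }
  }
}
In both cases the server responds with MCP content blocks that the connected AI agent can read and incorporate into its answer.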
Use Cases of this MCP Server
- Enterprise Knowledge Search: Developers and users can retrieve relevant internal documentation, files, or messages from the Glean knowledge base using natural language queries.
- Conversational Q&A: Enables AI assistants to answer user questions by interacting with the Glean-powered chatbot, improving information accessibility.
- Workflow Integration: Embeds search and chat capabilities into developer tools or custom apps, streamlining information retrieval within existing workflows.
- Automated Assistance: Allows AI systems to surface organizational knowledge automatically when users are working on specific tasks or codebases.
- Support Automation: Integrates with support tools to provide instant knowledge-based answers to customer or team queries.
How to set it up
Windsurf
No setup instructions for Windsurf are provided in the repository.
Claude
- Build the Docker image:
docker build -t glean-server:latest -f src/glean/Dockerfile .
- Open your claude_desktop_config.json file.
- Add the Glean MCP Server configuration:
{
  "mcpServers": {
    "glean-server": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "GLEAN_API_KEY",
        "-e",
        "GLEAN_DOMAIN",
        "glean-server"
      ],
      "env": {
        "GLEAN_API_KEY": "YOUR_API_KEY_HERE",
        "GLEAN_DOMAIN": "YOUR_DOMAIN_HERE"
      }
    }
  }
}
- Save the configuration file.
- Restart Claude Desktop to apply the changes.
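Before restarting Claude Desktop, you can optionally sanity-check the image from a terminal. This is a hedged example that mirrors the args in the configuration above and assumes the image's entrypoint starts the MCP server on stdio; substitute your real credentials:
docker run -i --rm \
  -e GLEAN_API_KEY=YOUR_API_KEY_HERE \
  -e GLEAN_DOMAIN=YOUR_DOMAIN_HERE \
  glean-server
If the container starts and waits on stdin without printing an error, the image and credentials are wired up correctly; press Ctrl+C to stop it.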
Securing API Keys:
Environment variables are used to securely pass your API credentials:
"env": {
"GLEAN_API_KEY": "YOUR_API_KEY_HERE",
"GLEAN_DOMAIN": "YOUR_DOMAIN_HERE"
}
Cursor
No setup instructions for Cursor are provided in the repository.
Cline
No setup instructions for Cline are provided in the repository.
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "glean": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change “glean” to the actual name of your MCP server and replace the URL with your own MCP server URL, as in the example below.
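For example, if you registered the server under the name glean-server, the block might look like the following; both the key and the URL here are placeholders, not values taken from the repository:
{
  "glean-server": {
    "transport": "streamable_http",
    "url": "https://mcp.example.com/glean/mcp"
  }
}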
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates documented |
| List of Resources | ⛔ | No explicit resources found |
| List of Tools | ✅ | Search, Chat |
| Securing API Keys | ✅ | Docker/JSON env example provided |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Roots support is not mentioned anywhere in the repository.
Based on the above, the Glean MCP Server provides a minimal, functional MCP integration for the Glean API, offering clear tool interfaces but lacking detailed documentation for prompts, resources, and some platform setups. Its setup for Claude is straightforward, but support for other platforms and advanced features (roots, sampling) is missing.
Our opinion
The Glean MCP Server is a simple and focused MCP server implementation providing two essential tools—search and chat—integrated with the Glean API. While it is well-licensed and easy to set up for Claude, the lack of documentation for resources, prompts, and broader platform support limits its overall utility. Its straightforward Docker-based deployment and secure API key management are strong points. Overall, the repository’s completeness and usefulness rate a 5 out of 10.
MCP Score
| Has a LICENSE | ✅ (MIT) |
| --- | --- |
| Has at least one tool | ✅ |
| Number of Forks | 5 |
| Number of Stars | 6 |
Frequently asked questions
- What is the Glean MCP Server?
The Glean MCP Server is an implementation of the Model Context Protocol (MCP) that connects AI agents to the Glean API, enabling advanced enterprise search and conversational Q&A over your organization’s knowledge base.
- What tools does this MCP provide?
It offers two tools: 'Search' to fetch results from Glean and 'Chat' for a Q&A interface powered by the Glean chatbot.
- What are common use cases for integrating Glean MCP?
Enterprise knowledge search, automated Q&A, developer workflow integration, and support automation—making it easier for AI agents to access and surface relevant organizational knowledge.
- How do I set up the Glean MCP Server for Claude?
Build the Docker image, update your configuration (as shown above), and securely provide your Glean API credentials via environment variables.
- Does this MCP Server support other platforms?
Explicit setup instructions are only provided for Claude. Other platforms like Windsurf, Cursor, and Cline are not documented in the repository.
- How secure is the integration?
API keys and credentials are securely managed using environment variables passed to the Docker container.
- Is there support for custom prompt templates or resources?
No, the repository does not document prompt templates or explicit resources. The integration is focused on its two main tools.
- What is the repository’s license and popularity?
It is MIT licensed and has 5 forks and 6 stars at the time of evaluation.
Empower Your AI with Glean MCP Integration
Unlock advanced enterprise search and automated Q&A for your AI agents by connecting FlowHunt with the Glean MCP Server.