
DocsMCP gives LLMs and developers instant access to technical documentation from local and remote sources, streamlining code help and API reference retrieval within FlowHunt or any MCP-ready environment.
DocsMCP is a Model Context Protocol (MCP) server designed to provide documentation access to Large Language Models (LLMs). By connecting to both local and remote documentation sources, DocsMCP enables LLMs to fetch, parse, and query documentation in real time. This enhances AI assistants and developer workflows by allowing seamless retrieval of technical references, guides, and API docs relevant to the current task or code context. DocsMCP acts as an intermediary, translating requests from LLMs into actionable searches or fetches against documentation resources, thereby enabling tasks such as searching for function usage, exploring library docs, or integrating contextual help directly into development environments.
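As a rough illustration of this flow, the sketch below uses the official MCP TypeScript SDK to spawn DocsMCP over stdio (the same npx invocation as the Cursor example further down), list its tools, and call getDocumentation. The tool and parameter names come from this page; the client name and the rest of the snippet are illustrative assumptions, not code from the DocsMCP repository.

```typescript
// Minimal sketch: talk to DocsMCP from a script using the MCP TypeScript SDK.
// Assumes `npm install @modelcontextprotocol/sdk` and that `npx docsmcp` runs
// as in the .cursor/mcp.json example below; not taken from the DocsMCP repo.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the DocsMCP server as a child process over stdio.
  const transport = new StdioClientTransport({
    command: "npx",
    args: [
      "-y",
      "docsmcp",
      "--source=Model Context Protocol (MCP)|https://modelcontextprotocol.io/llms-full.txt",
    ],
  });

  const client = new Client({ name: "docsmcp-demo", version: "1.0.0" });
  await client.connect(transport);

  // Should report the two tools described on this page.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name)); // e.g. getDocumentationSources, getDocumentation

  // Fetch and parse documentation from a configured source via its url parameter.
  const result = await client.callTool({
    name: "getDocumentation",
    arguments: { url: "https://modelcontextprotocol.io/llms-full.txt" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```

Note that each entry in the args array is passed to the process as a single argument with no shell in between, so the extra single quotes seen in the shell-style config example below are omitted here; that detail is worth double-checking against the docsmcp README.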
No prompt templates are mentioned in the repository.
No explicit MCP resources are documented in the repository.
url – The URL or file path to fetch the documentation from (the parameter of the getDocumentation tool).
No setup instructions provided for Windsurf.
No setup instructions provided for Claude.
Cursor: Ensure Node.js and npx are installed, then add the DocsMCP entry to your .cursor/mcp.json file. Example .cursor/mcp.json:
{
  "mcpServers": {
    "docs-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "docsmcp",
        "'--source=Model Context Protocol (MCP)|https://modelcontextprotocol.io/llms-full.txt'"
      ]
    }
  }
}
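Because DocsMCP can read both remote URLs and local file paths, a configuration mixing the two might look like the sketch below. This is an assumption-based example: the docsmcp README is not reproduced here, so whether --source can be repeated and the exact "Name|URL-or-path" syntax for local files should be verified against the repository; the quoting mirrors the example above, and the local path is hypothetical.

```json
{
  "mcpServers": {
    "docs-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "docsmcp",
        "'--source=Model Context Protocol (MCP)|https://modelcontextprotocol.io/llms-full.txt'",
        "'--source=Internal API docs|./docs/api-reference.md'"
      ]
    }
  }
}
```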
Securing API Keys: No API keys or env configuration shown in the repository.
No setup instructions provided for Cline.
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "docs-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change "docs-mcp" to the actual name of your MCP server and to replace the URL with your own MCP server URL.
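For completeness, here is a hedged sketch of how a client could reach a remotely hosted DocsMCP instance over the streamable HTTP transport used in the FlowHunt configuration above. The URL is the placeholder from that example, and the SDK usage is illustrative rather than taken from FlowHunt or the DocsMCP repository.

```typescript
// Hedged sketch (ES module, top-level await): verify a remote MCP endpoint that
// speaks the streamable HTTP transport, such as a hosted DocsMCP instance.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Placeholder URL from the FlowHunt configuration above.
const transport = new StreamableHTTPClientTransport(
  new URL("https://yourmcpserver.example/pathtothemcp/url")
);

const client = new Client({ name: "flowhunt-docs-check", version: "1.0.0" });
await client.connect(transport);

// Confirm the server is reachable and lists the expected DocsMCP tools.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();
```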
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | DocsMCP is a documentation server for LLMs via MCP. |
| List of Prompts | ⛔ | No prompt templates found. |
| List of Resources | ⛔ | No resources section in repo. |
| List of Tools | ✅ | getDocumentationSources, getDocumentation |
| Securing API Keys | ⛔ | No env or API key instructions present. |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned. |
DocsMCP is a focused and straightforward MCP server for documentation delivery to LLMs. It exposes essential tools for documentation fetching but lacks advanced features like resource exposure, prompt templates, or explicit security guidance. The absence of prompt and resource definitions, as well as missing setup guides for some platforms, limits its flexibility. It is open source under MIT, but community traction is minimal. Overall, it is a useful but basic MCP server.
| Has a LICENSE | ✅ (MIT) |
| --- | --- |
| Has at least one tool | ✅ |
| Number of Forks | 0 |
| Number of Stars | 1 |
Rating: 4/10 – DocsMCP provides core MCP server utility for documentation, but lacks advanced or flexible MCP features, prompt templates, and broader platform support. Its simplicity is a strength for focused use cases but a limitation for extensibility.
DocsMCP is an MCP server that provides Large Language Models with real-time access to local and remote documentation sources, enabling code help, API lookups, and technical references directly within development environments.
DocsMCP exposes two main tools: 'getDocumentationSources' (to list all configured documentation sources) and 'getDocumentation' (to fetch and parse documentation from any specified URL or local file path).
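For reference, the raw JSON-RPC 2.0 requests an MCP client sends for these two tools would look roughly like the following. The tools/call framing comes from the MCP specification; the tool and parameter names are from this page, the id values and example URL are arbitrary, and getDocumentationSources is assumed to take no arguments.

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/call",
  "params": { "name": "getDocumentationSources", "arguments": {} } }

{ "jsonrpc": "2.0", "id": 2, "method": "tools/call",
  "params": { "name": "getDocumentation",
              "arguments": { "url": "https://modelcontextprotocol.io/llms-full.txt" } } }
```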
DocsMCP is ideal for instant documentation lookup, contextual code help, automated API reference retrieval, and integrating both internal and external docs into a unified MCP interface for LLMs and developers.
In FlowHunt, you can add the MCP component and configure it to point to your DocsMCP server. This allows your AI agents to use all DocsMCP tools within your automated workflows for seamless documentation access.
No, DocsMCP does not include prompt templates or explicit resource definitions. It is a focused MCP server for documentation retrieval with a simple toolset.
Yes, DocsMCP is MIT licensed and open source, though it currently has minimal community traction.
No API keys or specific security configurations are described in the current DocsMCP documentation.
Empower your AI agents with real-time documentation access. Integrate DocsMCP into your FlowHunt workflow for instant, context-aware developer assistance and seamless code help.