
mcp-local-rag MCP Server
A simple, local, and privacy-preserving web search MCP server for real-time data access and Retrieval-Augmented Generation in FlowHunt and other AI workflows.
The mcp-local-rag MCP Server is a “primitive” Retrieval-Augmented Generation (RAG)-like web search Model Context Protocol (MCP) server that runs locally without requiring external APIs. Its main function is to connect AI assistants with the web as a data source, allowing large language models (LLMs) to execute web searches, fetch and embed search results, and extract relevant content—all within a privacy-respecting, local environment. The server orchestrates the process by submitting user queries to a search engine (DuckDuckGo), fetching multiple results, ranking them based on similarity using Google’s MediaPipe Text Embedder, and extracting relevant context from web pages. This enables developers and AI clients to access up-to-date web information, which can enhance workflows such as research, content creation, and question answering without relying on proprietary web APIs.
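The ranking step described above can be sketched in a few lines of Python. This is an illustrative toy only: a bag-of-words embedder stands in for Google's MediaPipe Text Embedder, and the snippets are hard-coded rather than fetched from DuckDuckGo, but the shape of the pipeline (embed the query, embed each result, sort by cosine similarity) is the same.

```python
# Toy sketch of the mcp-local-rag ranking step: embed the query and
# each fetched snippet, then sort snippets by cosine similarity.
# A bag-of-words embedder stands in for MediaPipe's Text Embedder.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: lower-cased word counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_results(query: str, snippets: list[str]) -> list[str]:
    q = embed(query)
    return sorted(snippets, key=lambda s: cosine(q, embed(s)), reverse=True)

results = rank_results(
    "python asyncio tutorial",
    ["Guide to asyncio in Python", "Cooking pasta at home", "Python tutorial basics"],
)
```

In the real server, the top-ranked pages are then fetched and their relevant passages extracted as context for the LLM.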
No specific prompt templates are mentioned in the repository or the documentation.
No explicit MCP “resources” are described in the available repository content.
No detailed tool definitions are directly listed in the available files or documentation.
Below are the general setup instructions for integrating the mcp-local-rag MCP Server with various MCP clients. Please adapt the configuration JSON as needed for your specific client.
Insert the server entry into the mcpServers object of your client configuration:

{
  "mcpServers": {
    "mcp-local-rag": {
      "command": "uvx",
      "args": [
        "--python=3.10",
        "--from",
        "git+https://github.com/nkapila6/mcp-local-rag",
        "mcp-local-rag"
      ]
    }
  }
}
To run the server in Docker instead, use this configuration:

{
  "mcpServers": {
    "mcp-local-rag": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "--init",
        "-e",
        "DOCKER_CONTAINER=true",
        "ghcr.io/nkapila6/mcp-local-rag:latest"
      ]
    }
  }
}
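Under the hood, an MCP client that receives a configuration like the one above simply spawns the configured command with its args and speaks JSON-RPC over the child process's stdin/stdout. The sketch below only assembles the argv list from the Docker config (no container is actually started); the subprocess call in the comment is how a real client would launch it.

```python
# Sketch: an MCP client turns the "command" and "args" fields of the
# configuration into an argv list for a stdio subprocess. Nothing is
# executed here; we only build the list a real client would pass to
# subprocess.Popen(argv, stdin=PIPE, stdout=PIPE).
config = {
    "command": "docker",
    "args": [
        "run", "--rm", "-i", "--init",
        "-e", "DOCKER_CONTAINER=true",
        "ghcr.io/nkapila6/mcp-local-rag:latest",
    ],
}

argv = [config["command"], *config["args"]]
```

The `-i` flag keeps stdin open, which is what lets the client and the containerized server exchange MCP messages over standard streams.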
No external API keys are required for mcp-local-rag, but if you need to set environment variables (for Docker or other purposes), use the env object in your configuration:
{
  "mcpServers": {
    "mcp-local-rag": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "--init",
        "-e",
        "DOCKER_CONTAINER=true",
        "ghcr.io/nkapila6/mcp-local-rag:latest"
      ],
      "env": {
        "EXAMPLE_ENV_VAR": "value"
      },
      "inputs": {}
    }
  }
}
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "mcp-local-rag": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP server as a tool, with access to all of its functions and capabilities. Remember to change “mcp-local-rag” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
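The streamable_http transport is, at bottom, JSON-RPC 2.0 messages POSTed to the server URL. The sketch below assembles such a request without sending it; the method name, URL, and omitted handshake details are placeholders here, and the full protocol is defined by the MCP specification.

```python
# Hedged sketch of the "streamable_http" transport: an MCP message is a
# JSON-RPC 2.0 envelope POSTed to the configured server URL. We build
# the request but do not send it (there is no real server behind the
# placeholder URL).
import json
import urllib.request

def build_mcp_request(method: str, params: dict, req_id: int = 1) -> bytes:
    # MCP messages use JSON-RPC 2.0 framing.
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    }).encode("utf-8")

def post_to_server(url: str, body: bytes) -> urllib.request.Request:
    # Construct (but do not issue) the HTTP POST.
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

body = build_mcp_request("tools/list", {})
req = post_to_server("https://yourmcpserver.example/pathtothemcp/url", body)
```

In practice FlowHunt handles this exchange for you once the transport and URL are configured; the sketch only shows what travels over the wire.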
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | None found |
| List of Resources | ⛔ | None found |
| List of Tools | ⛔ | None found |
| Securing API Keys | ✅ | Example with env shown |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Overall, mcp-local-rag is a straightforward, privacy-respecting MCP server for web search, though its documentation lacks prompt templates, resource definitions, and tool specifications. It is easy to set up with major MCP clients and is best suited for simple web RAG use cases.
| Has a LICENSE | ✅ (MIT) |
| --- | --- |
| Has at least one tool | ⛔ |
| Number of Forks | 12 |
| Number of Stars | 48 |
It is a local, privacy-preserving web search MCP server for Retrieval-Augmented Generation (RAG). It connects LLMs to the web, fetches and embeds search results, and extracts relevant content without requiring external APIs or cloud dependencies.
Use cases include real-time web search for LLMs, content summarization, retrieval-augmented generation, developer productivity (e.g., searching documentation), and education (fetching fresh learning materials).
No external API keys are needed. It runs locally and uses DuckDuckGo for search, so your queries remain private and no paid API access is required.
Add the MCP component to your FlowHunt flow, open its configuration, and enter your MCP server details using the recommended JSON format. See setup instructions above for examples.
No explicit prompt templates, resources, or tools are defined in the documentation. The server is designed for straightforward web search and context retrieval.
Boost your AI's capabilities with private, real-time web search using mcp-local-rag. No external APIs or keys required.