Milvus MCP Server Integration
Connect LLMs and AI agents to Milvus for powerful vector search, contextual memory, and data-driven recommendations directly in your FlowHunt workflows.

What does the Milvus MCP Server do?
The Milvus MCP (Model Context Protocol) Server connects AI assistants and LLM-powered applications with the Milvus vector database. It enables seamless interaction between language models and large-scale vector data, providing a standardized way to access, query, and manage Milvus from within AI workflows. Using the Milvus MCP Server, developers can integrate Milvus-based search, retrieval, and data management capabilities directly into their AI agents, IDEs, or chat interfaces.

The server supports multiple communication modes (stdio and Server-Sent Events), allowing it to fit diverse deployment scenarios and development environments. By bridging LLMs and Milvus, it greatly enhances the ability of AI systems to perform context-aware operations on high-dimensional data, unlocking richer and more intelligent LLM-powered experiences.
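In stdio mode, an MCP client and server exchange JSON-RPC 2.0 messages, starting with an `initialize` handshake. A minimal sketch of building that first message in Python — the method and field names come from the MCP specification, not from this server's code, and the client name is a made-up placeholder:

```python
import json

def make_initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 `initialize` request an MCP client sends
    first over stdio. Field names follow the MCP specification; the
    clientInfo values here are illustrative placeholders."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    }
    return json.dumps(msg)

print(make_initialize_request())
```

After the handshake, the client would discover whatever tools the server exposes via `tools/list`; as noted below, those are not documented for this server.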
List of Prompts
No information about prompt templates is provided in the repository.
List of Resources
No explicit list of Model Context Protocol “resources” is described in the available documentation or code.
List of Tools
No explicit tool list or function names are documented in the available documentation or code files, including server.py.
Use Cases of this MCP Server
- Vector Search Integration: Enables developers to use LLMs to query and retrieve relevant documents or data points from Milvus, enhancing contextual search in AI applications.
- Embedding Management: Allows LLMs and agents to store and manage vector embeddings within Milvus, supporting advanced semantic search workflows.
- Chatbot Contextual Memory: Facilitates chatbots or AI assistants in maintaining long-term memory by storing conversational data as vectors in Milvus for later retrieval.
- Data Analysis and Recommendation: Powers AI-driven recommendation systems by allowing LLMs to perform similarity searches over large datasets stored in Milvus.
- Real-time Data Access: Supports AI agents that require real-time access to high-dimensional data for analytics, pattern recognition, or anomaly detection.
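All of the use cases above reduce to nearest-neighbour lookup over embedding vectors. A toy pure-Python sketch of the idea — Milvus does this at scale with approximate-nearest-neighbour indexes; the documents and vectors here are made up for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query, corpus, k=2):
    """Return the k corpus entries most similar to the query vector."""
    scored = [(cosine_similarity(query, vec), doc) for doc, vec in corpus.items()]
    return [doc for _, doc in sorted(scored, reverse=True)[:k]]

# Toy embeddings; real systems use model-generated, high-dimensional vectors.
corpus = {
    "doc-a": [0.9, 0.1, 0.0],
    "doc-b": [0.1, 0.9, 0.0],
    "doc-c": [0.8, 0.2, 0.1],
}
print(top_k([1.0, 0.0, 0.0], corpus))  # ['doc-a', 'doc-c']
```

Whether it is document retrieval, chatbot memory, or recommendations, the pattern is the same: embed the query, then return the stored vectors closest to it.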
How to set it up
Windsurf
- Ensure you have Python 3.10+ and a running Milvus instance.
- Clone the repository:
git clone https://github.com/zilliztech/mcp-server-milvus.git
- Run the server:
uv run src/mcp_server_milvus/server.py --milvus-uri http://localhost:19530
- Add the MCP server to your Windsurf configuration:
{
  "mcpServers": {
    "milvus-mcp": {
      "command": "uv",
      "args": ["run", "src/mcp_server_milvus/server.py", "--milvus-uri", "http://localhost:19530"]
    }
  }
}
- Save and restart Windsurf. Verify connection in the interface.
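If the connection fails, it helps to sanity-check the `--milvus-uri` value first. A small stdlib-only sketch that splits the URI into host and port — the fallback of 19530, Milvus's default gRPC port, is an assumption for URIs that omit the port:

```python
from urllib.parse import urlparse

def parse_milvus_uri(uri: str):
    """Split a Milvus URI like http://localhost:19530 into (host, port).
    Falls back to 19530, the default Milvus gRPC port, if none is given."""
    parsed = urlparse(uri)
    return parsed.hostname, parsed.port or 19530

print(parse_milvus_uri("http://localhost:19530"))  # ('localhost', 19530)
```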
Securing API keys:
If the server requires sensitive info, use environment variables:
{
  "env": {
    "MILVUS_URI": "http://localhost:19530"
  },
  "inputs": {}
}
Claude
- Install prerequisites: Python 3.10+, Milvus, and uv.
- Clone and start the server as described above.
- In Claude’s settings, add the MCP server with:
{
  "mcpServers": {
    "milvus-mcp": {
      "command": "uv",
      "args": ["run", "src/mcp_server_milvus/server.py", "--milvus-uri", "http://localhost:19530"]
    }
  }
}
- Save and restart Claude. Confirm Milvus MCP appears in available tools.
Secure credentials via environment variables as above.
Cursor
- Install Python 3.10+ and Milvus, plus uv.
- Clone the repo and run:
uv run src/mcp_server_milvus/server.py --milvus-uri http://localhost:19530
- In Cursor’s configuration, add:
{
  "mcpServers": {
    "milvus-mcp": {
      "command": "uv",
      "args": ["run", "src/mcp_server_milvus/server.py", "--milvus-uri", "http://localhost:19530"]
    }
  }
}
- Restart Cursor and verify setup.
Securing API keys:
Use environment variables as shown above.
Cline
- Prerequisites: Python 3.10+, Milvus, and uv.
- Clone the repository and start the server.
- Edit Cline’s config to add:
{
  "mcpServers": {
    "milvus-mcp": {
      "command": "uv",
      "args": ["run", "src/mcp_server_milvus/server.py", "--milvus-uri", "http://localhost:19530"]
    }
  }
}
- Save changes and restart Cline.
Environment variables:
{
  "env": {
    "MILVUS_URI": "http://localhost:19530"
  }
}
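On the server side, the environment-variable pattern used in these configs typically looks like this in Python. The `MILVUS_URI` name comes from the examples above; the hard-coded fallback default is an assumption, not this server's documented behaviour:

```python
import os

def get_milvus_uri(default: str = "http://localhost:19530") -> str:
    """Read the Milvus connection string from the environment so it
    never has to be hard-coded in config files or committed to source."""
    return os.environ.get("MILVUS_URI", default)

# Simulate what a client's "env" block does before launching the server.
os.environ["MILVUS_URI"] = "http://milvus.internal:19530"
print(get_milvus_uri())  # http://milvus.internal:19530
```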
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "milvus-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “milvus-mcp” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
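A quick way to catch typos before pasting the snippet above is to validate its basic shape. A hedged sketch — the required keys mirror the example above, not a formal FlowHunt schema:

```python
import json

def validate_mcp_config(raw: str) -> list:
    """Return a list of problems found in a FlowHunt-style MCP config
    snippet; an empty list means the basic shape looks right."""
    try:
        cfg = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    problems = []
    for name, server in cfg.items():
        if "transport" not in server:
            problems.append(f"{name}: missing 'transport'")
        if "url" not in server:
            problems.append(f"{name}: missing 'url'")
    return problems

snippet = '{"milvus-mcp": {"transport": "streamable_http", "url": "https://yourmcpserver.example/pathtothemcp/url"}}'
print(validate_mcp_config(snippet))  # []
```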
Overview
Section | Availability | Details/Notes |
---|---|---|
Overview | ✅ | |
List of Prompts | ⛔ | No prompt templates documented |
List of Resources | ⛔ | No explicit MCP resource list |
List of Tools | ⛔ | No explicit tools listed in available files |
Securing API Keys | ✅ | Uses environment variables, documented in setup examples |
Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Roots support: Not mentioned
Sampling support: Not mentioned
Our opinion
The Milvus MCP Server is a practical and focused bridge for connecting LLMs with Milvus, with clear setup guides for popular dev tools. However, its documentation lacks detail on MCP resources, prompts, and actionable tool APIs, which limits out-of-the-box discoverability. Still, it’s a solid foundation for vector-based AI integrations.
MCP Score
Has a LICENSE | ✅ (Apache-2.0) |
---|---|
Has at least one tool | ⛔ |
Number of Forks | 32 |
Number of Stars | 139 |
Overall: 4/10
The server is useful for its niche but would benefit greatly from more explicit documentation on resources, prompt templates, and tool APIs for maximal interoperability and ease of use.
Frequently asked questions
- What is the Milvus MCP Server?
The Milvus MCP Server bridges AI assistants and LLM applications with the Milvus vector database, enabling seamless vector search, contextual memory, and data management for advanced AI workflows.
- What are common use cases for integrating Milvus MCP Server?
Key use cases include vector search, embedding management, contextual chatbot memory, AI-powered recommendations, and real-time data analysis using Milvus within FlowHunt.
- How do I secure my Milvus MCP Server setup?
Use environment variables (e.g., MILVUS_URI) to store sensitive connection info, as shown in the setup guides for each supported client.
- Does Milvus MCP Server provide prompt templates or tool APIs?
No explicit prompt templates or tool APIs are documented. The server focuses on providing a bridge for vector operations and embedding management.
- What is the overall evaluation of the Milvus MCP Server?
It is a solid foundation for connecting LLMs to vector databases, with clear setup instructions, but would benefit from more documentation on prompt and tool APIs for easier discoverability and integration.
Supercharge FlowHunt with Milvus MCP
Enhance your AI agents with seamless access to vector databases, enabling smarter search, recommendations, and contextual memory. Integrate the Milvus MCP Server with FlowHunt now!