VertexAI Search MCP Server
Easily integrate Google Vertex AI Search with your AI agents to enable reliable, grounded search over private datasets with the VertexAI Search MCP Server.

What does “VertexAI Search” MCP Server do?
The VertexAI Search MCP Server connects AI assistants to Google Vertex AI Search, enabling them to search and retrieve information from private datasets stored in Vertex AI Datastore. By leveraging Gemini with Vertex AI grounding, the server anchors AI responses in your proprietary data, improving the quality and accuracy of search results. It supports one or multiple Vertex AI data stores, making it a powerful tool for augmenting LLM-driven workflows with contextually relevant, organization-specific information. Developers can use it to automate document search and knowledge base queries, and to streamline access to enterprise data in both development and production environments.
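For context, the snippet below is a minimal sketch of the mechanism this server wraps: asking Gemini a question grounded in a Vertex AI Search data store via the vertexai Python SDK. It is illustrative rather than the server's own code, and the project, location, model name, and data store ID are placeholder assumptions.

import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

# Placeholder project/location -- replace with your own GCP settings.
vertexai.init(project="your-gcp-project", location="us-central1")

# Full resource name of the Vertex AI Search data store (placeholder ID).
datastore = (
    "projects/your-gcp-project/locations/global/"
    "collections/default_collection/dataStores/your-datastore-id"
)

# Wrap the data store as a retrieval tool Gemini can ground against.
search_tool = Tool.from_retrieval(
    grounding.Retrieval(grounding.VertexAISearch(datastore=datastore))
)

# Any Gemini model available in your project works here; this one is an example.
model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "What does our internal handbook say about expense approvals?",
    tools=[search_tool],
)
print(response.text)

Grounded responses also carry grounding metadata that links answers back to the source documents in your data store, which is what makes them verifiable against your own data.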
List of Prompts
No prompt templates are mentioned in the repository.
List of Resources
No specific resources are detailed in the repository.
List of Tools
No explicit list of tools is provided in the repository or in server.py.
Use Cases of this MCP Server
- Enterprise Search Automation: Integrate Vertex AI Search into workflows to automate querying and retrieval of documents from private datasets, streamlining internal information access.
- Knowledge Base Augmentation: Enhance AI assistants with the ability to answer user queries grounded in organization-specific knowledge, improving response accuracy.
- Data-Driven Decision Making: Enable developers to surface relevant data from Vertex AI Datastores during application development, supporting evidence-based decisions.
- Custom AI Assistant Development: Build domain-specific AI agents capable of searching and contextualizing responses using curated Vertex AI data stores.
How to set it up
Windsurf
- Ensure Python and Docker are installed on your system.
- Clone the repository:
git clone git@github.com:ubie-oss/mcp-vertexai-search.git
- Create a virtual environment and install dependencies:
uv venv
uv sync --all-extras
- Add the MCP server configuration in the Windsurf configuration file as follows:
{ "mcpServers": { "vertexai-search": { "command": "uv", "args": ["run", "mcp-vertexai-search"] } } }
- Save and restart Windsurf, then verify the MCP server is running.
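As a quick sanity check before restarting Windsurf, you can run the same command the configuration above invokes and confirm the server starts without errors:

uv run mcp-vertexai-search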
Securing API Keys Example:
{
  "mcpServers": {
    "vertexai-search": {
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/credentials.json"
      },
      "inputs": {}
    }
  }
}
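The credentials file referenced above is a Google Cloud service account key. If you don't have one yet, a typical way to create it with the gcloud CLI looks like this (the service account name and project are placeholders):

# Create a key for an existing service account (placeholder account/project).
gcloud iam service-accounts keys create /path/to/your/credentials.json \
  --iam-account=mcp-search@your-gcp-project.iam.gserviceaccount.com

# Point the server at the key for local testing.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/credentials.json

The same env block works for the Claude, Cursor, and Cline examples below.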
Claude
- Ensure a working Python environment and the required dependencies are installed.
- Clone and set up the repository as above.
- Edit the Claude configuration to add the MCP server:
{ "mcpServers": { "vertexai-search": { "command": "uv", "args": ["run", "mcp-vertexai-search"] } } }
- Restart Claude and check server status.
Securing API Keys Example:
{
  "mcpServers": {
    "vertexai-search": {
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/credentials.json"
      },
      "inputs": {}
    }
  }
}
Cursor
- Install prerequisites and set up the repository as outlined above.
- Update the Cursor configuration file:
{ "mcpServers": { "vertexai-search": { "command": "uv", "args": ["run", "mcp-vertexai-search"] } } }
- Save, restart Cursor, and verify operation.
Securing API Keys Example:
{
  "mcpServers": {
    "vertexai-search": {
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/credentials.json"
      },
      "inputs": {}
    }
  }
}
Cline
- Follow the steps for repository setup as above.
- Modify the Cline configuration:
{ "mcpServers": { "vertexai-search": { "command": "uv", "args": ["run", "mcp-vertexai-search"] } } }
- Restart Cline and confirm the server is active.
Securing API Keys Example:
{
  "mcpServers": {
    "vertexai-search": {
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/credentials.json"
      },
      "inputs": {}
    }
  }
}
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "vertexai-search": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change “vertexai-search” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
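Before wiring the server into a flow, it can be useful to confirm the streamable_http endpoint responds. The sketch below uses the official MCP Python SDK (installable with pip install mcp); the URL is the same placeholder as above, and this is a generic client check, not FlowHunt code.

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Placeholder URL -- use your own MCP server endpoint.
    url = "https://yourmcpserver.example/pathtothemcp/url"
    async with streamablehttp_client(url) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()  # MCP handshake
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])


asyncio.run(main())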
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Present in README.md |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | No explicit resources detailed |
| List of Tools | ⛔ | No explicit tools listed |
| Securing API Keys | ✅ | Configuration examples provided |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on the completeness of documentation and feature exposure, this MCP server provides a solid integration for Vertex AI Search but lacks detailed documentation on prompts, resources, and tools. The setup instructions and licensing are clear, but advanced MCP features are not discussed. Rating: 5/10
MCP Score
| Has a LICENSE | ✅ (Apache-2.0) |
| --- | --- |
| Has at least one tool | ⛔ |
| Number of Forks | 9 |
| Number of Stars | 18 |
Frequently asked questions
- What is the VertexAI Search MCP Server?
The VertexAI Search MCP Server connects AI assistants with Google Vertex AI Search, allowing them to search and retrieve information from private datasets in Vertex AI Datastore. It grounds AI responses in your organization’s data for improved accuracy and context.
- What are typical use cases?
Use cases include automating enterprise document search, augmenting knowledge bases, enabling data-driven development, and building custom AI assistants that leverage proprietary datasets.
- How do I secure my API credentials?
Set the GOOGLE_APPLICATION_CREDENTIALS environment variable in your MCP configuration, pointing to your Google Cloud service account credentials JSON file. Example configurations are provided for each supported client.
- Can I use multiple Vertex AI Datastores?
Yes, the server supports integration with one or multiple Vertex AI Datastores, letting you query across various private datasets as needed.
- Where can I see the MCP server in action within FlowHunt?
Add the MCP component to your flow, configure it with your server’s details, and connect it to your AI agent. The agent can then access all the functions provided by the VertexAI Search MCP Server.
Try VertexAI Search MCP Server on FlowHunt
Supercharge your AI agents with private dataset search and grounded responses. Integrate VertexAI Search MCP Server in just a few steps.