
Easily integrate Google Vertex AI Search with your AI agents to enable reliable, grounded search over private datasets with the VertexAI Search MCP Server.
The VertexAI Search MCP Server connects AI assistants with Google Vertex AI Search, enabling them to search and retrieve information from private datasets stored in a Vertex AI Datastore. By leveraging Gemini with Vertex AI grounding, the server improves the quality and accuracy of search results by grounding AI responses in your proprietary data. It supports one or multiple Vertex AI data stores, making it a powerful tool for augmenting LLM-driven workflows with contextually relevant, organization-specific information. This lets developers automate document search and knowledge-base queries, and streamlines access to enterprise data in both development and production environments.
No prompt templates are mentioned in the repository.
No specific resources are detailed in the repository.
No explicit list of tools is provided in the repository or in server.py.
git clone git@github.com:ubie-oss/mcp-vertexai-search.git
uv venv
uv sync --all-extras
{
  "mcpServers": {
    "vertexai-search": {
      "command": "uv",
      "args": ["run", "mcp-vertexai-search"]
    }
  }
}
Securing API Keys Example:
{
  "mcpServers": {
    "vertexai-search": {
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/credentials.json"
      },
      "inputs": {}
    }
  }
}
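The launch-command and credentials fragments above can be merged into a single client entry. A sketch, assuming your service-account file lives at the path shown:

```json
{
  "mcpServers": {
    "vertexai-search": {
      "command": "uv",
      "args": ["run", "mcp-vertexai-search"],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/credentials.json"
      }
    }
  }
}
```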
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "vertexai-search": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP server as a tool, with access to all of its functions and capabilities. Remember to change "vertexai-search" to the actual name of your MCP server and to replace the URL with your own MCP server URL.
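As a sanity check, the configuration JSON above can be generated programmatically instead of typed by hand. A minimal Python sketch; the helper name `flowhunt_mcp_config` is hypothetical and not part of any FlowHunt or MCP SDK:

```python
import json

def flowhunt_mcp_config(name: str, url: str) -> str:
    """Build the system MCP configuration JSON for a single
    streamable-HTTP MCP server (hypothetical helper)."""
    return json.dumps(
        {name: {"transport": "streamable_http", "url": url}},
        indent=2,
    )

print(flowhunt_mcp_config(
    "vertexai-search",
    "https://yourmcpserver.example/pathtothemcp/url",
))
```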
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | Present in README.md |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | No explicit resources detailed |
| List of Tools | ⛔ | No explicit tools listed |
| Securing API Keys | ✅ | Configuration examples provided |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on the completeness of documentation and feature exposure, this MCP server provides a solid integration for Vertex AI Search but lacks detailed documentation on prompts, resources, and tools. The setup instructions and licensing are clear, but advanced MCP features are not discussed. Rating: 5/10
| Has a LICENSE | ✅ (Apache-2.0) |
|---|---|
| Has at least one tool | ⛔ |
| Number of Forks | 9 |
| Number of Stars | 18 |
The VertexAI Search MCP Server connects AI assistants with Google Vertex AI Search, allowing them to search and retrieve information from private datasets in Vertex AI Datastore. It grounds AI responses in your organization’s data for improved accuracy and context.
Use cases include automating enterprise document search, augmenting knowledge bases, enabling data-driven development, and building custom AI assistants that leverage proprietary datasets.
Set the GOOGLE_APPLICATION_CREDENTIALS environment variable in your MCP configuration, pointing to your Google Cloud service account credentials JSON file. Example configurations are provided for each supported client.
Yes, the server supports integration with one or multiple Vertex AI Datastores, letting you query across various private datasets as needed.
Add the MCP component to your flow, configure it with your server’s details, and connect it to your AI agent. The agent can then access all the functions provided by the VertexAI Search MCP Server.
Supercharge your AI agents with private dataset search and grounded responses. Integrate VertexAI Search MCP Server in just a few steps.