AnalyticDB PostgreSQL MCP Server
Connect AI-driven workflows to AnalyticDB PostgreSQL for seamless schema exploration, automated SQL execution, and performance analytics with FlowHunt’s MCP integration.

What does “AnalyticDB PostgreSQL” MCP Server do?
The AnalyticDB PostgreSQL MCP Server acts as a universal bridge between AI assistants and AnalyticDB PostgreSQL databases. It enables seamless interaction by allowing AI agents to retrieve database metadata, execute SQL queries, and manage database operations programmatically. By providing standardized access to database functionalities, this MCP server facilitates tasks such as schema exploration, query execution, collecting table statistics, and analyzing query performance. This makes it an essential tool for developers and data engineers looking to integrate AI-driven workflows with robust, enterprise-ready PostgreSQL analytics databases.
List of Prompts
No prompt templates are mentioned in the provided repository or documentation.
List of Resources
- adbpg:///schemas: Retrieves all schemas present in the connected AnalyticDB PostgreSQL database.
- adbpg:///{schema}/tables: Lists all tables within a specified schema.
- adbpg:///{schema}/{table}/ddl: Provides the Data Definition Language (DDL) statement for a specific table.
- adbpg:///{schema}/{table}/statistics: Shows statistics for a given table, aiding in performance analysis and optimization.
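For illustration, an MCP client could read one of these resources with a standard resources/read request. This is a minimal sketch that assumes a schema named public exists; the exact JSON-RPC envelope depends on your client:

{
  "method": "resources/read",
  "params": {
    "uri": "adbpg:///public/tables"
  }
}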
List of Tools
- execute_select_sql: Executes SELECT SQL queries on the AnalyticDB PostgreSQL server to retrieve data.
- execute_dml_sql: Executes DML (Data Manipulation Language) operations such as INSERT, UPDATE, or DELETE.
- execute_ddl_sql: Executes DDL (Data Definition Language) operations such as CREATE, ALTER, or DROP.
- analyze_table: Collects statistics for a table to optimize database performance.
- explain_query: Provides the execution plan for a given SQL query, helping users understand and optimize query performance.
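As a minimal sketch of how an agent might invoke one of these tools, the tools/call request below runs a SELECT through execute_select_sql. The argument name query and the table public.orders are illustrative assumptions, not confirmed parameter names from the server's documentation:

{
  "method": "tools/call",
  "params": {
    "name": "execute_select_sql",
    "arguments": {
      "query": "SELECT customer_id, SUM(amount) AS total FROM public.orders GROUP BY customer_id ORDER BY total DESC LIMIT 10"
    }
  }
}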
Use Cases of this MCP Server
- Database Exploration and Metadata Retrieval: Developers can explore database schemas, list tables, and access table definitions, improving productivity and their understanding of data structures.
- Automated Query Execution: AI agents can execute SELECT and DML queries programmatically, enabling use cases such as report generation, data updates, and automated workflows.
- Schema Management and Evolution: The server can execute DDL queries, facilitating schema changes such as creating, modifying, or dropping tables as part of CI/CD pipelines.
- Performance Tuning: Tools like analyze_table and explain_query help developers gather statistics and execution plans, making it easier to identify bottlenecks and optimize queries.
- AI-driven Data Analysis: Integrated with AI assistants, the server supports context-aware data analysis, enabling intelligent data exploration and insight generation.
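For example, a performance tuning pass might first call analyze_table to refresh statistics and then request the plan of a slow query via explain_query, as sketched below. The query argument name and the tables referenced are assumptions for illustration only:

{
  "method": "tools/call",
  "params": {
    "name": "explain_query",
    "arguments": {
      "query": "SELECT o.customer_id, SUM(o.amount) FROM public.orders o JOIN public.customers c ON c.id = o.customer_id GROUP BY o.customer_id"
    }
  }
}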
How to set it up
Windsurf
- Prerequisites: Ensure Python 3.10+ and the required packages are installed.
- Clone or Install:
  - Clone: git clone https://github.com/aliyun/alibabacloud-adbpg-mcp-server.git
  - Or install with pip: pip install adbpg_mcp_server
- Edit Configuration: Open the Windsurf MCP client configuration file.
- Add MCP Server: Insert the following JSON:
  "mcpServers": {
    "adbpg-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/adbpg-mcp-server",
        "run",
        "adbpg-mcp-server"
      ],
      "env": {
        "ADBPG_HOST": "host",
        "ADBPG_PORT": "port",
        "ADBPG_USER": "username",
        "ADBPG_PASSWORD": "password",
        "ADBPG_DATABASE": "database"
      }
    }
  }
- Save & Restart: Save the file and restart Windsurf.
Claude
- Prerequisites: Python 3.10+ and dependencies installed.
- Install Server: pip install adbpg_mcp_server
- Edit Configuration: Open Claude’s MCP configuration.
- Add MCP Server:
  "mcpServers": {
    "adbpg-mcp-server": {
      "command": "uvx",
      "args": ["adbpg_mcp_server"],
      "env": {
        "ADBPG_HOST": "host",
        "ADBPG_PORT": "port",
        "ADBPG_USER": "username",
        "ADBPG_PASSWORD": "password",
        "ADBPG_DATABASE": "database"
      }
    }
  }
- Save & Restart: Save the configuration and restart Claude.
Cursor
- Prerequisites: Ensure Python 3.10+ and dependencies are installed.
- Clone or Install: Clone the repository or run pip install adbpg_mcp_server.
- Edit Configuration: Open Cursor’s MCP configuration file.
- Add MCP Server:
  "mcpServers": {
    "adbpg-mcp-server": {
      "command": "uvx",
      "args": ["adbpg_mcp_server"],
      "env": {
        "ADBPG_HOST": "host",
        "ADBPG_PORT": "port",
        "ADBPG_USER": "username",
        "ADBPG_PASSWORD": "password",
        "ADBPG_DATABASE": "database"
      }
    }
  }
- Save & Restart: Save and restart Cursor.
Cline
- Prerequisites: Python 3.10+ and dependencies installed.
- Clone or Install: Use either Git or pip as above.
- Edit Configuration: Open the MCP configuration.
- Add MCP Server:
  "mcpServers": {
    "adbpg-mcp-server": {
      "command": "uvx",
      "args": ["adbpg_mcp_server"],
      "env": {
        "ADBPG_HOST": "host",
        "ADBPG_PORT": "port",
        "ADBPG_USER": "username",
        "ADBPG_PASSWORD": "password",
        "ADBPG_DATABASE": "database"
      }
    }
  }
- Save & Restart: Save the configuration and restart Cline.
Securing API Keys
Database credentials are supplied through environment variables. To enhance security, reference environment variables in the configuration instead of hardcoding sensitive values:
"env": {
"ADBPG_HOST": "${ADBPG_HOST}",
"ADBPG_PORT": "${ADBPG_PORT}",
"ADBPG_USER": "${ADBPG_USER}",
"ADBPG_PASSWORD": "${ADBPG_PASSWORD}",
"ADBPG_DATABASE": "${ADBPG_DATABASE}"
}
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "adbpg-mcp-server": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent is now able to use this MCP as a tool with access to all its functions and capabilities. Remember to change “adbpg-mcp-server” to the actual name of your MCP server and replace the URL with your own MCP server URL.
Overview
Section | Availability | Details/Notes |
---|---|---|
Overview | ✅ | |
List of Prompts | ⛔ | No prompt templates found |
List of Resources | ✅ | Schemas, tables, table DDL, table statistics |
List of Tools | ✅ | 5 tools: select, dml, ddl, analyze, explain |
Securing API Keys | ✅ | Environment variable pattern documented |
Roots Support | ⛔ | Not mentioned |
Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on the available documentation, the AnalyticDB PostgreSQL MCP Server offers solid integration for database-driven workflows, with clear tools and resource endpoints. However, it lacks prompt templates and explicit support for Roots and Sampling.
MCP Score
Has a LICENSE | ✅ (Apache-2.0) |
---|---|
Has at least one tool | ✅ |
Number of Forks | 0 |
Number of Stars | 4 |
Opinion & Rating:
This MCP server is well documented for its core database integration features and covers essential developer needs for PostgreSQL analytics. The absence of prompt templates and advanced MCP features like Roots or Sampling is a drawback, but its focus and clarity make it useful for database-oriented workflows. Rating: 7/10
Frequently asked questions
- What is the AnalyticDB PostgreSQL MCP Server?
This MCP server connects AI agents to AnalyticDB PostgreSQL databases, allowing programmatic access to schema metadata, SQL query execution, database management, and performance analysis.
- What tasks can I automate with this MCP server?
You can automate schema exploration, SQL (SELECT, DML, DDL) execution, statistics collection, query plan analysis, and schema evolution, supporting end-to-end analytics and data engineering workflows.
- How do I secure my database credentials?
Always use environment variables for sensitive data like host, user, and password. The MCP server supports environment variable configuration for secure credential management.
- Does it support advanced MCP features like Roots or Sampling?
No, according to the documentation, this MCP server does not provide explicit support for Roots or Sampling.
- Are there prompt templates included?
No, there are no built-in prompt templates documented for this MCP server. You may add your own as needed for your workflow.
- What are the main use cases?
Use cases include database exploration, automated reporting, schema management, query optimization, and AI-driven data analysis within enterprise-grade PostgreSQL analytics environments.
Integrate AnalyticDB PostgreSQL with FlowHunt
Empower your AI agents with robust, enterprise-ready PostgreSQL analytics. Set up the AnalyticDB PostgreSQL MCP Server with FlowHunt for seamless database automation and insights.