What does “AnalyticDB PostgreSQL” MCP Server do?
AnalyticDB PostgreSQL MCP Server acts as a universal interface between AI assistants and AnalyticDB PostgreSQL databases. This server enables AI agents to seamlessly communicate with AnalyticDB PostgreSQL, allowing them to retrieve database metadata and execute various SQL operations. By exposing database functionalities via the Model Context Protocol (MCP), it empowers AI models to perform tasks such as executing SELECT, DML, and DDL SQL queries, analyzing table statistics, and retrieving schema or table information. This greatly enhances development workflows by automating and streamlining tasks like database queries, schema exploration, and performance analysis from within AI-driven environments.
List of Prompts
No prompt templates are mentioned in the repository or documentation.
List of Resources
- adbpg:///schemas: Retrieve all schemas present in the connected AnalyticDB PostgreSQL database.
- adbpg:///{schema}/tables: List all tables within a specified schema.
- adbpg:///{schema}/{table}/ddl: Fetch the Data Definition Language (DDL) statement for a specific table.
- adbpg:///{schema}/{table}/statistics: Show detailed statistics for a particular table.
List of Tools
- execute_select_sql: Execute SELECT SQL queries on the AnalyticDB PostgreSQL server, enabling data retrieval.
- execute_dml_sql: Execute DML (INSERT, UPDATE, DELETE) SQL queries, allowing modification of database records.
- execute_ddl_sql: Execute DDL (CREATE, ALTER, DROP) SQL queries for managing database schema.
- analyze_table: Collect and update table statistics to optimize query planning.
- explain_query: Obtain the execution plan for a given SQL query to diagnose performance.
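Because the server splits execution into separate SELECT, DML, and DDL tools, a client (or agent scaffold) must pick the right tool for each statement. The sketch below shows one way to do that based on the leading SQL keyword; it is an illustration of the tool split above, not code shipped with the server:

```python
# Illustrative sketch: map a SQL statement's leading keyword to the
# matching adbpg-mcp-server tool name (SELECT vs. DML vs. DDL).

_TOOL_BY_VERB = {
    "SELECT": "execute_select_sql",
    "INSERT": "execute_dml_sql",
    "UPDATE": "execute_dml_sql",
    "DELETE": "execute_dml_sql",
    "CREATE": "execute_ddl_sql",
    "ALTER":  "execute_ddl_sql",
    "DROP":   "execute_ddl_sql",
}

def tool_for_sql(sql: str) -> str:
    """Return the tool name that should execute the given statement."""
    verb = sql.lstrip().split(None, 1)[0].upper()
    try:
        return _TOOL_BY_VERB[verb]
    except KeyError:
        raise ValueError(f"unsupported statement type: {verb!r}")
```

So `tool_for_sql("UPDATE t SET x = 1")` routes to `execute_dml_sql`, while `tool_for_sql("DROP TABLE t")` routes to `execute_ddl_sql`.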
Use Cases of this MCP Server
- AI-driven Database Queries: Enable AI agents to run SELECT or DML SQL commands, facilitating direct data retrieval or modification via natural language interfaces.
- Schema and Metadata Exploration: Allow AI models to fetch and list schemas, tables, and DDLs for efficient database structure exploration.
- Automated Table Analysis: Use the analyze_table tool to collect and update statistics, improving query optimization and performance tuning.
- Query Optimization Guidance: Utilize the explain_query tool to help developers or AI agents understand and optimize SQL queries.
- Integration in Data Workflows: Seamlessly incorporate database operations within larger automated workflows managed by AI or orchestration tools.
How to set it up
Windsurf
- Ensure Python 3.10+ is installed.
- Download or clone the repository:
git clone https://github.com/aliyun/alibabacloud-adbpg-mcp-server.git
- In your Windsurf configuration file, add the MCP server:
"mcpServers": {
"adbpg-mcp-server": {
"command": "uv",
"args": [
"--directory",
"/path/to/adbpg-mcp-server",
"run",
"adbpg-mcp-server"
],
"env": {
"ADBPG_HOST": "host",
"ADBPG_PORT": "port",
"ADBPG_USER": "username",
"ADBPG_PASSWORD": "password",
"ADBPG_DATABASE": "database"
}
}
}
- Save the configuration and restart Windsurf.
- Verify connection by ensuring the server responds to MCP requests.
Claude
- Install Python 3.10+ and required packages.
- Install via pip:
pip install adbpg_mcp_server
- Add the server to the Claude configuration as follows:
"mcpServers": {
"adbpg-mcp-server": {
"command": "uvx",
"args": [
"adbpg_mcp_server"
],
"env": {
"ADBPG_HOST": "host",
"ADBPG_PORT": "port",
"ADBPG_USER": "username",
"ADBPG_PASSWORD": "password",
"ADBPG_DATABASE": "database"
}
}
}
- Save the configuration and restart Claude.
- Confirm the MCP server is operational.
Cursor
- Set up Python 3.10+ and dependencies.
- Install via either the clone or the pip route (see above).
- Edit Cursor’s configuration file to include:
"mcpServers": {
"adbpg-mcp-server": {
"command": "uvx",
"args": [
"adbpg_mcp_server"
],
"env": {
"ADBPG_HOST": "host",
"ADBPG_PORT": "port",
"ADBPG_USER": "username",
"ADBPG_PASSWORD": "password",
"ADBPG_DATABASE": "database"
}
}
}
- Save, restart Cursor, and verify MCP server functionality.
Cline
- Make sure Python 3.10+ is ready and dependencies are installed.
- Clone or pip install the package.
- Update the Cline configuration as below:
"mcpServers": {
"adbpg-mcp-server": {
"command": "uvx",
"args": [
"adbpg_mcp_server"
],
"env": {
"ADBPG_HOST": "host",
"ADBPG_PORT": "port",
"ADBPG_USER": "username",
"ADBPG_PASSWORD": "password",
"ADBPG_DATABASE": "database"
}
}
}
- Save your changes and restart Cline.
- Check the connection to ensure the server is accessible.
Securing API Keys
Always store sensitive values such as database passwords in environment variables, not in plain-text configuration files. Example:
"env": {
"ADBPG_PASSWORD": "${ADBPG_PASSWORD_ENV}"
}
Configure your system environment variables accordingly for secure integration.
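One way to follow this advice on the launching side is to read every ADBPG_* setting from the process environment and fail fast when anything is missing, rather than hard-coding credentials. The helper below is an illustrative sketch assuming the five variables named in the configs above; it is not part of the adbpg_mcp_server package:

```python
# Hypothetical fail-fast loader for the ADBPG_* connection settings.
import os

REQUIRED_VARS = ("ADBPG_HOST", "ADBPG_PORT", "ADBPG_USER",
                 "ADBPG_PASSWORD", "ADBPG_DATABASE")

def load_adbpg_env(environ=os.environ) -> dict:
    """Collect connection settings, raising if any variable is unset."""
    missing = [name for name in REQUIRED_VARS if not environ.get(name)]
    if missing:
        raise RuntimeError(
            f"missing environment variables: {', '.join(missing)}")
    return {name: environ[name] for name in REQUIRED_VARS}
```

Failing at startup with a clear list of missing variables is easier to debug than a connection error surfacing later inside an AI workflow.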
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
"adbpg-mcp-server": {
"transport": "streamable_http",
"url": "https://yourmcpserver.example/pathtothemcp/url"
}
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “adbpg-mcp-server” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
Overview
Section | Availability | Details/Notes |
---|---|---|
Overview | ✅ | |
List of Prompts | ⛔ | No prompt templates listed |
List of Resources | ✅ | Built-in & template |
List of Tools | ✅ | 5 documented tools |
Securing API Keys | ✅ | Environment variables |
Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
A review of this MCP server shows that it has solid documentation for setup, resources, and tools, but lacks prompt templates and does not mention advanced features like Roots or Sampling. Its focus is clearly on database-centric workflows.
MCP Score
Has a LICENSE | ✅ (Apache-2.0) |
---|---|
Has at least one tool | ✅ |
Number of Forks | 0 |
Number of Stars | 4 |
Rating:
I would rate this MCP server a 7/10. It is well-documented for basic integration and database use cases, but scores lower due to the absence of prompt templates, advanced MCP features, and low community adoption (stars/forks). For database-focused AI workflows, it is a strong starting point.
Frequently asked questions
- What is the AnalyticDB PostgreSQL MCP Server?
It is a middleware that connects AI assistants to AnalyticDB PostgreSQL databases, enabling them to run SQL queries, manage schemas, analyze tables, and retrieve metadata through the Model Context Protocol (MCP).
- What operations can AI agents perform with this MCP server?
AI agents can execute SELECT, DML (INSERT/UPDATE/DELETE), and DDL (CREATE/ALTER/DROP) queries, analyze table statistics, fetch schema/table info, and obtain SQL execution plans for optimization.
- How is sensitive information secured?
Database credentials, especially passwords, should be stored in environment variables rather than plain-text configs, ensuring secure integration and preventing credential leaks.
- What are typical use cases for this server?
It’s ideal for automating database queries, exploring schemas, updating table statistics, and integrating database operations into AI-powered or automated workflows.
- Is prompt template support available?
No prompt templates are provided in the current documentation.
- What is the community adoption for this server?
As of now, the server has 0 forks and 4 stars on GitHub.
Integrate AnalyticDB PostgreSQL with FlowHunt
Boost your AI’s capabilities with direct, secure SQL execution and database exploration. Start using AnalyticDB PostgreSQL MCP Server in your flows today!