
Integrate Lark Bitable with FlowHunt using the Bitable MCP Server for effortless table discovery, schema analysis, and automated data queries within your AI-powered workflows.
The Bitable MCP Server provides seamless access to Lark Bitable, a collaborative spreadsheet and database platform, through the Model Context Protocol (MCP). This server enables AI assistants and developer tools to interact directly with Bitable tables using predefined tools. With Bitable MCP, users can automate database operations such as listing available tables, describing table schemas, and querying data using SQL-like statements. This MCP server streamlines workflows involving data extraction, management, and integration, making it easier to build intelligent assistants or automation pipelines that interact with structured data in Lark Bitable. Its integration with MCP also ensures compatibility with various AI platforms and development environments, enhancing productivity for developers and users working with data-driven applications.
No prompt templates are mentioned in the repository or documentation.
No explicit MCP resources are listed in the available documentation or code.
The server exposes three tools (a usage sketch follows below):

- list_table: Lists all tables available in the Bitable app.
- describe_table: Takes a name parameter (string) and returns a JSON-encoded list of columns in the table.
- read_query: Takes a sql parameter (string) and returns a JSON-encoded list of query results.

No setup instructions provided for Windsurf. Marked as “Coming soon” in the documentation.
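As a rough illustration of how these three tools can be invoked outside of Claude or Zed, the sketch below assumes the official MCP Python SDK (pip install mcp) and launches the server over stdio via uvx bitable-mcp, as in the configuration examples further down. The table name "Tasks" and the SQL statement are hypothetical placeholders, and the exact result payloads depend on the bitable-mcp implementation.

# Minimal sketch: calling the Bitable MCP tools with the MCP Python SDK (assumed dependency).
# Tokens, table name, and SQL below are placeholders, not values from the bitable-mcp docs.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=["bitable-mcp"],
    env={
        # Real tokens must be supplied; the variable names match the config examples below.
        "PERSONAL_BASE_TOKEN": os.environ.get("PERSONAL_BASE_TOKEN", "your-personal-base-token"),
        "APP_TOKEN": os.environ.get("APP_TOKEN", "your-app-token"),
    },
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List the tables in the Bitable app.
            tables = await session.call_tool("list_table", arguments={})
            print(tables.content)

            # Describe the schema of one table ("Tasks" is a hypothetical table name).
            schema = await session.call_tool("describe_table", arguments={"name": "Tasks"})
            print(schema.content)

            # Run a SQL-like query against that table.
            rows = await session.call_tool("read_query", arguments={"sql": "SELECT * FROM Tasks LIMIT 5"})
            print(rows.content)

asyncio.run(main())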
1. Ensure you have uvx installed.
2. Obtain your PERSONAL_BASE_TOKEN and APP_TOKEN from Lark Bitable.
3. Add the following to your Claude settings:
"mcpServers": {
"bitable-mcp": {
"command": "uvx",
"args": ["bitable-mcp"],
"env": {
"PERSONAL_BASE_TOKEN": "your-personal-base-token",
"APP_TOKEN": "your-app-token"
}
}
}
Alternatively, install via pip and update settings:
pip install bitable-mcp
"mcpServers": {
"bitable-mcp": {
"command": "python",
"args": ["-m", "bitable_mcp"],
"env": {
"PERSONAL_BASE_TOKEN": "your-personal-base-token",
"APP_TOKEN": "your-app-token"
}
}
}
Save your configuration and restart Claude.
Securing API Keys:
Store sensitive keys using the env field in your JSON config:

"env": {
  "PERSONAL_BASE_TOKEN": "your-personal-base-token",
  "APP_TOKEN": "your-app-token"
}
No setup instructions provided for Cursor. Marked as “Coming soon” in the documentation.
No setup instructions provided for Cline.
For Zed, add the following to your settings.json:
Using uvx:
"context_servers": [
"bitable-mcp": {
"command": "uvx",
"args": ["bitable-mcp"],
"env": {
"PERSONAL_BASE_TOKEN": "your-personal-base-token",
"APP_TOKEN": "your-app-token"
}
}
],
Using pip:
"context_servers": {
"bitable-mcp": {
"command": "python",
"args": ["-m", "bitable_mcp"],
"env": {
"PERSONAL_BASE_TOKEN": "your-personal-base-token",
"APP_TOKEN": "your-app-token"
}
}
},
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "bitable-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change "bitable-mcp" to the actual name of your MCP server and to replace the URL with your own MCP server URL.
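FlowHunt handles the connection itself once the component is configured, but if you want to test a streamable_http endpoint independently, a sketch along these lines (assuming the MCP Python SDK and using the same placeholder URL as the example above) shows what this transport looks like from a client's point of view:

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Placeholder URL, matching the example configuration above.
SERVER_URL = "https://yourmcpserver.example/pathtothemcp/url"

async def main() -> None:
    # streamablehttp_client yields read/write streams plus a session-id callback (unused here).
    async with streamablehttp_client(SERVER_URL) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())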
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | None mentioned |
| List of Resources | ⛔ | None mentioned |
| List of Tools | ✅ | list_table, describe_table, read_query |
| Securing API Keys | ✅ | Uses env in config |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
The Bitable MCP server is straightforward and focused, offering essential tools for database interaction (listing, schema, query). There is no evidence of prompt templates or explicit MCP resources, and setup is only fully documented for Claude and Zed. The repository is open but basic, with no clear sign of advanced MCP features like roots or sampling.
MCP Table rating: 5/10.
It covers the basics well and is usable, but lacks documentation depth, resources, prompts, and advanced MCP features.
| Has a LICENSE | ⛔ |
| --- | --- |
| Has at least one tool | ✅ |
| Number of Forks | 3 |
| Number of Stars | 2 |
The Bitable MCP Server provides direct access to Lark Bitable’s collaborative spreadsheet and database capabilities via the Model Context Protocol, allowing AI assistants and developer tools to list tables, explore schemas, and query data automatically.
The server supports three main tools: list_table (lists all tables in a workspace), describe_table (describes the schema for a given table), and read_query (executes SQL-like queries to extract data).
Use environment variables in your configuration (the 'env' section) to store sensitive keys like PERSONAL_BASE_TOKEN and APP_TOKEN. This helps keep credentials out of your source code.
Use cases include database table discovery, schema exploration, automated data extraction, AI-assisted data analysis, and workflow automation with tools like Claude and Zed.
Add an MCP component to your FlowHunt flow, then configure the MCP server using the provided JSON format, specifying the transport and URL for your Bitable MCP instance. This enables your AI agent to access all Bitable server tools.
Connect your AI agents to Lark Bitable for powerful database discovery, schema exploration, and automated querying. Streamline your data-driven processes with FlowHunt today.