
Lark (Feishu) MCP Server Integration
Integrate FlowHunt AI workflows with Lark (Feishu) to automate spreadsheet operations and boost productivity with the Lark MCP Server.
The Lark(Feishu) MCP Server is a Model Context Protocol (MCP) implementation designed to connect AI assistants with Lark (also known as Feishu), a popular collaborative office suite. This server enables AI-driven workflows to interact with Lark sheets, messages, documents, and more. By providing a standardized interface, it allows AI models to perform actions such as writing data to Lark spreadsheets, making it possible to automate data entry, reporting, or collaborative tasks. The integration enhances development workflows by bridging AI capabilities with real-time document management, streamlining interactions with Lark’s ecosystem for tasks that would otherwise require manual intervention.
No prompt templates were mentioned in the repository.
No specific resources are listed in the repository.
Prerequisite: Ensure you have Node.js and Windsurf installed.
Create a Lark (Feishu) App: Visit the Lark Open Platform and create an app.
Apply Permissions: Grant the app the sheets:spreadsheet:readonly permission.
Set Environment Variables: Set LARK_APP_ID and LARK_APP_SECRET in your environment.
Configure in Windsurf: Edit your configuration file to add the MCP server:
"mcpServers": {
"mcpServerLark": {
"description": "MCP Server For Lark(Feishu)",
"command": "uvx",
"args": [
"parent_of_servers_repo/servers/src/mcp_server_lark"
],
"env": {
"LARK_APP_ID": "xxx",
"LARK_APP_SECRET": "xxx"
}
}
}
Save and Restart: Save the config, restart Windsurf, and verify the connection.
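To verify the server independently of Windsurf, you can launch it over stdio and list its tools with the official MCP Python SDK. The snippet below is a minimal sketch, assuming the `mcp` package is installed; the placeholder path and the `xxx` credentials mirror the configuration above and must be replaced with your own values.

```python
# Minimal stdio connectivity check using the MCP Python SDK.
# The command, args, and credentials mirror the Windsurf config above.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=["parent_of_servers_repo/servers/src/mcp_server_lark"],
    env={"LARK_APP_ID": "xxx", "LARK_APP_SECRET": "xxx"},
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # The server's tools (e.g. write_excel) should appear here.
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```

If listing tools succeeds here, the same configuration should work once Windsurf is restarted.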
Set up Cline and Node.js.
Register and configure your Lark(Feishu) app with permissions.
Add the following to your Cline configuration:
"mcpServers": {
"mcpServerLark": {
"description": "MCP Server For Lark(Feishu)",
"command": "uvx",
"args": [
"parent_of_servers_repo/servers/src/mcp_server_lark"
],
"env": {
"LARK_APP_ID": "xxx",
"LARK_APP_SECRET": "xxx"
}
}
}
Save and restart Cline.
Test connection to confirm setup.
Always use environment variables to store sensitive configuration values such as API keys. Example:
"env": {
"LARK_APP_ID": "your_app_id",
"LARK_APP_SECRET": "your_app_secret"
}
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "lark-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP server as a tool with access to all of its functions and capabilities. Remember to change “lark-mcp” to the actual name of your MCP server and replace the URL with your own MCP server URL.
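For reference, the `streamable_http` transport in this configuration is the standard MCP streamable HTTP transport, so any MCP client can reach the same endpoint. The sketch below assumes the MCP Python SDK (`mcp` package) and reuses the placeholder URL from the configuration above.

```python
# Sketch: connect to the MCP server over streamable HTTP and list its tools.
# Replace the placeholder URL with your own MCP server endpoint.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    async with streamablehttp_client(
        "https://yourmcpserver.example/pathtothemcp/url"
    ) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```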
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | General description available |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | No resources specifically listed |
| List of Tools | ✅ | write_excel only |
| Securing API Keys | ✅ | Via environment variables in configuration |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |

| Roots Support | Sampling Support |
|---|---|
| ⛔ | ⛔ |
Based on the content found, this MCP server is in a very early stage, with minimal tooling and documentation. It primarily exposes a single tool and lacks details on prompts or resources. The configuration instructions are clear but basic. For now, the server scores low in terms of completeness and usability for broader MCP workflows.
| Has a LICENSE | ✅ |
|---|---|
| Has at least one tool | ✅ |
| Number of Forks | 1 |
| Number of Stars | 1 |
The Lark(Feishu) MCP Server is a Model Context Protocol implementation that connects AI assistants with the Lark (Feishu) office suite. It lets AI workflows interact with Lark sheets, messages, and documents, automating data entry, reporting, and collaboration tasks via FlowHunt.
Currently, the server exposes the 'write_excel' tool, which enables AI agents to write data to a Lark sheet and share a link to the result. An email address is required for access permission.
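For illustration, an agent (or any MCP client) would invoke the tool through a standard call_tool request. The sketch below assumes an already-initialized ClientSession (as in the stdio example above); the argument names "data" and "email" are hypothetical, since the repository does not document the tool's schema — inspect session.list_tools() for the real parameters.

```python
from mcp import ClientSession

async def write_attendance(session: ClientSession) -> None:
    """Hypothetical call to the write_excel tool via an initialized session.

    The "data" and "email" argument names are illustrative assumptions only;
    check the schema returned by session.list_tools() for the real parameters.
    """
    result = await session.call_tool(
        "write_excel",
        arguments={
            "data": [["Name", "Status"], ["Alice", "Present"]],
            "email": "you@example.com",  # address granted access to the sheet
        },
    )
    # The result is expected to include a link to the created Lark sheet.
    print(result.content)
```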
The server enables automated data entry, collaborative report generation, AI agent integration with Lark sheets, and workflow automation such as updating attendance or inventory lists directly from FlowHunt or other AI-powered platforms.
Always use environment variables to store sensitive values like LARK_APP_ID and LARK_APP_SECRET in your MCP configuration to avoid exposing them in code or version control.
Add the MCP component to your FlowHunt flow, edit its configuration, and insert your MCP server details in JSON format. This enables your AI agent to use all MCP server tools directly within your automated workflows.
Supercharge your Lark (Feishu) documents and workflows by connecting them directly to AI via FlowHunt’s Lark MCP Server.