Dify MCP Server
Connect AI assistants with Dify workflows to automate, orchestrate, and manage processes across cloud and local environments using the Dify MCP Server.

What does the Dify MCP Server do?
The Dify MCP (Model Context Protocol) Server is a bridge that connects AI assistants with Dify workflows, enabling them to interact with external data sources, APIs, and services. By exposing Dify workflow tools through the MCP interface, this server allows AI agents to trigger and manage Dify workflows programmatically. This enhances development workflows by letting AI systems query databases, manage files, or interact with APIs using Dify as the backend. The server supports configuration via environment variables or a YAML file, making it adaptable to both cloud and local setups.
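For a sense of what this bridging involves, below is a minimal sketch of a direct workflow invocation against Dify's workflow-run endpoint. The endpoint path and payload follow Dify's public API; the helper name, the singular DIFY_APP_SK variable, and the example input are illustrative assumptions, not the server's actual implementation.
import os
import requests

DIFY_BASE_URL = os.environ.get("DIFY_BASE_URL", "https://cloud.dify.ai/v1")
# Hypothetical single app key for this sketch; the MCP server itself reads
# the comma-separated DIFY_APP_SKS list described below.
DIFY_APP_SK = os.environ["DIFY_APP_SK"]

def run_workflow(inputs: dict, user: str = "mcp-demo") -> dict:
    # Trigger a Dify workflow run in blocking mode and return its JSON result.
    response = requests.post(
        f"{DIFY_BASE_URL}/workflows/run",
        headers={"Authorization": f"Bearer {DIFY_APP_SK}"},
        json={"inputs": inputs, "response_mode": "blocking", "user": user},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(run_workflow({"query": "Summarize today's support tickets"}))
The MCP server wraps calls like this behind MCP tool invocations, so the AI agent never handles the HTTP request or the app secret key directly.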
List of Prompts
No information provided about prompt templates in the repository.
List of Resources
No explicit resources documented in the repository or README.
List of Tools
No explicit list of tools is found in the repository or README. The README refers to “tools of MCP”, but no specific tool names or descriptions are provided.
Use Cases of this MCP Server
- Workflow Orchestration: Enables AI agents to trigger and control Dify workflows remotely, automating complex business or development processes.
- API Integration: Facilitates the connection between AI systems and external services through Dify, allowing seamless API calls and data retrieval.
- Cloud Workflow Access: Makes it easy to connect cloud-hosted Dify workflows to MCP-compatible clients, improving scalability and access.
- Environment-based Configuration: Supports both environment variable and YAML config setups, making it suitable for both local and cloud deployments (a sketch of such a YAML file follows this list).
- Centralized Workflow Management: Allows management and invocation of multiple Dify workflows from a single MCP server instance for streamlined operations.
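The repository references a config.yaml file but its contents are not reproduced here; a minimal sketch, with key names assumed to mirror the DIFY_BASE_URL and DIFY_APP_SKS environment variables, might look like this:
dify_base_url: "https://cloud.dify.ai/v1"
dify_app_sks:
  - "app-sk1"
  - "app-sk2"
Check the repository's README for the exact keys expected by the version you install.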
How to set it up
Windsurf
1. Ensure prerequisites such as Node.js and uvx/uv are installed.
2. Prepare configuration via environment variables or a YAML file.
3. Add the Dify MCP Server to your configuration:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "app-sk1,app-sk2"
      }
    }
  }
}
4. Save and restart Windsurf.
5. Verify that the server is running and workflows are accessible.
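Before wiring this into a client, you can smoke-test the same command from a shell; the values below are the same placeholders used throughout this page:
export DIFY_BASE_URL="https://cloud.dify.ai/v1"
export DIFY_APP_SKS="app-sk1,app-sk2"
uvx --from git+https://github.com/YanxingLiu/dify-mcp-server dify_mcp_server
If the process starts without errors, the same command and args should work in any of the client configurations on this page.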
Claude
1. Install uvx or uv and set up environment variables or a config file.
2. Add the following configuration to the Claude MCP client:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "app-sk1,app-sk2"
      }
    }
  }
}
3. Save, restart, and verify the setup.
Cursor
1. Make sure uvx/uv is installed and environment variables are set or config.yaml is prepared.
2. Insert the server configuration in Cursor’s MCP config:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "app-sk1,app-sk2"
      }
    }
  }
}
3. Save and restart Cursor.
4. Confirm the server is operating.
Cline
1. Install uvx/uv and set environment variables or provide a config.yaml.
2. Add the Dify MCP Server to the MCP configuration:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "app-sk1,app-sk2"
      }
    }
  }
}
3. Save and restart Cline.
4. Check that Dify workflows are reachable.
Securing API Keys
Always use environment variables to store sensitive data such as API keys. Example configuration:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "${DIFY_APP_SKS}" // Use system environment variable
      }
    }
  }
}
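The ${DIFY_APP_SKS} placeholder is intended to be resolved from your system environment, so set the variable in the shell (or secrets manager) that launches the client rather than hard-coding keys in the config file, for example:
export DIFY_APP_SKS="app-sk1,app-sk2"
This keeps the secret out of any configuration file you might commit to version control.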
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "dify-mcp-server": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change “dify-mcp-server” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompts/templates found |
| List of Resources | ⛔ | No explicit resources documented |
| List of Tools | ⛔ | No explicit tools listed |
| Securing API Keys | ✅ | Env vars & config.yaml supported |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on the available information, this MCP server offers basic but robust integration of Dify workflows into MCP-compatible platforms. However, documentation about prompts, resources, and tools is missing, which lowers its usability for advanced or standardized LLM interactions.
Our opinion
MCP Score: 4/10.
The dify-mcp-server is easy to set up and provides good cloud/local configuration support, but lacks documentation on prompts, resources, and tool capabilities, which limits its broader MCP utility.
MCP Score
| Has a LICENSE | ⛔ (no LICENSE file detected) |
| --- | --- |
| Has at least one tool | ⛔ |
| Number of Forks | 31 |
| Number of Stars | 238 |
Frequently asked questions
- What is the Dify MCP Server?
The Dify MCP Server acts as a gateway between AI assistants and Dify workflows, enabling the automation and orchestration of external API calls, file management, and workflow execution via the MCP protocol.
- What are the main use cases for this MCP Server?
It is used for workflow orchestration, API integration, cloud workflow access, and centralized management of multiple Dify workflows from a single MCP server instance.
- How do I secure my API keys when configuring the server?
Always use environment variables to store sensitive information such as API keys. You can reference these variables in your server configuration to keep your credentials secure.
- Does the Dify MCP Server provide prompt templates or tools?
No prompt templates or explicit tool lists are provided in the current documentation, which may limit advanced LLM use cases.
- How does the Dify MCP Server integrate with FlowHunt?
Add the MCP component to your flow in FlowHunt, then configure it with your Dify MCP Server details. This enables your AI agent to access all workflow functions exposed by the server.
Integrate Dify Workflows with FlowHunt
Supercharge your AI agents by connecting them to Dify workflows through the Dify MCP Server. Automate complex processes and API calls with ease.