Connect AI assistants with Dify workflows to automate, orchestrate, and manage processes across cloud and local environments using the Dify MCP Server.
The Dify MCP (Model Context Protocol) Server is a bridge that connects AI assistants with Dify workflows, enabling them to interact with external data sources, APIs, and services. By exposing Dify workflow tools through the MCP interface, this server allows AI agents to trigger and manage Dify workflows programmatically. This enhances development workflows by letting AI systems query databases, manage files, or interact with APIs using Dify as the backend. The server supports configuration via environment variables or YAML files, making it adaptable for both cloud and local setups.
No information provided about prompt templates in the repository.
No explicit resources documented in the repository or README.
No explicit list of tools found in the repository or README. There is reference to “tools of MCP” but no specific tool names or descriptions are provided.
Ensure prerequisites such as Node.js and uvx/uv are installed.
Prepare configuration via environment variables or a YAML file.
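As a sketch, the two environment variables used in the configurations below can be exported in your shell before launching the client (the variable names come from the server's configuration; the values shown are placeholders):

```shell
# Base URL of the Dify API (cloud endpoint shown; adjust for self-hosted Dify)
export DIFY_BASE_URL="https://cloud.dify.ai/v1"

# Comma-separated list of Dify app secret keys (placeholder values)
export DIFY_APP_SKS="app-sk1,app-sk2"
```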
Add the Dify MCP Server to your configuration:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "app-sk1,app-sk2"
      }
    }
  }
}
Save and restart Windsurf.
Verify that the server is running and workflows are accessible.
Install uvx/uv and set up environment variables or a config file.
Add the following configuration to the Claude MCP client:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "app-sk1,app-sk2"
      }
    }
  }
}
Save, restart, and verify setup.
Make sure uvx/uv is installed and environment variables are set or config.yaml is prepared.
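If you prefer a file over environment variables, a config.yaml along these lines should work (the key names are an assumption based on the environment variables above; check the repository's README for the exact schema):

```yaml
dify_base_url: "https://cloud.dify.ai/v1"
dify_app_sks:
  - "app-sk1"
  - "app-sk2"
```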
Insert the server configuration in Cursor’s MCP config:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "app-sk1,app-sk2"
      }
    }
  }
}
Save and restart Cursor.
Confirm server operation.
Install uvx/uv and set environment variables or provide a config.yaml.
Add the Dify MCP Server to the MCP configuration:
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "app-sk1,app-sk2"
      }
    }
  }
}
Save and restart Cline.
Check that Dify workflows are reachable.
Always use environment variables to store sensitive data such as API keys. In the example below, `${DIFY_APP_SKS}` references a system environment variable rather than embedding the secret keys directly in the file (note that standard JSON does not allow comments, so the reference is shown without one):
{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/YanxingLiu/dify-mcp-server",
        "dify_mcp_server"
      ],
      "env": {
        "DIFY_BASE_URL": "https://cloud.dify.ai/v1",
        "DIFY_APP_SKS": "${DIFY_APP_SKS}"
      }
    }
  }
}
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "dify-mcp-server": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP server as a tool with access to all its functions and capabilities. Remember to change "dify-mcp-server" to the actual name of your MCP server and replace the URL with your own MCP server URL.
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompts/templates found |
| List of Resources | ⛔ | No explicit resources documented |
| List of Tools | ⛔ | No explicit tools listed |
| Securing API Keys | ✅ | Env vars & config.yaml supported |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on the available information, this MCP server offers basic but robust integration of Dify workflows into MCP-compatible platforms. However, documentation about prompts, resources, and tools is missing, which lowers its usability for advanced or standardized LLM interactions.
MCP Score: 4/10.
The dify-mcp-server is easy to set up and provides good cloud/local configuration support, but lacks documentation on prompts, resources, and tool capabilities, which limits its broader MCP utility.
| Has a LICENSE | ⛔ (no LICENSE file detected) |
|---|---|
| Has at least one tool | ⛔ |
| Number of Forks | 31 |
| Number of Stars | 238 |
The Dify MCP Server acts as a gateway between AI assistants and Dify workflows, enabling the automation and orchestration of external API calls, file management, and workflow execution via the MCP protocol.
It is used for workflow orchestration, API integration, cloud workflow access, and centralized management of multiple Dify workflows from a single MCP server instance.
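To illustrate what happens behind the scenes, here is a minimal Python sketch that constructs (but does not send) the kind of HTTP request Dify's workflow-run endpoint expects. The `/workflows/run` path and payload shape are assumptions based on Dify's public API documentation, not taken from this server's code, so verify them against your deployment:

```python
import os

def build_workflow_request(app_sk: str, inputs: dict, user: str = "mcp-demo"):
    """Assemble URL, headers, and JSON payload for a Dify workflow run.

    Assumes Dify's POST /workflows/run endpoint; check the Dify API
    docs for your deployment before actually sending the request.
    """
    base_url = os.environ.get("DIFY_BASE_URL", "https://cloud.dify.ai/v1")
    url = f"{base_url}/workflows/run"
    headers = {
        "Authorization": f"Bearer {app_sk}",  # one of the DIFY_APP_SKS keys
        "Content-Type": "application/json",
    }
    payload = {
        "inputs": inputs,             # workflow input variables
        "response_mode": "blocking",  # wait for the full result
        "user": user,                 # end-user identifier for Dify logs
    }
    return url, headers, payload

url, headers, payload = build_workflow_request("app-sk1", {"query": "hello"})
print(url)
```

Sending the assembled request with any HTTP client (e.g. `requests.post(url, headers=headers, json=payload)`) would then trigger the corresponding Dify workflow.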
Always use environment variables to store sensitive information such as API keys. You can reference these variables in your server configuration to keep your credentials secure.
No prompt templates or explicit tool lists are provided in the current documentation, which may limit advanced LLM use cases.
Add the MCP component to your flow in FlowHunt, then configure it with your Dify MCP Server details. This enables your AI agent to access all workflow functions exposed by the server.
Supercharge your AI agents by connecting them to Dify workflows through the Dify MCP Server. Automate complex processes and API calls with ease.