
Deepseek Thinker MCP Server
Bring Deepseek’s transparent reasoning and chain-of-thought AI outputs into your MCP-enabled assistants with support for both cloud and local deployments.
Deepseek Thinker MCP Server acts as a Model Context Protocol (MCP) provider, delivering Deepseek model reasoning content to MCP-enabled AI clients, such as Claude Desktop. It enables AI assistants to access Deepseek’s thought processes and reasoning outputs either through the Deepseek API service or from a local Ollama server. By integrating with this server, developers can enhance their AI workflows with focused reasoning, leveraging either cloud or local inference capabilities. This server is especially useful for scenarios where detailed reasoning chains or chain-of-thought (CoT) outputs are required to inform downstream AI tasks, making it valuable for advanced development, debugging, and AI agent enrichment.
No explicit prompt templates are mentioned in the repository or documentation.
No explicit MCP resources are detailed in the documentation or codebase.
The server exposes one tool, get-deepseek-thinker, which performs reasoning with the Deepseek model. It accepts a single parameter:

originPrompt (string) — The user’s original prompt.

To use the server in Windsurf, edit your windsurf_config.json and add the Deepseek Thinker entry to the mcpServers object:

{
  "deepseek-thinker": {
    "command": "npx",
    "args": [
      "-y",
      "deepseek-thinker-mcp"
    ],
    "env": {
      "API_KEY": "<Your API Key>",
      "BASE_URL": "<Your Base URL>"
    }
  }
}
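Once the server is registered, an MCP client invokes the tool over JSON-RPC. As an illustration (not code from this repository), a tools/call request for get-deepseek-thinker can be built and inspected with plain Python; the prompt text is a made-up placeholder:

```python
import json

def build_tool_call(prompt: str, request_id: int = 1) -> str:
    """Build an MCP tools/call JSON-RPC request for get-deepseek-thinker."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "get-deepseek-thinker",
            "arguments": {
                # The tool's only documented parameter.
                "originPrompt": prompt,
            },
        },
    }
    return json.dumps(request)

payload = build_tool_call("Why is the sky blue?")
decoded = json.loads(payload)
print(decoded["method"])  # tools/call
```

In practice the MCP client (Claude Desktop, Windsurf, FlowHunt) constructs and sends this message for you; the sketch only shows the wire shape the configuration above enables.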
For Claude Desktop, add the same entry to your claude_desktop_config.json:

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
For all platforms, API keys and sensitive configuration values should be provided as environment variables in the env section. Example:

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
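The reason for keeping credentials in the env section is that the server process reads them from its environment at runtime instead of from source files. A minimal sketch of that pattern (variable names match the config above; the demo value is a placeholder):

```python
import os

def load_credentials() -> dict:
    """Read the values supplied via the MCP config's "env" block."""
    api_key = os.environ.get("API_KEY")
    if not api_key:
        # Fail fast with a clear message instead of sending unauthenticated requests.
        raise RuntimeError("API_KEY is not set; configure it in the MCP 'env' section")
    return {
        "api_key": api_key,
        "base_url": os.environ.get("BASE_URL", ""),  # optional override
    }

os.environ.setdefault("API_KEY", "dummy-key-for-demo")
creds = load_credentials()
print(creds["api_key"])
```

Raising early on a missing key is a deliberate choice: a misconfigured env block then surfaces as a clear startup error rather than as a confusing authentication failure later.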
For local Ollama mode, set USE_OLLAMA to "true" in the env object:

"env": {
  "USE_OLLAMA": "true"
}
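Such a flag is typically consumed by checking the environment variable and routing requests to either the local Ollama endpoint or the cloud API. A hedged sketch of that dispatch (the endpoint constants are illustrative, not taken from the project):

```python
import os

# Illustrative endpoints; the real server derives these from its own config.
OLLAMA_URL = "http://localhost:11434"
CLOUD_URL = "https://api.example.com"

def select_backend() -> str:
    """Return the inference endpoint based on the USE_OLLAMA flag."""
    # Environment variables are strings, so compare against the literal "true".
    if os.environ.get("USE_OLLAMA", "false").lower() == "true":
        return OLLAMA_URL
    return CLOUD_URL

os.environ["USE_OLLAMA"] = "true"
print(select_backend())  # http://localhost:11434
```

Note the string comparison: setting USE_OLLAMA to anything other than "true" (including unset) falls back to the cloud endpoint.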
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "deepseek-thinker": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP server as a tool with access to all of its functions and capabilities. Remember to change “deepseek-thinker” to your actual MCP server name and to set the correct URL.
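Before pasting the fragment into FlowHunt, it can help to confirm that it is valid JSON and that each server entry carries the two required keys. A small standalone check, assuming only the structure shown above:

```python
import json

config_text = """
{
  "deepseek-thinker": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
"""

def validate(text: str) -> bool:
    """Return True if every server entry has both a transport and a url key."""
    config = json.loads(text)  # raises ValueError on malformed JSON
    return all(
        {"transport", "url"} <= set(entry) for entry in config.values()
    )

print(validate(config_text))  # True
```

A malformed fragment fails loudly at json.loads, and a structurally incomplete one returns False, which is easier to debug than a silent misconfiguration inside the flow.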
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates documented |
| List of Resources | ⛔ | No explicit MCP resources found |
| List of Tools | ✅ | get-deepseek-thinker tool |
| Securing API Keys | ✅ | Environment variables in config |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on these two tables, the Deepseek Thinker MCP Server provides a focused tool for reasoning integration and is easy to set up, but it lacks prompt templates and explicit resource definitions. The project is open source, has a moderate following, and supports secure credential handling. It scores 6/10 for overall completeness and utility as an MCP server.
| Has a LICENSE | ⛔ (No LICENSE file detected) |
|---|---|
| Has at least one tool | ✅ |
| Number of Forks | 12 |
| Number of Stars | 51 |
What is the Deepseek Thinker MCP Server?
It is a Model Context Protocol server that brings Deepseek model reasoning to MCP-enabled AI clients, offering chain-of-thought outputs and transparent model thinking for advanced AI workflows and debugging.

What tools does it provide?
It offers the get-deepseek-thinker tool for performing reasoning with the Deepseek model and returning structured reasoning outputs.

Does it support local inference?
Yes, Deepseek Thinker supports both cloud-based and local (Ollama) inference. Set the USE_OLLAMA environment variable to "true" for local mode.

How should API keys be handled?
API keys and sensitive values should be stored in the env section of your MCP server configuration as environment variables, not hardcoded in source files.

What happens if model limits are exceeded?
Limits are determined by the underlying Deepseek model or API; exceeding them may truncate responses or cause errors, so adjust your configuration and inputs accordingly.

Are prompt templates or extra resources included?
No explicit prompt templates or extra MCP resources are provided as part of the current Deepseek Thinker MCP Server documentation.
Integrate Deepseek Thinker MCP Server to give your AI agents detailed reasoning capabilities and boost development workflows.