Langfuse MCP Server Integration
Integrate the Langfuse MCP Server with FlowHunt to centrally manage, retrieve, and compile AI prompts from Langfuse, empowering dynamic and standardized LLM workflows.

What does “Langfuse” MCP Server do?
The Langfuse MCP Server is a Model Context Protocol (MCP) server for Langfuse Prompt Management. It lets AI assistants and developers access and manage prompts stored in Langfuse through the standardized MCP interface. By connecting AI clients to external prompt repositories over MCP, the server streamlines prompt discovery, retrieval, and compilation, supporting tasks such as dynamic prompt selection and variable substitution. This standardizes interactions between LLMs and prompt databases and simplifies prompt management, which is especially useful in environments where consistent prompt usage and sharing are required across teams or platforms.
List of Prompts
- prompts/list: Lists all available prompts in the Langfuse repository. Supports optional cursor-based pagination and returns prompt names with their required arguments. All arguments are assumed optional.
- prompts/get: Retrieves a specific prompt by name and compiles it with the provided variables. Supports both text and chat prompts, transforming them into MCP prompt objects.
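Both endpoints are ordinary JSON-RPC 2.0 requests under MCP. As a rough sketch (the method names and `name`/`arguments`/`cursor` parameter shapes follow the MCP prompts specification; the example prompt name and values are hypothetical):

```python
import json

def prompts_list_request(req_id, cursor=None):
    """Build a JSON-RPC request that lists prompts, optionally resuming from a cursor."""
    params = {}
    if cursor is not None:
        params["cursor"] = cursor
    return {"jsonrpc": "2.0", "id": req_id, "method": "prompts/list", "params": params}

def prompts_get_request(req_id, name, arguments=None):
    """Build a JSON-RPC request that retrieves and compiles one prompt by name."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "prompts/get",
        "params": {"name": name, "arguments": arguments or {}},
    }

# Example: request the "welcome-email" prompt compiled with one variable.
print(json.dumps(prompts_get_request(1, "welcome-email", {"customer": "Ada"}), indent=2))
```

In practice an MCP client library sends these payloads for you; the sketch only shows what crosses the wire.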
List of Resources
- Langfuse Prompts Resource: Exposes all prompts labeled as production in Langfuse for discovery and retrieval by AI clients.
- Prompt Arguments Resource: Returns information about prompt variables (all optional; no detailed descriptions due to Langfuse specification limits).
- Paginated Prompts Resource: Supports listing prompts with pagination for efficient access in large repositories.
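Cursor-based pagination follows the usual pattern: call the listing repeatedly, passing back the cursor from the previous page until none is returned. A minimal sketch, with `fetch_page` as a stand-in for the real prompts/list call (the page data here is fabricated for illustration):

```python
def fetch_page(cursor=None):
    """Stand-in for a real prompts/list call; returns (prompt_names, next_cursor)."""
    pages = {
        None: (["summarize", "translate"], "page-2"),
        "page-2": (["classify"], None),
    }
    return pages[cursor]

def list_all_prompts():
    """Collect every prompt name by following the cursor until it is exhausted."""
    prompts, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor)
        prompts.extend(page)
        if cursor is None:
            return prompts

print(list_all_prompts())  # → ['summarize', 'translate', 'classify']
```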
List of Tools
- get-prompts: Lists available prompts with their arguments. Supports an optional cursor parameter for pagination, returning a list of prompt names and arguments.
- get-prompt: Retrieves and compiles a specific prompt. Requires a name parameter and optionally takes a JSON object of variables to populate the prompt.
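"Compiling" here means substituting runtime variables into the stored template. Langfuse prompts use mustache-style {{variable}} placeholders; a simplified sketch of the substitution step (not the server's actual implementation), where missing variables are left in place since all arguments are optional:

```python
import re

def compile_prompt(template, variables):
    """Replace {{name}} placeholders with values from the variables dict.
    Unknown placeholders are left untouched, since all arguments are optional."""
    def substitute(match):
        key = match.group(1).strip()
        return str(variables.get(key, match.group(0)))
    return re.sub(r"\{\{([^}]+)\}\}", substitute, template)

compiled = compile_prompt("Hello {{name}}, your plan is {{plan}}.", {"name": "Ada"})
print(compiled)  # → Hello Ada, your plan is {{plan}}.
```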
Use Cases of this MCP Server
- Centralized Prompt Management: Streamline prompt updates and sharing across teams by managing all prompts in Langfuse and exposing them via MCP to various AI clients.
- Standardized Prompt Retrieval: Ensure consistent prompt usage in LLM workflows by using MCP to retrieve validated, production-ready prompts on demand.
- Dynamic Prompt Compilation: Enable LLMs or AI agents to compile prompts with runtime variables, allowing for flexible and dynamic interactions.
- Prompt Discovery in Apps: Power prompt selection interfaces in developer tools or AI assistants by listing available prompts and their parameters.
- Integration with LLMOps Workflows: Connect Langfuse prompt repositories to LLMOps platforms and agent frameworks via the MCP protocol for better prompt governance and auditing.
How to set it up
Windsurf
No specific instructions for Windsurf were found in the repository.
Claude
- Ensure Node.js and npm are installed.
- Build the server with:
npm install
npm run build
- Edit your claude_desktop_config.json to add the MCP server:
{
  "mcpServers": {
    "langfuse": {
      "command": "node",
      "args": ["<absolute-path>/build/index.js"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "your-public-key",
        "LANGFUSE_SECRET_KEY": "your-secret-key",
        "LANGFUSE_BASEURL": "https://cloud.langfuse.com"
      }
    }
  }
}
- Replace the environment variables with your actual Langfuse API keys.
- Save the configuration and restart Claude Desktop.
- Verify the server is available in the Claude Desktop MCP interface.
Cursor
- Ensure Node.js and npm are installed.
- Build the server:
npm install
npm run build
- In Cursor, add a new MCP server with:
- Name: Langfuse Prompts
- Type: command
- Command:
LANGFUSE_PUBLIC_KEY="your-public-key" LANGFUSE_SECRET_KEY="your-secret-key" LANGFUSE_BASEURL="https://cloud.langfuse.com" node absolute-path/build/index.js
- Replace the environment variables with your actual Langfuse API keys.
- Save and verify the server connection.
Cline
No specific instructions for Cline were found in the repository.
Securing API Keys
It is recommended to secure your API keys using environment variables. Here is an example JSON snippet for MCP server configuration:
{
"mcpServers": {
"langfuse": {
"command": "node",
"args": ["<absolute-path>/build/index.js"],
"env": {
"LANGFUSE_PUBLIC_KEY": "your-public-key",
"LANGFUSE_SECRET_KEY": "your-secret-key",
"LANGFUSE_BASEURL": "https://cloud.langfuse.com"
}
}
}
}
Replace the values with your actual API credentials.
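To keep credentials out of config files entirely, a small wrapper can read them from the environment and fail fast when one is missing. A sketch using the variable names from the configuration above:

```python
import os

REQUIRED = ("LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY")

def load_langfuse_env():
    """Read Langfuse credentials from the environment.
    LANGFUSE_BASEURL falls back to the Langfuse Cloud default."""
    missing = [name for name in REQUIRED if not os.environ.get(name)]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return {
        "public_key": os.environ["LANGFUSE_PUBLIC_KEY"],
        "secret_key": os.environ["LANGFUSE_SECRET_KEY"],
        "base_url": os.environ.get("LANGFUSE_BASEURL", "https://cloud.langfuse.com"),
    }
```

Failing at startup on a missing key is preferable to a confusing authentication error later in the flow.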
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
"langfuse": {
"transport": "streamable_http",
"url": "https://yourmcpserver.example/pathtothemcp/url"
}
}
Once configured, the AI agent is now able to use this MCP as a tool with access to all its functions and capabilities. Remember to change "langfuse"
to the actual name of your MCP server and replace the URL with your own MCP server URL.
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Langfuse MCP for prompt management |
| List of Prompts | ✅ | prompts/list, prompts/get |
| List of Resources | ✅ | Prompt listing, prompt variables, paginated resources |
| List of Tools | ✅ | get-prompts, get-prompt |
| Securing API Keys | ✅ | Via environment variables in MCP config |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Based on the available sections and features, the Langfuse MCP Server is well documented and covers most critical MCP capabilities, especially for prompt management. The lack of explicit sampling or roots support slightly limits its extensibility. Overall, it is a strong implementation for its focus area.
MCP Score
| Has a LICENSE | ✅ (MIT) |
| --- | --- |
| Has at least one tool | ✅ |
| Number of Forks | 22 |
| Number of Stars | 98 |
Frequently asked questions
- What is the Langfuse MCP Server?
The Langfuse MCP Server is a Model Context Protocol server that connects AI clients like FlowHunt to Langfuse’s prompt management platform. It enables prompt discovery, retrieval, and dynamic compilation, streamlining prompt workflows for LLMs and agents.
- Which features does the Langfuse MCP Server support?
It supports listing all available prompts, retrieving and compiling prompts with variables, paginated prompt discovery, and exposing prompt arguments. All arguments are assumed optional, and the server is designed for production prompt management in LLMOps scenarios.
- How do I secure my Langfuse API keys?
You should store API keys as environment variables in your MCP server configuration to keep them secure. See the provided configuration examples for details on environment variable setup.
- Can I use the Langfuse MCP Server in FlowHunt workflows?
Yes! Add the MCP component in your FlowHunt flow, configure it to point to your Langfuse MCP server, and your agents can dynamically access, discover, and compile prompts from Langfuse.
- What are common use cases for this integration?
Centralized prompt management, standardized retrieval for LLM workflows, dynamic prompt compilation with runtime variables, powering prompt selection interfaces, and integration with LLMOps tools for better governance and auditing.
Connect FlowHunt to Langfuse Prompt Management
Centralize and standardize your AI prompt workflows by integrating the Langfuse MCP Server with FlowHunt. Unlock efficient prompt discovery, retrieval, and dynamic compilation for advanced LLM operations.