Langfuse MCP Server Integration


Contact us to host your MCP Server in FlowHunt

FlowHunt provides an additional security layer between your internal systems and AI tools, giving you granular control over which tools are accessible from your MCP servers. MCP servers hosted in our infrastructure can be seamlessly integrated with FlowHunt's chatbot as well as popular AI platforms like ChatGPT, Claude, and various AI editors.

What does “Langfuse” MCP Server do?

The Langfuse MCP Server is a Model Context Protocol (MCP) server built for Langfuse Prompt Management. It lets AI assistants and developers access and manage prompts stored in Langfuse through the standardized MCP interface. By connecting AI clients to external prompt repositories over MCP, the server streamlines prompt discovery, retrieval, and compilation, supporting tasks such as dynamic prompt selection and variable substitution. This standardizes interactions between LLMs and prompt databases and simplifies prompt management, which is especially useful in environments where consistent prompt usage and sharing are required across teams or platforms.

List of Prompts

  • prompts/list: Lists all available prompts in the Langfuse repository. Supports optional cursor-based pagination and provides prompt names with their required arguments. All arguments are assumed optional.
  • prompts/get: Retrieves a specific prompt by name and compiles it with provided variables. Supports both text and chat prompts, transforming them into MCP prompt objects.
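Conceptually, the compilation step of prompts/get amounts to substituting caller-provided variables into the stored template. A minimal TypeScript sketch of that idea, assuming Langfuse's {{variable}} placeholder syntax (the server's actual implementation may differ):

```typescript
// Hypothetical sketch of prompt compilation: substitute {{variable}}
// placeholders with caller-provided values.
function compilePrompt(
  template: string,
  variables: Record<string, string>
): string {
  // Unmatched placeholders are left intact, mirroring the fact that
  // all prompt arguments are treated as optional.
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match: string, name: string) =>
    name in variables ? variables[name] : match
  );
}

const template = "Summarize the following {{language}} text: {{text}}";
const compiled = compilePrompt(template, {
  language: "English",
  text: "MCP standardizes tool access for LLMs.",
});
// compiled === "Summarize the following English text: MCP standardizes tool access for LLMs."
```

Leaving unmatched placeholders untouched rather than erroring keeps the behavior consistent with optional arguments.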

List of Resources

  • Langfuse Prompts Resource: Exposes all prompts labeled as production in Langfuse for discovery and retrieval by AI clients.
  • Prompt Arguments Resource: Returns information about prompt variables (all optional; no detailed descriptions due to Langfuse specification limits).
  • Paginated Prompts Resource: Supports listing prompts with pagination for efficient access in large repositories.

List of Tools

  • get-prompts: Lists available prompts with their arguments. Supports optional cursor parameter for pagination, returning a list of prompt names and arguments.
  • get-prompt: Retrieves and compiles a specific prompt. Requires a name parameter and optionally takes a JSON object of variables to populate the prompt.
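The cursor parameter follows the usual MCP pagination contract: each call returns a page of prompts plus a nextCursor to pass into the next call, with the cursor omitted on the final page. A self-contained sketch of that contract (types and names are illustrative, not the server's actual code):

```typescript
interface PromptSummary {
  name: string;
  arguments: string[]; // all arguments are optional per the Langfuse spec
}

interface PromptPage {
  prompts: PromptSummary[];
  nextCursor?: string; // absent on the final page
}

// Illustrative in-memory implementation of cursor-based listing,
// mimicking the get-prompts pagination contract.
function listPrompts(
  all: PromptSummary[],
  pageSize: number,
  cursor?: string
): PromptPage {
  const start = cursor ? parseInt(cursor, 10) : 0;
  const prompts = all.slice(start, start + pageSize);
  const next = start + pageSize;
  return next < all.length
    ? { prompts, nextCursor: String(next) }
    : { prompts };
}

const repo: PromptSummary[] = [
  { name: "summarize", arguments: ["text"] },
  { name: "translate", arguments: ["text", "language"] },
  { name: "classify", arguments: ["text", "labels"] },
];

const page1 = listPrompts(repo, 2);                   // two prompts + nextCursor
const page2 = listPrompts(repo, 2, page1.nextCursor); // final prompt, no cursor
```

A client simply loops until the response no longer contains a nextCursor.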

Use Cases of this MCP Server

  • Centralized Prompt Management: Streamline prompt updates and sharing across teams by managing all prompts in Langfuse and exposing them via MCP to various AI clients.
  • Standardized Prompt Retrieval: Ensure consistent prompt usage in LLM workflows by using MCP to retrieve validated, production-ready prompts on demand.
  • Dynamic Prompt Compilation: Enable LLMs or AI agents to compile prompts with runtime variables, allowing for flexible and dynamic interactions.
  • Prompt Discovery in Apps: Power prompt selection interfaces in developer tools or AI assistants by listing available prompts and their parameters.
  • Integration with LLMOps Workflows: Connect Langfuse prompt repositories to LLMOps platforms and agent frameworks via the MCP protocol for better prompt governance and auditing.

How to set it up

Windsurf

No specific instructions for Windsurf were found in the repository.

Claude

  1. Ensure Node.js and npm are installed.
  2. Build the server with:
    npm install
    npm run build
    
  3. Edit your claude_desktop_config.json to add the MCP server:
    {
      "mcpServers": {
        "langfuse": {
          "command": "node",
          "args": ["<absolute-path>/build/index.js"],
          "env": {
            "LANGFUSE_PUBLIC_KEY": "your-public-key",
            "LANGFUSE_SECRET_KEY": "your-secret-key",
            "LANGFUSE_BASEURL": "https://cloud.langfuse.com"
          }
        }
      }
    }
    
  4. Replace the environment variables with your actual Langfuse API keys.
  5. Save the configuration and restart Claude Desktop.
  6. Verify the server is available in the Claude Desktop MCP interface.

Cursor

  1. Ensure Node.js and npm are installed.
  2. Build the server:
    npm install
    npm run build
    
  3. In Cursor, add a new MCP server with:
    • Name: Langfuse Prompts
    • Type: command
    • Command:
      LANGFUSE_PUBLIC_KEY="your-public-key" LANGFUSE_SECRET_KEY="your-secret-key" LANGFUSE_BASEURL="https://cloud.langfuse.com" node <absolute-path>/build/index.js
      
  4. Replace the environment variables with your actual Langfuse API keys.
  5. Save and verify the server connection.

Cline

No specific instructions for Cline were found in the repository.

Securing API Keys

It is recommended to secure your API keys using environment variables. Here is an example JSON snippet for MCP server configuration:

{
  "mcpServers": {
    "langfuse": {
      "command": "node",
      "args": ["<absolute-path>/build/index.js"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "your-public-key",
        "LANGFUSE_SECRET_KEY": "your-secret-key",
        "LANGFUSE_BASEURL": "https://cloud.langfuse.com"
      }
    }
  }
}

Replace the values with your actual API credentials.
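On the server side, a common companion pattern is to fail fast when a required credential is missing, rather than sending unauthenticated requests to Langfuse. A hypothetical startup check (the actual server may handle this differently):

```typescript
// Hypothetical fail-fast check for required Langfuse credentials.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

const loadConfig = () => ({
  publicKey: requireEnv("LANGFUSE_PUBLIC_KEY"),
  secretKey: requireEnv("LANGFUSE_SECRET_KEY"),
  // Fall back to Langfuse Cloud when no base URL is set.
  baseUrl: process.env.LANGFUSE_BASEURL ?? "https://cloud.langfuse.com",
});
```

Surfacing a clear error at startup is easier to debug than an authentication failure deep inside a tool call.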

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

(Image: FlowHunt MCP flow)

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "langfuse": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP as a tool, with access to all of its functions and capabilities. Remember to change "langfuse" to the actual name of your MCP server and to replace the URL with your own MCP server's URL.


Overview

Section               | Availability | Details/Notes
----------------------|--------------|------------------------------------------------------
Overview              | ✅           | Langfuse MCP for prompt management
List of Prompts       | ✅           | prompts/list, prompts/get
List of Resources     | ✅           | Prompt listing, prompt variables, paginated resources
List of Tools         | ✅           | get-prompts, get-prompt
Securing API Keys     | ✅           | Via environment variables in MCP config
Sampling Support      | ⛔           | Not mentioned (less important in evaluation)

Based on the available sections and features, the Langfuse MCP Server is well documented and covers most critical MCP capabilities, especially for prompt management. The lack of explicit sampling or roots support slightly limits its extensibility. Overall, it's a strong implementation for its focus area.


MCP Score

Has a LICENSE         | ✅ (MIT)
Has at least one tool | ✅
Number of Forks       | 22
Number of Stars       | 98

Connect FlowHunt to Langfuse Prompt Management

Centralize and standardize your AI prompt workflows by integrating the Langfuse MCP Server with FlowHunt. Unlock efficient prompt discovery, retrieval, and dynamic compilation for advanced LLM operations.
