LinkedIn MCP Runner
Empower your AI assistant with real LinkedIn insights—generate, analyze, and rewrite posts in your true voice, directly from your FlowHunt workflows.

What does “LinkedIn MCP Runner” MCP Server do?
The LinkedIn MCP Runner is an official implementation of the Model Context Protocol (MCP) that connects AI assistants such as Claude or ChatGPT with a user’s public LinkedIn data. Acting as a creative co-pilot, it gives these tools access to your actual LinkedIn posts so they can analyze engagement, learn your writing tone, and help generate or rewrite posts in your unique voice. By grounding the AI in your real content, it streamlines workflows for content creation, analytics, and engagement strategy, turning your assistant into a LinkedIn-savvy strategist that can deliver actionable insights and automate social media interaction while respecting user consent and privacy.
List of Prompts
No explicit prompt templates are listed in the repository or README.
List of Resources
No explicit MCP resources are described in the repository or README.
List of Tools
No explicit tools (such as database queries, file management, or API calls) are described in the repository or README.
Use Cases of this MCP Server
- Personalized Content Creation
The server enables users to generate LinkedIn posts crafted in their own voice, using insights from their previous content to maintain authenticity and maximize engagement.
- Content Analytics
Analyze the performance of past posts to determine what resonates most with an audience, guiding future content strategies.
- Automated Rewriting
Rewrite existing drafts or posts to better align with a user’s historic style and tone, making posts more compelling and on-brand.
- AI-Assisted Brainstorming
Brainstorm new content ideas based on past performance data and writing patterns, ensuring relevance and creativity.
- Multi-Platform Integration
Use the server seamlessly with both Claude and ChatGPT, allowing users to leverage LinkedIn data across their preferred AI assistants.
How to set it up
Windsurf
No setup instructions or configuration examples are provided for Windsurf.
Claude
- Download the Claude desktop app from claude.ai/download.
- Visit ligo.ertiqah.com/integrations/claude.
- Click “Generate Installation Command” (authentication with LiGo required).
- Copy the generated command and run it in your terminal.
- Open Claude and start chatting.
No JSON configuration is shown in the documentation.
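If you prefer to register the server manually rather than running the generated command, a typical Claude Desktop entry in claude_desktop_config.json follows the pattern below. The server name and launch command here are placeholders for illustration, not values taken from the LinkedIn MCP Runner documentation:
{
  "mcpServers": {
    "linkedin-mcp-runner": {
      "command": "npx",
      "args": ["-y", "linkedin-mcp-runner"]
    }
  }
}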
Cursor
No setup instructions or configuration examples are provided for Cursor.
Cline
No setup instructions or configuration examples are provided for Cline.
Securing API Keys
No information on API key management or environment variable usage is provided.
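As a general practice, MCP server configurations keep secrets out of the config file by referencing them in an env block rather than hard-coding them inline. The sketch below shows that common pattern only; the LIGO_API_KEY variable name and the launch command are hypothetical and are not documented by the LinkedIn MCP Runner:
{
  "mcpServers": {
    "linkedin-mcp-runner": {
      "command": "npx",
      "args": ["-y", "linkedin-mcp-runner"],
      "env": {
        "LIGO_API_KEY": "<your-ligo-api-key>"
      }
    }
  }
}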
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "MCP-name": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change “MCP-name” to the actual name of your MCP server (e.g., “github-mcp”, “weather-api”) and replace the URL with your own MCP server’s URL.
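For the LinkedIn MCP Runner specifically, the entry might look like the sketch below. The server name and URL are placeholders for illustration only, as the repository does not publish a hosted MCP endpoint:
{
  "linkedin-mcp-runner": {
    "transport": "streamable_http",
    "url": "https://your-linkedin-mcp-runner.example/mcp"
  }
}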
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | Not specified in repo or README |
| List of Resources | ⛔ | Not specified in repo or README |
| List of Tools | ⛔ | Not specified in repo or README |
| Securing API Keys | ⛔ | Not specified in repo or README |
| Sampling Support (less important in evaluation) | ⛔ | Not specified in repo or README |
Overall, the LinkedIn MCP Runner offers a distinctive AI-powered LinkedIn content experience, but the public documentation omits protocol-level details such as resources, prompt templates, and explicit tool lists. As a result, developers may find it easy to use but lacking in technical transparency.
MCP Score
| Has a LICENSE | ✅ (MIT) |
| --- | --- |
| Has at least one tool | ⛔ |
| Number of Forks | 2 |
| Number of Stars | 4 |
Rating:
Given the clear overview and use-case explanations but the lack of technical MCP details, I would rate the LinkedIn MCP Runner repository a 4 out of 10 for MCP clarity and developer readiness.
Frequently asked questions
- What is the LinkedIn MCP Runner?
The LinkedIn MCP Runner is an official implementation of the Model Context Protocol that connects AI assistants to your public LinkedIn data. It enables AI tools to analyze your posts, understand your writing style, and assist in creating or rewriting LinkedIn content tailored to your unique voice.
- How does the LinkedIn MCP Runner help with content creation?
It lets you generate posts and rewrites in your authentic tone, analyzes past engagement, and provides actionable insights for your LinkedIn strategy—directly via your favorite AI assistant.
- Is my privacy protected when using this MCP server?
Yes, the LinkedIn MCP Runner is designed to access only your public LinkedIn data with your consent, ensuring privacy and user control.
- Which AI assistants can use the LinkedIn MCP Runner?
The server works seamlessly with Claude, ChatGPT, and any AI assistant supporting the Model Context Protocol, making it easy to integrate into your FlowHunt workflows.
- How do I add the LinkedIn MCP Runner to my FlowHunt workflow?
In FlowHunt, add the MCP component to your flow, click to configure it, and insert your MCP server details using the provided JSON format. Be sure to use the correct server name and URL.
Supercharge Your LinkedIn Content with AI
Let FlowHunt and the LinkedIn MCP Runner transform your AI assistant into a LinkedIn-savvy strategist—generate posts, analyze engagement, and maintain your authentic voice.