py-mcp-line: LINE Chat MCP Server
A robust Python MCP server for AI-powered access and analysis of LINE Bot conversations, supporting real-time and historical data integrations.

What does “py-mcp-line” MCP Server do?
The py-mcp-line MCP Server is a Python implementation of the Model Context Protocol (MCP) that gives AI assistants, such as language models, standardized access to LINE Bot messages. Acting as a bridge between AI clients and LINE conversations, it lets LLMs read, analyze, and interact with LINE data in real time. Built on FastAPI and asynchronous Python for responsiveness, the server receives webhook events, validates incoming requests, handles multiple message types, and stores messages as structured JSON. This streamlines development workflows for conversational analysis, bot development, and the integration of LINE messaging data into broader AI-driven applications.
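To make that data flow concrete, here is a minimal, hypothetical sketch of how a FastAPI webhook endpoint could validate a LINE request and append incoming messages to a JSON file. The endpoint path, event handling, and file layout are assumptions for illustration, not taken from the project's server.py; only the signature check follows LINE's documented X-Line-Signature scheme.

```python
# Minimal sketch (assumptions, not the project's actual server.py): how a
# FastAPI-based bridge might validate a LINE webhook and archive messages as JSON.
import base64
import hashlib
import hmac
import json
import os
from pathlib import Path

from fastapi import FastAPI, Header, HTTPException, Request

app = FastAPI()
CHANNEL_SECRET = os.environ["LINE_CHANNEL_SECRET"]            # matches the env keys used in the setup section below
MESSAGES_FILE = Path(os.environ.get("MESSAGES_FILE", "data/messages.json"))


def valid_signature(body: bytes, signature: str) -> bool:
    # LINE signs each webhook body with HMAC-SHA256 of the channel secret, base64-encoded.
    digest = hmac.new(CHANNEL_SECRET.encode(), body, hashlib.sha256).digest()
    return hmac.compare_digest(base64.b64encode(digest).decode(), signature)


@app.post("/webhook")                                         # endpoint path is an assumption
async def webhook(request: Request, x_line_signature: str = Header(...)):
    body = await request.body()
    if not valid_signature(body, x_line_signature):
        raise HTTPException(status_code=400, detail="invalid signature")
    events = json.loads(body).get("events", [])
    stored = json.loads(MESSAGES_FILE.read_text()) if MESSAGES_FILE.exists() else []
    stored.extend(e["message"] for e in events if e.get("type") == "message")
    MESSAGES_FILE.parent.mkdir(parents=True, exist_ok=True)
    MESSAGES_FILE.write_text(json.dumps(stored, ensure_ascii=False, indent=2))
    return {"status": "ok"}
```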
List of Prompts
No prompt templates are provided by this server.
List of Resources
- LINE Message Resources
- Exposes message types as resources with URIs of the form line://<message_type>/data, enabling clients to access different categories of LINE messages (see the example URIs after this list).
- Resource Descriptions
- Each resource includes metadata such as a description and MIME type to help clients understand and use the data correctly.
- Message Filtering
- Resources support filtering by date, user, or content, allowing targeted retrieval of conversation data.
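As an illustration only, resource URIs for the supported message types might look like the following; the exact type identifiers and any filter syntax are determined by server.py:

```
line://text/data       # text messages
line://sticker/data    # sticker messages
line://image/data      # image messages
```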
List of Tools
- list_resources
- Lists all available message types and provides resource URIs for clients to access.
- read_resource
- Reads and returns messages of a specified type, supporting advanced filtering (e.g., by date or user). A client-side sketch follows this list.
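The sketch below shows how a client could invoke these tools through the official MCP Python SDK over stdio. The tool names match the list above, but the argument names ("uri", "date") and the filter syntax are assumptions.

```python
# Illustrative client sketch using the official MCP Python SDK over stdio.
# Tool names match the list above; argument names ("uri", "date") are assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(
        command="python",
        args=["server.py"],
        env={
            "LINE_CHANNEL_SECRET": "your_channel_secret",
            "LINE_ACCESS_TOKEN": "your_access_token",
            "MESSAGES_FILE": "data/messages.json",
        },
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Enumerate the message-type resources the server exposes.
            listing = await session.call_tool("list_resources", arguments={})
            print(listing)
            # Read text messages, narrowed by an (assumed) date filter.
            result = await session.call_tool(
                "read_resource",
                arguments={"uri": "line://text/data", "date": "2024-01-01"},
            )
            print(result)


asyncio.run(main())
```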
Use Cases of this MCP Server
- Conversational Data Analysis
- Developers can retrieve and analyze historical LINE chat data for sentiment analysis, topic modeling, or user behavior insights (see the sketch after this list).
- Chatbot Development
- Enables AI-driven assistants to interact with and respond to LINE messages, facilitating sophisticated conversational bots.
- Message Archiving
- Automates the storage and archival of LINE messages in JSON format for compliance or record-keeping purposes.
- Multimodal Data Integration
- Supports text, sticker, and image messages, allowing for analysis and processing of diverse data types in LINE conversations.
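For the analysis and archiving use cases, a quick offline pass over the stored messages file is often enough to get started. The snippet below is illustrative only and assumes the file holds a JSON list of LINE message objects with a type field and, for text messages, a text field; the actual schema depends on how server.py writes MESSAGES_FILE.

```python
# Illustrative offline analysis of the archived messages file. The schema is an
# assumption: a JSON list of LINE message objects with "type" and, for text
# messages, a "text" field; adapt it to whatever server.py actually writes.
import json
from collections import Counter
from pathlib import Path

messages = json.loads(Path("data/messages.json").read_text(encoding="utf-8"))

# How many messages of each type (text, sticker, image, ...) were archived?
type_counts = Counter(m.get("type", "unknown") for m in messages)
print("Messages by type:", dict(type_counts))

# Naive word frequencies over text messages as a starting point for topic analysis.
word_counts = Counter(
    word.lower()
    for m in messages
    if m.get("type") == "text"
    for word in m.get("text", "").split()
)
print("Top words:", word_counts.most_common(10))
```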
How to set it up
Claude
- Prerequisites: Ensure Python 3.8+ is installed along with all dependencies from requirements.txt.
- Locate Configuration File: On macOS, open ~/Library/Application Support/Claude/claude_desktop_config.json. On Windows, open %APPDATA%/Claude/claude_desktop_config.json.
- Add MCP Server: Insert the following JSON snippet into the mcpServers object:
{
  "mcpServers": {
    "line": {
      "command": "python",
      "args": ["server.py"],
      "env": {
        "LINE_CHANNEL_SECRET": "your_channel_secret",
        "LINE_ACCESS_TOKEN": "your_access_token",
        "SERVER_PORT": "8000",
        "MESSAGES_FILE": "data/messages.json"
      }
    }
  }
}
- Save and Restart: Save the file and restart Claude Desktop to apply the changes.
- Verify Setup: Ensure the MCP server is running and accessible from Claude (see the smoke test below).
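To check the webhook path end to end, you can post a signed sample event to the running server. The snippet follows LINE's documented X-Line-Signature scheme (base64-encoded HMAC-SHA256 of the request body with the channel secret); the /webhook path and port 8000 are assumptions, so adjust them to your configuration.

```python
# Quick smoke test (illustrative): post a signed sample event to the local server.
# The /webhook path and port 8000 are assumptions; adjust them to your configuration.
import base64
import hashlib
import hmac
import json
import os
import urllib.request

secret = os.environ["LINE_CHANNEL_SECRET"]
body = json.dumps(
    {"events": [{"type": "message", "message": {"type": "text", "text": "ping"}}]}
).encode()

# LINE's X-Line-Signature header carries a base64 HMAC-SHA256 of the request body.
signature = base64.b64encode(hmac.new(secret.encode(), body, hashlib.sha256).digest()).decode()

request = urllib.request.Request(
    "http://localhost:8000/webhook",
    data=body,
    headers={"Content-Type": "application/json", "X-Line-Signature": signature},
)
with urllib.request.urlopen(request) as response:
    print(response.status)  # expect 200 if the event was accepted and stored
```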
Securing API Keys
Store sensitive credentials in environment variables via the env key, as shown above, to prevent accidental exposure.
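For example, a minimal pattern for loading these values at startup (variable names match the configuration example above) could look like this:

```python
# Minimal pattern for loading credentials from the environment at startup,
# using the same variable names as the configuration example above.
import os

LINE_CHANNEL_SECRET = os.environ["LINE_CHANNEL_SECRET"]    # required; fail fast if missing
LINE_ACCESS_TOKEN = os.environ["LINE_ACCESS_TOKEN"]        # required
SERVER_PORT = int(os.environ.get("SERVER_PORT", "8000"))   # optional, with defaults
MESSAGES_FILE = os.environ.get("MESSAGES_FILE", "data/messages.json")
```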
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "line": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “line” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
Overview
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | Provided in README.md |
| List of Prompts | ⛔ | No prompt templates found in the repository |
| List of Resources | ✅ | Resource listing and reading via API, supports filtering |
| List of Tools | ✅ | list_resources, read_resource in server.py |
| Securing API Keys | ✅ | Environment variables documented |
Sampling Support (less important in evaluation) | ⛔ | No explicit mention of sampling support |
Based on the above, py-mcp-line provides a solid MCP implementation focused on LINE message access, with clear resource and tool exposure, environment-based security, and real-world setup guidance for Claude. The lack of prompt templates and explicit sampling/root features limits its overall score, but for conversational analysis and bot integration it is functional and well-documented.
MCP Score
| Has a LICENSE | ✅ (MIT) |
|---|---|
| Has at least one tool | ✅ |
| Number of Forks | 6 |
| Number of Stars | 17 |
Overall, I would rate this MCP implementation a 6.5/10. It covers the core functionalities for LINE message integration and is well-suited for developers needing conversational data access, but lacks advanced MCP features like prompt templates, sampling, and roots support.
Frequently asked questions
- What is py-mcp-line?
py-mcp-line is a Python implementation of the Model Context Protocol (MCP) that provides AI assistants with secure, structured access to LINE Bot conversations for analysis, integration, and archiving.
- What resources does the MCP server expose?
It exposes LINE message types (such as text, sticker, image) as resources accessible via URIs, supporting advanced filtering by date, user, or content.
- What are common use cases?
Typical use cases include conversational data analysis (sentiment, topic modeling), chatbot development, message archiving, and multimodal data processing within LINE conversations.
- How do I secure my LINE credentials?
Store sensitive data like channel secrets and access tokens in environment variables as shown in the configuration examples, avoiding hardcoding in your codebase.
- Can I use this MCP server in FlowHunt?
Yes! Add an MCP component to your FlowHunt flow, then configure it with your py-mcp-line server details to enable AI agent access to LINE messages and tools.
- Does py-mcp-line support prompt templates or sampling?
No, it does not include prompt templates or explicit sampling/root features. It focuses on providing resource access and message handling.
Integrate LINE Messaging with AI Workflows
Use py-mcp-line to connect your AI agents to LINE chats for advanced conversational analysis, bot development, or message archiving.