
iTerm MCP Server
Enable AI assistants to securely and efficiently interact with your iTerm terminal for streamlined development, REPL automation, and command execution.
The iterm-mcp MCP Server is a Model Context Protocol server designed to provide AI assistants with direct access to your iTerm session. This powerful tool enables large language models (LLMs) to execute commands, interact with REPLs, and manage terminal workflows within the currently active iTerm terminal. By bridging AI clients to the terminal environment, iterm-mcp enhances development workflows through natural, shared access—facilitating tasks such as running shell commands, inspecting terminal output, and sending control characters (e.g., interrupt signals). Its efficient token usage ensures only relevant output is surfaced, and its minimal dependencies make it easy to integrate with platforms like Claude Desktop and other MCP-enabled clients, streamlining CLI and REPL assistance for developers.
No prompt templates are explicitly mentioned in the repository.
No explicit resources are documented in the repository.
write_to_terminal
Writes input to the active iTerm terminal session, typically to run shell commands, and returns the number of lines of output produced.
read_terminal_output
Reads a specified number of lines from the active iTerm terminal output, allowing models to retrieve recent terminal activity.
send_control_character
Sends control characters (like Ctrl+C or Ctrl+Z) to the active iTerm terminal, supporting process interruption or suspension.
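To make the tool interface concrete, here is a minimal sketch of the JSON-RPC tools/call requests an MCP client could send to run a command and then read back its output. The argument names command and linesOfOutput are assumptions for illustration and may differ from the server's actual input schema.
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "write_to_terminal",
    "arguments": {
      "command": "ls -la"
    }
  }
}
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "read_terminal_output",
    "arguments": {
      "linesOfOutput": 25
    }
  }
}
In practice, the MCP client (for example, Claude Desktop) issues these calls on the model's behalf; the second request retrieves only the most recent 25 lines, so just the relevant output is surfaced to the model.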
REPL Automation and Assistance
Enables LLMs to interact with live REPL sessions, executing commands, inspecting results, and managing multi-step workflows interactively.
CLI Workflow Automation
Allows AI agents to execute and monitor shell commands, automate routine development tasks, and handle output parsing or error handling.
Terminal Output Inspection
Models can inspect the current or past terminal output, answer questions about what’s on the screen, and assist with debugging or log analysis.
Process Management
Through control characters, developers can delegate process interruption, suspension, or continuation tasks to AI assistants for improved workflow safety (see the sketch after this list).
Code Execution and Testing
Facilitates running code snippets or scripts directly within the terminal, with the AI model capturing outputs and iterating based on results.
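As a sketch of the process-management use case, an assistant could interrupt a long-running command by sending Ctrl+C through the send_control_character tool. The letter argument name is an assumption for illustration and may differ from the server's actual input schema.
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "send_control_character",
    "arguments": {
      "letter": "C"
    }
  }
}
Here "C" stands for Ctrl+C, which interrupts the foreground process in the active iTerm session; a follow-up read_terminal_output call can confirm that the process actually stopped.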
Windsurf: Add the iterm-mcp server to the mcpServers section of your Windsurf configuration:
{
  "mcpServers": {
    "iterm-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "iterm-mcp"
      ]
    }
  }
}
Securing API keys:
If the server requires environment variables or secrets, add them as follows:
{
  "mcpServers": {
    "iterm-mcp": {
      "command": "npx",
      "args": ["-y", "iterm-mcp"],
      "env": {
        "MY_SECRET_KEY": "value"
      }
    }
  }
}
Claude Desktop: Edit your Claude Desktop configuration file, located at ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%/Claude/claude_desktop_config.json (Windows), and add the server to the mcpServers section:
{
  "mcpServers": {
    "iterm-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "iterm-mcp"
      ]
    }
  }
}
Securing API keys:
Add secrets under the env object as needed.
Cursor: Add the iterm-mcp server to the mcpServers section of your Cursor MCP configuration:
{
  "mcpServers": {
    "iterm-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "iterm-mcp"
      ]
    }
  }
}
Securing API keys:
Add secrets via the env attribute.
Cline: Add the iterm-mcp server to the mcpServers section of your Cline MCP settings:
{
  "mcpServers": {
    "iterm-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "iterm-mcp"
      ]
    }
  }
}
Securing API keys:
Configure secrets as environment variables in the config, e.g.:
{
  "mcpServers": {
    "iterm-mcp": {
      "command": "npx",
      "args": ["-y", "iterm-mcp"],
      "env": {
        "MY_SECRET_KEY": "value"
      }
    }
  }
}
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:
Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "iterm-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “iterm-mcp” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
Section | Availability | Details/Notes |
---|---|---|
Overview | ✅ | |
List of Prompts | ⛔ | No prompt templates documented |
List of Resources | ⛔ | No explicit resources documented |
List of Tools | ✅ | write_to_terminal, read_terminal_output, send_control_character |
Securing API Keys | ✅ | Configuration examples with env documented |
Sampling Support (less important in evaluation) | ⛔ | No sampling support mentioned |
Based on the available information, iterm-mcp offers robust terminal integration and tool exposure, with clear setup instructions and security guidance, but lacks documented prompt templates, explicit resources, and advanced MCP features like roots and sampling. This makes it suitable for terminal-centric workflows, but less feature-rich for broader MCP contexts.
Has a LICENSE | ✅ (MIT) |
---|---|
Has at least one tool | ✅ |
Number of Forks | 32 |
Number of Stars | 360 |
iterm-mcp is a Model Context Protocol server that allows AI assistants to directly access and interact with your iTerm terminal session. It enables command execution, REPL automation, terminal output inspection, and process management through secure, streamlined integration.
iterm-mcp exposes tools such as write_to_terminal (run shell commands), read_terminal_output (fetch recent terminal output), and send_control_character (send signals like Ctrl+C or Ctrl+Z for process management).
You can integrate iterm-mcp with Windsurf, Claude Desktop, Cursor, and Cline. Each platform requires a simple configuration update to add the MCP server.
Add secrets or environment variables under the `env` object in your MCP server configuration. This way, sensitive information remains protected during runtime.
iterm-mcp is ideal for REPL automation, CLI workflow automation, terminal output inspection, process management, and code execution/testing—all through AI-driven terminal access.
Supercharge your CLI workflows and automate REPL sessions by integrating iterm-mcp with FlowHunt. Boost productivity with seamless AI-powered terminal access.