
MCP Code Executor MCP Server

Run Python code, install dependencies, and manage isolated environments directly inside your FlowHunt flows with the MCP Code Executor MCP Server.
FlowHunt provides an additional security layer between your internal systems and AI tools, giving you granular control over which tools are accessible from your MCP servers. MCP servers hosted in our infrastructure can be seamlessly integrated with FlowHunt's chatbot as well as popular AI platforms like ChatGPT, Claude, and various AI editors.
The MCP Code Executor is an MCP (Model Context Protocol) server that enables large language models (LLMs) to execute Python code within a designated Python environment, such as Conda, virtualenv, or UV virtualenv. By connecting AI assistants to real, executable Python environments, it empowers them to perform development tasks that require code execution, library management, and dynamic environment setup. The server supports incremental code generation to work around token limits, allows on-the-fly installation of dependencies, and lets the execution environment be configured at runtime. Developers can use it to automate code evaluation, experiment with new packages, and manage computation within a controlled, isolated environment.
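To illustrate how the `ENV_TYPE` setting could select the interpreter that runs stored code, here is a hypothetical sketch; the function name, the `VENV_PATH` key, and the exact flags are illustrative, not the server's actual implementation:

```python
# Hypothetical sketch of how ENV_TYPE might pick the interpreter used to
# run a stored code file; the real server's logic may differ.
import os

def build_python_command(env: dict, code_file: str) -> list[str]:
    env_type = env.get("ENV_TYPE", "venv")
    if env_type == "conda":
        # Run inside a named conda environment.
        return ["conda", "run", "-n", env["CONDA_ENV_NAME"], "python", code_file]
    if env_type in ("venv", "venv-uv"):
        # Use the virtualenv's own interpreter directly (VENV_PATH is assumed).
        return [os.path.join(env["VENV_PATH"], "bin", "python"), code_file]
    raise ValueError(f"unsupported ENV_TYPE: {env_type}")

cmd = build_python_command(
    {"ENV_TYPE": "conda", "CONDA_ENV_NAME": "your-conda-env"},
    "/path/to/code/storage/snippet.py",
)
```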
No explicit prompt templates are listed in the repository or documentation.
No specific resources are described in the repository or documentation.
```json
{
  "mcpServers": {
    "mcp-code-executor": {
      "command": "node",
      "args": [
        "/path/to/mcp_code_executor/build/index.js"
      ],
      "env": {
        "CODE_STORAGE_DIR": "/path/to/code/storage",
        "ENV_TYPE": "conda",
        "CONDA_ENV_NAME": "your-conda-env"
      }
    }
  }
}
```
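Before wiring this entry into a client, it can help to sanity-check the `env` block. The sketch below mirrors the key names in the example above; the rule that conda-type configs must carry `CONDA_ENV_NAME` is an assumption based on that example:

```python
# Sketch: validate an mcp-code-executor env block before use.
# Key names mirror the example above; the validation rules are assumptions.
def validate_env(env: dict) -> list[str]:
    problems = []
    if "CODE_STORAGE_DIR" not in env:
        problems.append("missing CODE_STORAGE_DIR")
    if env.get("ENV_TYPE") == "conda" and "CONDA_ENV_NAME" not in env:
        problems.append("ENV_TYPE=conda requires CONDA_ENV_NAME")
    return problems

errors = validate_env({"ENV_TYPE": "conda"})
```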
```json
{
  "mcpServers": {
    "mcp-code-executor": {
      "env": {
        "CODE_STORAGE_DIR": "/path/to/code/storage",
        "ENV_TYPE": "conda",
        "CONDA_ENV_NAME": "your-conda-env",
        "MY_SECRET_API_KEY": "${MY_SECRET_API_KEY}"
      },
      "inputs": {
        "apiKey": "${MY_SECRET_API_KEY}"
      }
    }
  }
}
```
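Clients that support this pattern substitute `${VAR}` placeholders from the host environment before launching the server; the substitution step looks roughly like this (a sketch of the mechanism only, not any particular client's code):

```python
# Sketch: expand ${VAR} placeholders in an env block from the host
# environment, as clients that support this pattern typically do.
import re

def expand_placeholders(env: dict, host_env: dict) -> dict:
    def expand(value: str) -> str:
        return re.sub(r"\$\{(\w+)\}", lambda m: host_env.get(m.group(1), ""), value)
    return {k: expand(v) for k, v in env.items()}

resolved = expand_placeholders(
    {"MY_SECRET_API_KEY": "${MY_SECRET_API_KEY}", "ENV_TYPE": "conda"},
    {"MY_SECRET_API_KEY": "s3cret"},
)
```

This keeps the secret out of the config file itself; only the resolved value reaches the server process.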
Note: You can also use Docker. The provided Dockerfile is tested for the venv-uv environment type:
```json
{
  "mcpServers": {
    "mcp-code-executor": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "mcp-code-executor"
      ]
    }
  }
}
```
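Since the Docker variant speaks MCP over stdio, a client launches the container and pipes JSON-RPC messages through its stdin/stdout. A minimal sketch of how a client could derive the subprocess invocation from the config entry above (it does not actually start the container):

```python
# Sketch: derive the subprocess invocation from an mcpServers entry.
# Uses the Docker config shown above; does not start the container.
import json

config = json.loads("""
{
  "mcpServers": {
    "mcp-code-executor": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "mcp-code-executor"]
    }
  }
}
""")

def build_argv(server_name: str, cfg: dict) -> list[str]:
    entry = cfg["mcpServers"][server_name]
    return [entry["command"], *entry.get("args", [])]

argv = build_argv("mcp-code-executor", config)
# A real client would now run e.g. subprocess.Popen(argv, stdin=PIPE, stdout=PIPE)
# and exchange JSON-RPC messages over those pipes.
```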
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "mcp-code-executor": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change "mcp-code-executor" to the actual name of your MCP server and replace the URL with your own MCP server URL.
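Under the streamable_http transport, the agent ultimately exchanges JSON-RPC 2.0 messages with that URL. A tools/call request invoking the execute_code tool would look roughly like this (the `"code"` argument name is an assumption about this server's tool schema):

```python
# Sketch: the JSON-RPC 2.0 payload a streamable_http client would POST to
# the MCP server URL to invoke execute_code, after the initialize handshake.
# The "code" argument name is an assumption about this server's tool schema.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "execute_code",
        "arguments": {"code": "print('hello from the sandbox')"},
    },
}
body = json.dumps(request)
# A client POSTs `body` with Content-Type: application/json to the configured URL.
```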
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | No explicit resources described |
| List of Tools | ✅ | execute_code, install_dependencies, check_installed_packages |
| Securing API Keys | ✅ | Example provided with env inputs |
| Sampling Support (less important in evaluation) | ⛔ | Not specified |
This MCP server provides essential and robust functionality for code execution with LLM integration, along with clear setup instructions and tooling. However, it lacks prompt templates, explicit resources, and information on roots or sampling support. For a code-execution-focused MCP, it is very solid, scoring high for practical utility and ease of integration, but loses some points for missing advanced MCP features and documentation completeness.
| Feature | Status |
|---|---|
| Has a LICENSE | ✅ (MIT) |
| Has at least one tool | ✅ |
| Number of Forks | 25 |
| Number of Stars | 144 |
Empower your flows with secure, automated Python code execution. Integrate the MCP Code Executor MCP Server and unlock dynamic workflows for data science, automation, and more.
