Nocodb MCP Server

FlowHunt’s Nocodb MCP Server enables AI agents and LLMs to securely connect and manage Nocodb databases, automating CRUD, schema, and bulk data operations in your workflows.


What does the “Nocodb” MCP Server do?

The Nocodb MCP Server acts as a bridge between AI assistants and Nocodb databases using the Model Context Protocol (MCP). This server enables AI-powered clients to perform seamless CRUD (Create, Read, Update, Delete) operations on Nocodb tables, facilitating data management workflows. By exposing database functionalities through the MCP interface, it allows LLMs and AI agents to query, create, update, and delete records or columns, and even upload files to create tables. This integration enhances developer productivity by automating and standardizing database interactions, making it easier to build, test, and deploy database-centric AI applications and workflows.
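To make the MCP interface concrete, here is a hedged sketch of the JSON-RPC 2.0 `tools/call` message shape an MCP client sends when invoking a server tool. The tool name `nocodb_get_records` and its arguments are illustrative placeholders, not the server's documented tool names.

```python
# Sketch of an MCP "tools/call" request as a JSON-RPC 2.0 message.
# The tool name and arguments below are hypothetical examples.
import json

def make_tool_call(call_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize an MCP tool invocation as a JSON-RPC 2.0 request."""
    message = {
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(message)

payload = make_tool_call(1, "nocodb_get_records", {"table": "customers", "limit": 10})
```

The AI client never needs to know Nocodb's REST API; it only emits messages of this shape, and the MCP server translates them into database calls.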

List of Prompts

  • Get Records: Retrieve data from a specified Nocodb table.
  • Create Record: Add new rows with specified values to a table.
  • Update Record: Update existing records, such as modifying values or removing suffixes.
  • Delete Record: Remove records based on criteria like matching names.
  • Add Column: Add new columns to an existing table.
  • Update Column Values: Set column values for all rows.
  • Delete Column: Remove columns from a table.
  • Create Table from File: Create a new table using data from a JSON file.
  • Bulk Create Records: Add multiple new records in one operation.
  • Bulk Delete Records: Remove multiple records at once.

List of Resources

  • Nocodb Tables: Access to all tables within the connected Nocodb database, allowing data to be read and used as context.
  • Table Schema: Metadata about the structure of each table, such as column names and data types.
  • Uploaded Files: JSON files (e.g., example_upload.json) that can be processed to create or update tables.
  • Bulk Sample Data: Example bulk data and screenshots provided in the docs/sample-bulk directory for demo and context.

List of Tools

  • CRUD Operations: Tools for Create, Read, Update, and Delete functionalities on tables and records (as described in prompt templates).
  • Upload File: Tool to process and upload JSON files to create tables in Nocodb.
  • Bulk Operations: Tools for bulk creating and deleting records in tables.
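Under the hood, tools like these wrap Nocodb's REST API. The sketch below shows roughly what a "Get Records" call amounts to; the base URL, table ID, and token are placeholders, and the exact endpoint shape should be checked against your Nocodb version's API docs.

```python
# Hedged sketch of the Nocodb v2 REST request a "Get Records" tool might build.
# Base URL, table ID, and API key here are placeholders.
from urllib.parse import urlencode

def build_get_records_request(base_url: str, table_id: str, api_key: str,
                              limit: int = 25, where: str = ""):
    """Return (url, headers) for a Nocodb v2 'list records' call."""
    params = {"limit": limit}
    if where:
        params["where"] = where  # Nocodb filter syntax, e.g. "(Name,eq,Alice)"
    url = f"{base_url}/api/v2/tables/{table_id}/records?{urlencode(params)}"
    headers = {"xc-token": api_key}  # Nocodb reads the API key from xc-token
    return url, headers

url, headers = build_get_records_request(
    "https://nocodb.example.com", "tbl_demo", "your-nocodb-key",
    where="(Name,eq,Alice)")
```

Pass the resulting `url` and `headers` to any HTTP client; the filter string is URL-encoded automatically by `urlencode`.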

Use Cases of this MCP Server

  • Database Management: Automate CRUD operations on Nocodb tables, streamlining data entry, modification, and cleanup tasks for developers.
  • Data Migration: Upload and process JSON files to quickly migrate or seed data into Nocodb databases.
  • Schema Evolution: Add or remove columns programmatically, supporting evolving application data models.
  • Bulk Data Operations: Efficiently handle large-scale record creation or deletion, useful for batch processing or automated testing scenarios.
  • AI-Powered Dashboards: Enable AI agents to fetch and manipulate data for real-time reporting, analytics, or dashboard integrations.

How to set it up

Windsurf

  1. Ensure prerequisites like Node.js and Nocodb are installed.
  2. Locate the Windsurf configuration file (e.g., settings.json).
  3. Add the Nocodb MCP Server using the following JSON snippet:
    {
      "mcpServers": {
        "nocodb-mcp": {
          "command": "npx",
          "args": ["@edwinbernadus/nocodb-mcp-server@latest"]
        }
      }
    }
    
  4. Save the configuration and restart Windsurf.
  5. Verify the server is running and accessible by testing a sample database operation.

Claude

  1. Install Node.js and ensure access to the Nocodb instance.
  2. Edit the Claude platform’s MCP configuration file.
  3. Add the server with:
    {
      "mcpServers": {
        "nocodb-mcp": {
          "command": "npx",
          "args": ["@edwinbernadus/nocodb-mcp-server@latest"]
        }
      }
    }
    
  4. Restart Claude and check server connectivity.
  5. Confirm with a test prompt.

Cursor

  1. Prepare your environment with Node.js and Nocodb credentials.
  2. Open Cursor’s settings or MCP integration panel.
  3. Insert the following configuration:
    {
      "mcpServers": {
        "nocodb-mcp": {
          "command": "npx",
          "args": ["@edwinbernadus/nocodb-mcp-server@latest"]
        }
      }
    }
    
  4. Save and restart Cursor.
  5. Validate the server by running a CRUD operation.

Cline

  1. Set up Node.js and ensure Nocodb is available.
  2. Edit the Cline configuration for MCP servers.
  3. Add the server as shown:
    {
      "mcpServers": {
        "nocodb-mcp": {
          "command": "npx",
          "args": ["@edwinbernadus/nocodb-mcp-server@latest"]
        }
      }
    }
    
  4. Save changes and restart Cline.
  5. Test the setup by connecting to a Nocodb table.

Securing API Keys

Store API keys using environment variables for security. Example:

{
  "env": {
    "NOCODB_API_KEY": "your-nocodb-key"
  },
  "inputs": {
    "api_key": "${NOCODB_API_KEY}"
  }
}
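On the application side, a small loader that reads the key from the environment keeps it out of config files and source control. This is a minimal sketch; the variable name matches the snippet above.

```python
# Minimal sketch: load the Nocodb API key from the environment at startup
# so it never appears in config files or source code.
import os

def load_nocodb_key() -> str:
    key = os.environ.get("NOCODB_API_KEY")
    if not key:
        raise RuntimeError("NOCODB_API_KEY is not set")
    return key
```

Failing fast with a clear error when the variable is missing is preferable to letting an empty key produce opaque 401 responses later.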

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

FlowHunt MCP flow

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "nocodb-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP as a tool, with access to all of its functions and capabilities. Remember to change “nocodb-mcp” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
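In practice, the `streamable_http` transport amounts to POSTing JSON-RPC messages to the configured URL. The sketch below only builds such a request without sending it; the URL is the placeholder from the config above, and the headers shown are typical rather than guaranteed for every server.

```python
# Hedged sketch of a streamable HTTP MCP request: a JSON-RPC message POSTed
# to the configured server URL. Nothing is sent here; we only build it.
import json

def build_mcp_http_request(url: str, message: dict):
    headers = {
        "Content-Type": "application/json",
        # Streamable HTTP servers may answer with plain JSON or an SSE stream.
        "Accept": "application/json, text/event-stream",
    }
    return url, headers, json.dumps(message)

url, headers, body = build_mcp_http_request(
    "https://yourmcpserver.example/pathtothemcp/url",
    {"jsonrpc": "2.0", "id": 1, "method": "tools/list"},
)
```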


Overview

| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Full description and capabilities in README.md |
| List of Prompts | ✅ | Prompt templates listed in README.md |
| List of Resources | ✅ | Tables, schemas, files; described in README.md/example_upload.json |
| List of Tools | ✅ | CRUD, bulk, and upload tools outlined in README and API_FUNCTION.md |
| Securing API Keys | ✅ | env.example and setup instructions |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |

Based on the available documentation and structure, the Nocodb MCP Server offers solid MCP integration, a clear set of prompt templates, resources, and setup instructions. However, there is no explicit documentation for Roots or Sampling support, which might limit its versatility in advanced scenarios. Overall, it is a practical and well-documented MCP server for database workflows.


MCP Score

  • Has a LICENSE: ✅
  • Has at least one tool: ✅
  • Number of Forks: 7
  • Number of Stars: 24

Frequently asked questions

What is the Nocodb MCP Server?

The Nocodb MCP Server allows AI assistants and LLMs to perform automated CRUD operations, schema changes, and file-based table creation on Nocodb databases through the Model Context Protocol. This makes database interactions seamless and programmable within AI workflows.

Which operations are supported by this server?

Supported operations include retrieving records, creating new records, updating or deleting records, adding or removing columns, bulk record management, and creating tables from uploaded files.

What are the main use cases?

Use cases include automating database management, migrating or seeding data via JSON uploads, evolving schemas programmatically, handling bulk data operations, and powering AI-driven dashboards or reporting tools with real-time data access.

How do I secure my Nocodb API key?

Store your Nocodb API key in environment variables and reference it in your server configuration, for example: { "env": { "NOCODB_API_KEY": "your-nocodb-key" }, "inputs": { "api_key": "${NOCODB_API_KEY}" } }

How do I integrate the Nocodb MCP server into a FlowHunt flow?

Add the MCP component to your flow, open its configuration panel, and provide the Nocodb MCP server details in the system configuration. Your AI agent can then use all the server’s capabilities as tools within your workflow.

Automate Your Database Workflows with Nocodb MCP

Connect your AI agents to Nocodb for effortless CRUD operations, schema evolution, and bulk data tasks. Streamline development and empower your flows with robust database access.

Learn more