ShaderToy MCP Server


FlowHunt provides an additional security layer between your internal systems and AI tools, giving you granular control over which tools are accessible from your MCP servers. MCP servers hosted in our infrastructure can be seamlessly integrated with FlowHunt's chatbot as well as popular AI platforms like ChatGPT, Claude, and various AI editors.

What does “ShaderToy” MCP Server do?

ShaderToy-MCP is an MCP (Model Context Protocol) Server designed to bridge AI assistants with ShaderToy, a popular website for creating, running, and sharing GLSL shaders. By connecting LLMs (Large Language Models) like Claude to ShaderToy via MCP, this server allows the AI to query and read entire ShaderToy web pages, enabling it to generate and refine complex shaders beyond its standalone capabilities. This integration enhances the development workflow for shader artists and AI developers by providing seamless access to ShaderToy’s content, facilitating more sophisticated shader creation, exploration, and sharing.
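To make the idea concrete, here is a minimal, hypothetical sketch of the kind of lookup such a server performs. The repository does not document its actual tool names or implementation, so the helper names (`shader_url`, `fetch_shader_page`) and the URL scheme shown are illustrative assumptions based on ShaderToy's public page layout:

```python
# Illustrative only: ShaderToy-MCP's real tool names are undocumented.
SHADERTOY_BASE = "https://www.shadertoy.com"

def shader_url(shader_id: str) -> str:
    """Build the public page URL for a ShaderToy shader ID (e.g. "MdX3Rr")."""
    return f"{SHADERTOY_BASE}/view/{shader_id}"

def fetch_shader_page(shader_id: str) -> str:
    """Fetch the shader's HTML page -- the raw text an MCP tool could hand to the LLM."""
    import urllib.request
    with urllib.request.urlopen(shader_url(shader_id)) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

An MCP tool wrapping `fetch_shader_page` would let the model read a shader's source and comments as context before generating or refining its own GLSL.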

List of Prompts

No information regarding prompt templates is provided in the repository.


List of Resources

No explicit resource definitions found in the available files or documentation.

List of Tools

No explicit tool list or server.py file with details on MCP tools is present in the repository.

Use Cases of this MCP Server

  • Shader Generation: Enables AI assistants to generate complex GLSL shaders by querying ShaderToy’s repository and using web context as inspiration or reference.
  • Shader Exploration: Allows users to explore and analyze ShaderToy shaders more efficiently with AI-powered summarization and explanation.
  • Creative Coding Assistance: AI can assist users in debugging or extending shader code by accessing ShaderToy examples and documentation through MCP.
  • Showcasing AI-Created Shaders: Facilitates the sharing of AI-generated shaders directly to ShaderToy, closing the loop between AI creation and community sharing.

How to set it up

Windsurf

  1. Ensure Node.js and Windsurf are installed.
  2. Locate your .windsurf/config.json configuration file.
  3. Add the ShaderToy MCP Server using the following JSON snippet:
    {
      "mcpServers": {
        "shadertoy": {
          "command": "npx",
          "args": ["@shadertoy/mcp-server@latest"]
        }
      }
    }
    
  4. Save the file and restart Windsurf.
  5. Verify the setup in Windsurf’s interface.

Claude

  1. Ensure Claude and Node.js are installed.
  2. Edit Claude’s configuration file (claude_desktop_config.json).
  3. Insert the ShaderToy MCP Server configuration:
    {
      "mcpServers": {
        "shadertoy": {
          "command": "npx",
          "args": ["@shadertoy/mcp-server@latest"]
        }
      }
    }
    
  4. Save the configuration and restart Claude.
  5. Confirm the server is available in Claude’s interface.

Cursor

  1. Install Node.js and Cursor.
  2. Find cursor.config.json in your user directory.
  3. Add this snippet:
    {
      "mcpServers": {
        "shadertoy": {
          "command": "npx",
          "args": ["@shadertoy/mcp-server@latest"]
        }
      }
    }
    
  4. Save and restart Cursor.
  5. Ensure ShaderToy MCP Server appears in the servers list.

Cline

  1. Install Node.js and Cline.
  2. Open the .cline/config.json file.
  3. Add the ShaderToy MCP Server:
    {
      "mcpServers": {
        "shadertoy": {
          "command": "npx",
          "args": ["@shadertoy/mcp-server@latest"]
        }
      }
    }
    
  4. Save and restart Cline.
  5. Verify the server is running via Cline’s diagnostics.
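All four clients accept the same `mcpServers` JSON shape, so a quick sanity check of the snippet before restarting the editor can save a debugging round trip. The following validator is a small illustrative sketch (not part of the server) that catches the most common mistakes, such as invalid JSON or a missing `command` field:

```python
import json

def validate_mcp_config(text: str) -> list[str]:
    """Return a list of problems found in an mcpServers config snippet."""
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    servers = cfg.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        return ["missing or empty 'mcpServers' object"]
    problems = []
    for name, spec in servers.items():
        if not isinstance(spec, dict):
            problems.append(f"{name}: entry must be an object")
            continue
        if "command" not in spec:
            problems.append(f"{name}: missing 'command'")
        if not isinstance(spec.get("args", []), list):
            problems.append(f"{name}: 'args' must be a list")
    return problems

snippet = '{"mcpServers": {"shadertoy": {"command": "npx", "args": ["@shadertoy/mcp-server@latest"]}}}'
print(validate_mcp_config(snippet))  # prints: []
```

An empty list means the snippet is structurally sound and safe to paste into any of the client configs above.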

Securing API Keys (Example)

{
  "mcpServers": {
    "shadertoy": {
      "command": "npx",
      "args": ["@shadertoy/mcp-server@latest"],
      "env": {
        "SHADERTOY_API_KEY": "${SHADERTOY_API_KEY}"
      },
      "inputs": {
        "apiKey": "${SHADERTOY_API_KEY}"
      }
    }
  }
}

Note: Store your API keys in environment variables for security.
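For example, on macOS or Linux you might export the variable in your shell before launching the client. The variable name is taken from the snippet above; whether the server actually consumes a `SHADERTOY_API_KEY` is an assumption:

```shell
# Set the key for the current shell session; add the export line to
# ~/.bashrc or ~/.zshrc to make it persist across sessions.
export SHADERTOY_API_KEY="your-key-here"

# Confirm the variable is visible to child processes such as npx.
printenv SHADERTOY_API_KEY
```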

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

FlowHunt MCP flow

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "shadertoy": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “shadertoy” to the actual name of your MCP server and to replace the URL with your own MCP server’s URL.


Overview

| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Overview found in README.md |
| List of Prompts | ❌ | No details on prompt templates |
| List of Resources | ❌ | No explicit MCP resource definitions found |
| List of Tools | ❌ | No explicit tool listing or server.py in repo |
| Securing API Keys | ✅ | Example provided in setup instructions |
| Sampling Support (less important in evaluation) | ❌ | No mention of sampling support |

Based on the above, ShaderToy-MCP provides a clear overview and setup guidance, but lacks documentation on prompt templates, tools, and resources. Its primary value is connecting LLMs to ShaderToy, but it would benefit from extended documentation and explicit MCP feature support. I would rate this MCP server a 4/10 for general MCP utility and documentation.

MCP Score

| Criterion | Value |
| --- | --- |
| Has a LICENSE | ✅ (MIT) |
| Has at least one tool | ❌ |
| Number of Forks | 3 |
| Number of Stars | 21 |
