What does “ShaderToy” MCP Server do?
ShaderToy-MCP is an MCP (Model Context Protocol) Server designed to bridge AI assistants with ShaderToy, a popular website for creating, running, and sharing GLSL shaders. By connecting LLMs (Large Language Models) like Claude to ShaderToy via MCP, this server allows the AI to query and read entire ShaderToy web pages, enabling it to generate and refine complex shaders beyond its standalone capabilities. This integration enhances the development workflow for shader artists and AI developers by providing seamless access to ShaderToy’s content, facilitating more sophisticated shader creation, exploration, and sharing.
List of Prompts
No information regarding prompt templates is provided in the repository.
List of Resources
No explicit resource definitions found in the available files or documentation.
List of Tools
No explicit tool list or server.py file is present in the repository with details on MCP tools.
Use Cases of this MCP Server
- Shader Generation: Enables AI assistants to generate complex GLSL shaders by querying ShaderToy’s repository and using web context as inspiration or reference.
- Shader Exploration: Allows users to explore and analyze ShaderToy shaders more efficiently with AI-powered summarization and explanation.
- Creative Coding Assistance: AI can assist users in debugging or extending shader code by accessing ShaderToy examples and documentation through MCP.
- Showcasing AI-Created Shaders: Facilitates the sharing of AI-generated shaders directly to ShaderToy, closing the loop between AI creation and community sharing.
How to set it up
Windsurf
- Ensure Node.js and Windsurf are installed.
- Locate your .windsurf/config.json configuration file.
- Add the ShaderToy MCP Server using the following JSON snippet:
{
  "mcpServers": {
    "shadertoy": {
      "command": "npx",
      "args": ["@shadertoy/mcp-server@latest"]
    }
  }
}
- Save the file and restart Windsurf.
- Verify the setup in Windsurf’s interface.
Claude
- Ensure Claude and Node.js are installed.
- Edit Claude’s config.json settings.
- Insert the ShaderToy MCP Server configuration:
{
  "mcpServers": {
    "shadertoy": {
      "command": "npx",
      "args": ["@shadertoy/mcp-server@latest"]
    }
  }
}
- Save the configuration and restart Claude.
- Confirm the server is available in Claude’s interface.
Cursor
- Install Node.js and Cursor.
- Find cursor.config.json in your user directory.
- Add this snippet:
{
  "mcpServers": {
    "shadertoy": {
      "command": "npx",
      "args": ["@shadertoy/mcp-server@latest"]
    }
  }
}
- Save and restart Cursor.
- Ensure ShaderToy MCP Server appears in the servers list.
Cline
- Install Node.js and Cline.
- Open the .cline/config.json file.
- Add the ShaderToy MCP Server:
{
  "mcpServers": {
    "shadertoy": {
      "command": "npx",
      "args": ["@shadertoy/mcp-server@latest"]
    }
  }
}
- Save and restart Cline.
- Verify the server is running via Cline’s diagnostics.
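If you want to sanity-check the server outside of a particular editor, a small script using the official MCP TypeScript SDK can launch it over stdio and list whatever tools it exposes. This is a minimal sketch, assuming the @shadertoy/mcp-server package shown in the snippets above resolves on npm and speaks stdio; it is not part of the project’s documented workflow.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the ShaderToy MCP server the same way the editor configs above do (package name assumed).
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["@shadertoy/mcp-server@latest"],
  });

  const client = new Client(
    { name: "shadertoy-smoke-test", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Ask the server what tools it advertises; the repository does not document them,
  // so treat whatever comes back as the source of truth.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);

If the server starts correctly, the script prints the names of the tools it registers, which is also a quick way to fill the documentation gap noted in the “List of Tools” section above.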
Securing API Keys (Example)
{
"mcpServers": {
"shadertoy": {
"command": "npx",
"args": ["@shadertoy/mcp-server@latest"],
"env": {
"SHADERTOY_API_KEY": "${SHADERTOY_API_KEY}"
},
"inputs": {
"apiKey": "${SHADERTOY_API_KEY}"
}
}
}
}
Note: Store your API keys in environment variables for security.
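If you launch the server programmatically rather than from an editor config, the same principle applies: read the key from your environment and forward it to the child process. A brief sketch extending the stdio example above; SHADERTOY_API_KEY is the variable name used in the config sample, and whether the server actually consumes it is an assumption.

import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Forward the API key from the parent environment instead of hard-coding it.
// Note: when an env object is provided, the SDK passes only these variables
// (plus its defaults) to the child process.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["@shadertoy/mcp-server@latest"],
  env: { SHADERTOY_API_KEY: process.env.SHADERTOY_API_KEY ?? "" },
});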
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
"shadertoy": {
"transport": "streamable_http",
"url": "https://yourmcpserver.example/pathtothemcp/url"
}
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “shadertoy” to the actual name of your MCP server and replace the example URL with your own MCP server’s URL.
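For reference, the same remote endpoint can be exercised from a script using the SDK’s Streamable HTTP transport. This is a minimal connectivity check, assuming the placeholder URL above has been replaced with your actual MCP server URL.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  // Connect to the remote MCP endpoint configured in FlowHunt (placeholder URL).
  const transport = new StreamableHTTPClientTransport(
    new URL("https://yourmcpserver.example/pathtothemcp/url")
  );

  const client = new Client(
    { name: "flowhunt-connectivity-check", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // List tools to confirm the endpoint is reachable and speaking MCP.
  const { tools } = await client.listTools();
  console.log(`Connected; server exposes ${tools.length} tool(s).`);

  await client.close();
}

main().catch(console.error);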
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Overview found in README.md |
| List of Prompts | ⛔ | No details on prompt templates |
| List of Resources | ⛔ | No explicit MCP resource definitions found |
| List of Tools | ⛔ | No explicit tool listing or server.py in repo |
| Securing API Keys | ✅ | Example provided in setup instructions |
| Sampling Support (less important in evaluation) | ⛔ | No mention of sampling support |
Based on the above, ShaderToy-MCP provides a clear overview and setup guidance, but lacks documentation on prompt templates, tools, and resources. Its primary value is connecting LLMs to ShaderToy, but it would benefit from extended documentation and explicit MCP feature support. I would rate this MCP server a 4/10 for general MCP utility and documentation.
MCP Score
| Has a LICENSE | ✅ (MIT) |
| --- | --- |
| Has at least one tool | ⛔ |
| Number of Forks | 3 |
| Number of Stars | 21 |
Frequently asked questions
- What is the ShaderToy MCP Server?
ShaderToy MCP Server is a bridge between AI assistants and ShaderToy, enabling the AI to query, generate, and share GLSL shaders by accessing ShaderToy’s content and community through the Model Context Protocol.
- Which use cases does this MCP server support?
It supports AI-driven shader generation, exploration, creative coding assistance, and sharing of AI-created shaders to ShaderToy, enhancing workflows for shader artists and developers.
- Is there support for prompt templates or explicit tools?
No, the current documentation does not include prompt templates or explicit MCP tool/resource definitions.
- How do I secure my API keys?
Store your ShaderToy API keys in environment variables and reference them in your MCP server configuration to keep them secure and out of your codebase.
- What is the overall documentation and MCP utility score?
ShaderToy MCP Server has a well-documented setup but lacks prompt, tool, and resource documentation. It scores 4/10 for general MCP utility and documentation.
Connect FlowHunt to ShaderToy with MCP
Supercharge your AI workflows for shader creation, exploration, and sharing by integrating the ShaderToy MCP Server into FlowHunt.