Cursor Talk To Figma MCP Server
Automate, analyze, and modify Figma files programmatically with the Cursor Talk To Figma MCP Server—making design automation accessible to AI agents and developers.

What does “Cursor Talk To Figma” MCP Server do?
The Cursor Talk To Figma MCP Server provides a bridge between the Cursor AI development environment and Figma, enabling seamless interaction between AI assistants and design files. By exposing Figma’s design data and actions through the Model Context Protocol (MCP), this server allows developers and AI agents to read, analyze, and modify Figma designs programmatically. This integration streamlines workflows for designers and developers by automating repetitive design tasks, enabling bulk content replacement, propagating component overrides, and offering other automation capabilities directly from AI-powered tools. The server enhances productivity and collaboration by making Figma’s features accessible via standardized MCP endpoints.
List of Prompts
No prompt templates are explicitly listed in the repository or documentation.
List of Resources
No explicit list of MCP resources is provided in the repository or documentation.
List of Tools
No explicit list of MCP tools is included in the repository or server files as presented.
Use Cases of this MCP Server
- Bulk Text Content Replacement: Automate the replacement of text content across multiple Figma designs, reducing manual edits and saving significant time for design teams.
- Instance Override Propagation: Automatically propagate component instance overrides from one source to multiple targets, simplifying repetitive updates in large design systems.
- Design Automation: Enable AI-driven automation of various Figma tasks, such as updating styles, modifying layouts, or generating new design elements, directly from development environments.
- Integrating Figma with AI Agents: Allow AI agents in Cursor to read from and write to Figma files, enabling advanced design analysis, critique, or rapid prototyping.
- Collaborative Development and Design: Bridge the gap between development and design teams by allowing programmatic access to Figma designs from code, fostering tighter integration and faster feedback loops.
How to set it up
Windsurf
- Ensure you have Bun installed: `curl -fsSL https://bun.sh/install | bash`.
- Clone the repository and run `bun setup` to install dependencies.
- Start the WebSocket server: `bun socket`.
- Add the MCP server to your Windsurf configuration:
{
  "mcpServers": {
    "cursor-talk-to-figma": {
      "command": "bunx",
      "args": ["cursor-talk-to-figma-mcp"]
    }
  }
}
- Save the configuration and restart Windsurf, then verify the connection to the server.
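The same configuration entry can be generated programmatically instead of hand-edited, which avoids JSON typos. A minimal Python sketch; the `make_mcp_entry` helper is illustrative and not part of the project:

```python
import json

def make_mcp_entry(name: str = "cursor-talk-to-figma") -> dict:
    # Build the mcpServers entry shown above; "bunx" runs the
    # published package without needing a local checkout on PATH.
    return {
        "mcpServers": {
            name: {
                "command": "bunx",
                "args": ["cursor-talk-to-figma-mcp"],
            }
        }
    }

# Emit the JSON ready to paste into the client configuration file.
print(json.dumps(make_mcp_entry(), indent=2))
```

The identical entry works for the Claude, Cursor, and Cline sections below, since all four clients accept the same `mcpServers` shape.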
Securing API Keys:
{
  "mcpServers": {
    "cursor-talk-to-figma": {
      "command": "bunx",
      "args": ["cursor-talk-to-figma-mcp"],
      "env": {
        "FIGMA_API_KEY": "${env.FIGMA_API_KEY}"
      },
      "inputs": {
        "apiKey": "${env.FIGMA_API_KEY}"
      }
    }
  }
}
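The `${env.FIGMA_API_KEY}` placeholder keeps the secret out of the config file; the client resolves it from the environment at load time. A hedged sketch of that substitution pattern; the `resolve_env` helper is illustrative, not part of the server:

```python
import os
import re

def resolve_env(value: str) -> str:
    # Replace ${env.NAME} placeholders with the value of the
    # corresponding environment variable; an unset variable raises
    # rather than silently passing an empty key to the server.
    def repl(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]
    return re.sub(r"\$\{env\.([A-Za-z0-9_]+)\}", repl, value)

os.environ.setdefault("FIGMA_API_KEY", "figd_example")  # demo value only
print(resolve_env("${env.FIGMA_API_KEY}"))
```

Failing fast on a missing variable is deliberate: a blank API key produces confusing authorization errors much later, inside Figma calls.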
Claude
- Install prerequisites (Bun).
- Run `bun setup` and `bun socket` as above.
- Add the MCP server to your Claude configuration file:
{
  "mcpServers": {
    "cursor-talk-to-figma": {
      "command": "bunx",
      "args": ["cursor-talk-to-figma-mcp"]
    }
  }
}
- Save and restart Claude.
Securing API Keys: (see example above)
Cursor
- Install Bun and run `bun setup`.
- Start the WebSocket server: `bun socket`.
- Add the following to your Cursor configuration:
{
  "mcpServers": {
    "cursor-talk-to-figma": {
      "command": "bunx",
      "args": ["cursor-talk-to-figma-mcp"]
    }
  }
}
- Save and restart Cursor, then verify the MCP server is active.
Securing API Keys: (see example above)
Cline
- Ensure Bun is installed.
- Run `bun setup` and `bun socket`.
- In your Cline configuration, add:
{
  "mcpServers": {
    "cursor-talk-to-figma": {
      "command": "bunx",
      "args": ["cursor-talk-to-figma-mcp"]
    }
  }
}
- Save, restart Cline, and verify.
Securing API Keys: (see example above)
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "cursor-talk-to-figma": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change “cursor-talk-to-figma” to the actual name of your MCP server and replace the URL with your own MCP server URL.
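A quick sanity check on that flow configuration catches typos before the agent first tries to call the server. A minimal sketch; the field requirements are inferred from the example above, and `validate_flow_mcp_config` is a hypothetical helper, not a FlowHunt API:

```python
def validate_flow_mcp_config(config: dict) -> list:
    # Check each server entry for the two fields the example
    # requires: a streamable_http transport and an http(s) URL.
    errors = []
    for name, entry in config.items():
        if entry.get("transport") != "streamable_http":
            errors.append(f"{name}: transport must be 'streamable_http'")
        url = entry.get("url", "")
        if not url.startswith(("http://", "https://")):
            errors.append(f"{name}: url must be an http(s) URL")
    return errors

config = {
    "cursor-talk-to-figma": {
        "transport": "streamable_http",
        "url": "https://yourmcpserver.example/pathtothemcp/url",
    }
}
print(validate_flow_mcp_config(config))
```

An empty list means the configuration matches the expected shape; any strings returned describe the offending entries.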
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Detailed in readme.md and project description |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | Not explicitly listed |
| List of Tools | ⛔ | Not explicitly listed |
| Securing API Keys | ✅ | Environment variable example provided |
| Sampling Support (less important in evaluation) | ⛔ | No mention found |
The repository provides a robust integration for automating Figma via MCP, but lacks detailed documentation of prompts, tools, and resources. The setup instructions and use cases are clear and practical, but deeper MCP-specific features (roots, sampling, etc.) are not documented.
MCP Score
| Has a LICENSE | ✅ (MIT) |
| --- | --- |
| Has at least one tool | ⛔ |
| Number of Forks | 433 |
| Number of Stars | 4.4k |
Opinion and Rating:
Based on the two tables, this MCP server earns a 6/10. It is well-starred, actively used, and provides clear setup and valuable integration, but lacks explicit MCP prompt, resource, and tooling documentation, and provides no evidence of roots or sampling support.
Frequently asked questions
- What is the Cursor Talk To Figma MCP Server?
It is an integration layer that connects the Cursor AI development environment to Figma via the Model Context Protocol (MCP), allowing AI assistants and developers to read, analyze, and modify Figma designs programmatically for workflow automation.
- What are the main use cases for this server?
Key use cases include bulk text content replacement, propagating instance overrides across design systems, automating design tasks (like style or layout changes), integrating Figma with AI agents for design analysis or rapid prototyping, and bridging development and design workflows.
- How do I secure my Figma API keys?
Always store your FIGMA_API_KEY in environment variables and reference them within your MCP server configuration under the 'env' and 'inputs' fields to avoid exposing sensitive credentials in code.
- Does the server provide prompt templates or explicit tools?
No explicit prompt templates, MCP resources, or tools are listed in the repository or server documentation. The integration focuses on enabling Figma access through MCP endpoints for automation.
- How do I connect this MCP server in FlowHunt?
Add the MCP component to your FlowHunt flow, then configure the system MCP with your server details, specifying the transport and server URL. This enables your AI agent to access Figma functions via MCP.
- What is the overall evaluation of this MCP server?
It is robust, actively used, and clear in setup instructions, earning a 6/10 score. However, it lacks explicit documentation for MCP prompts, resources, and advanced features like roots and sampling.
Streamline Figma Workflows with AI
Integrate the Cursor Talk To Figma MCP Server to automate design tasks, accelerate prototyping, and bridge development and design teams using AI.