Snowflake MCP Server
Connect FlowHunt and your AI workflows to Snowflake databases with the Snowflake MCP Server—automate queries, manage schemas, and unlock data insights programmatically and securely.

What does “Snowflake” MCP Server do?
The Snowflake MCP Server is an implementation of the Model Context Protocol (MCP) that connects AI assistants and developer tools to a Snowflake database. It enables seamless database interaction by allowing users to execute SQL queries, manage database schemas, and access data insights through standardized MCP interfaces. By exposing Snowflake’s data and schema as accessible resources and providing tools for reading, writing, and managing tables, the server empowers AI-powered workflows, agents, and LLMs to perform database tasks. This dramatically enhances developer productivity by automating data analysis, table management, and schema exploration, all within secure and configurable boundaries.
List of Prompts
No prompt templates are explicitly mentioned in the repository or documentation.
List of Resources
- `memo://insights`: A continuously updated memo that aggregates discovered data insights. It is updated automatically when new insights are appended via the `append_insight` tool.
- `context://table/{table_name}`: (Available if prefetch is enabled) Provides per-table schema summaries, including columns and comments, exposed as individual resources.
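As a small illustration of how a client might address these resources, the sketch below expands the `context://table/{table_name}` URI template for a given table. The helper function is hypothetical and purely client-side; only the URI scheme comes from the server's documentation.

```python
# Hypothetical client-side helper: expand the per-table resource URI template.
# Only the "context://table/{table_name}" scheme comes from the server docs.
def table_resource_uri(table_name: str) -> str:
    """Build the URI for the per-table schema resource (prefetch mode)."""
    return f"context://table/{table_name}"

print(table_resource_uri("ORDERS"))  # context://table/ORDERS
```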
List of Tools
- `read_query`: Executes `SELECT` SQL queries to read data from the Snowflake database, returning results as an array of objects.
- `write_query` (enabled only with `--allow-write`): Executes `INSERT`, `UPDATE`, or `DELETE` SQL modification queries, returning the number of affected rows or a confirmation message.
- `create_table` (enabled only with `--allow-write`): Allows creation of new tables in the Snowflake database using a `CREATE TABLE` SQL statement and returns a confirmation of table creation.
- `list_databases`: Lists all databases in the Snowflake instance, returning an array of database names.
- `list_schemas`: Lists all schemas within a specified database.
- `list_tables`: Lists all tables within a specific database and schema, returning table metadata.
- `describe_table`: Provides column information for a specific table, including names, types, nullability, defaults, and comments.
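The split between `read_query` and the write-gated tools implies some form of statement classification. The sketch below shows one plausible way such a read-only guard could work; it is an illustrative approximation, not the server's actual parser.

```python
import re

# Statements a read-only gate would accept; writes (INSERT, UPDATE, DELETE,
# CREATE, ...) stay disabled unless --allow-write is passed. Illustrative only.
READ_STATEMENTS = {"select", "show", "describe", "desc", "explain", "with"}

def first_keyword(sql: str) -> str:
    """Return the first SQL keyword, skipping leading whitespace and -- comments."""
    stripped = re.sub(r"^(\s*--[^\n]*\n|\s+)*", "", sql)
    return stripped.split(None, 1)[0].lower() if stripped else ""

def is_read_only(sql: str) -> bool:
    """True if the statement only reads data and is safe without --allow-write."""
    return first_keyword(sql) in READ_STATEMENTS
```

A real implementation would also need to handle multi-statement batches and block comments, but the same principle applies: classify before executing.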
Use Cases of this MCP Server
- Database Management and Exploration: Developers and AI agents can automate the process of listing, describing, and managing databases, schemas, and tables within Snowflake, streamlining data infrastructure management.
- Automated Data Analysis: Run parameterized queries to extract insights, generate reports, or feed downstream analytics pipelines.
- Schema Discovery and Documentation: Automatically fetch and summarize schema details for documentation, compliance, or onboarding new team members.
- Contextual Data Insights: Use the `memo://insights` resource to aggregate and access evolving data insights, supporting collaborative analytics or audit trails.
- Table Creation and Data Engineering: Programmatically create tables and update data via secure, auditable write operations, enabling automated ETL, data ingestion, or transformation workflows.
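To make the insights workflow concrete, here is a minimal in-memory sketch of how `memo://insights` behaves: each `append_insight` call adds an entry, and reading the resource returns the aggregated memo. The class, memo header, and formatting are illustrative assumptions; only the tool and resource names come from the server's documentation.

```python
# In-memory illustration of the memo://insights resource. The method name
# mirrors the server's append_insight tool; everything else is assumed.
class InsightMemo:
    def __init__(self) -> None:
        self._insights: list[str] = []

    def append_insight(self, insight: str) -> None:
        """Record a new insight; the memo resource updates automatically."""
        self._insights.append(insight)

    def read(self) -> str:
        """Render the aggregated memo as it would be served to a client."""
        header = "Data Insights Memo"
        body = "\n".join(f"- {i}" for i in self._insights)
        return f"{header}\n{body}" if self._insights else header

memo = InsightMemo()
memo.append_insight("Q3 revenue concentrated in EMEA")
print(memo.read())
```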
How to set it up
Windsurf
- Ensure you have Node.js installed and access to your Windsurf configuration.
- Open your Windsurf configuration file (often `windsurf.json`).
- Add the Snowflake MCP Server as a new entry in the `mcpServers` array:

```json
{
  "mcpServers": [
    {
      "command": "mcp-snowflake-server",
      "args": ["--port", "8080"]
    }
  ]
}
```

- Save the configuration and restart Windsurf.
- Verify the connection to the Snowflake MCP Server in the Windsurf interface.
Securing API Keys (Example)
```json
{
  "command": "mcp-snowflake-server",
  "env": {
    "SNOWFLAKE_ACCOUNT": "your_account",
    "SNOWFLAKE_USER": "your_user",
    "SNOWFLAKE_PASSWORD": "${SNOWFLAKE_PASSWORD}"
  },
  "inputs": {
    "database": "your_db"
  }
}
```
Claude
- Ensure Claude supports MCP server integrations.
- Locate your Claude configuration file or MCP integration settings.
- Add the Snowflake MCP Server as a source:
```json
{
  "mcpServers": [
    {
      "command": "mcp-snowflake-server",
      "args": []
    }
  ]
}
```
- Save changes and restart Claude.
- Confirm that Claude recognizes and can interact with the Snowflake MCP Server.
Cursor
- Install required dependencies and access Cursor’s configuration.
- Open the `cursor.json` or equivalent settings file.
- Insert the Snowflake MCP Server in the `mcpServers` block:

```json
{
  "mcpServers": [
    {
      "command": "mcp-snowflake-server",
      "args": []
    }
  ]
}
```
- Save and restart Cursor.
- Check Cursor’s status page for MCP server connectivity.
Cline
- Make sure Cline is installed and up-to-date.
- Open the Cline configuration file.
- Register the Snowflake MCP Server as follows:
```json
{
  "mcpServers": [
    {
      "command": "mcp-snowflake-server",
      "args": []
    }
  ]
}
```
- Save the configuration and restart Cline.
- Validate the connection to the Snowflake MCP Server.
Note on Securing API Keys
Store sensitive credentials such as Snowflake passwords or API tokens in environment variables, and reference them securely in your configuration files via the `env` property.
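As a sketch of what resolving those variables looks like at startup, the snippet below reads the same variable names used in the configuration example and refuses to start if the password is missing. The function name and failure behavior are illustrative assumptions.

```python
import os

# Resolve credentials from the environment at startup, matching the
# "${SNOWFLAKE_PASSWORD}" reference in the config example. The function
# name and fail-fast behavior are illustrative, not part of the server.
def load_snowflake_credentials() -> dict:
    password = os.environ.get("SNOWFLAKE_PASSWORD")
    if not password:
        raise RuntimeError("SNOWFLAKE_PASSWORD is not set; refusing to start")
    return {
        "account": os.environ.get("SNOWFLAKE_ACCOUNT", ""),
        "user": os.environ.get("SNOWFLAKE_USER", ""),
        "password": password,
    }
```

Keeping the secret out of the configuration file means it never lands in version control; only the reference `${SNOWFLAKE_PASSWORD}` does.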
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
```json
{
  "snowflake-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
```
Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change "snowflake-mcp" to the actual name of your MCP server and to replace the URL with your own MCP server URL.
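Under the `streamable_http` transport, tool invocations travel as JSON-RPC 2.0 requests. The sketch below builds the `tools/call` payload an MCP client would POST to the endpoint to run the server's `read_query` tool; the request `id` and the example query are arbitrary.

```python
import json

# A tools/call request as it appears on the wire for a streamable_http MCP
# endpoint (JSON-RPC 2.0). The tool name matches the server's read_query
# tool; the id and query are arbitrary examples.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "read_query",
        "arguments": {"query": "SELECT COUNT(*) FROM orders"},
    },
}
print(json.dumps(payload, indent=2))
```

FlowHunt handles this plumbing for you once the configuration above is in place; the payload is shown only to make the transport concrete.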
Overview
| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | |
| List of Prompts | ⛔ | No prompt templates found. |
| List of Resources | ✅ | `memo://insights`, `context://table/{table_name}` |
| List of Tools | ✅ | read_query, write_query, create_table, list_databases, etc. |
| Securing API Keys | ✅ | Example provided using environment variables. |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned in repo/docs. |
Based on the above, the Snowflake MCP Server offers a robust set of tools and resources for Snowflake database interaction, but lacks prompt templates and explicit sampling/roots support information.
Our opinion
The Snowflake MCP Server provides comprehensive Snowflake database access tools and useful resource primitives, is well-documented, and includes practical security/configuration guidance. However, the absence of prompt templates and explicit roots/sampling support reduces its MCP completeness. Overall, it is a strong and practical MCP implementation for database workflows.
MCP Score
| Has a LICENSE | ✅ (GPL-3.0) |
| --- | --- |
| Has at least one tool | ✅ |
| Number of Forks | 44 |
| Number of Stars | 101 |
Frequently asked questions
- What does the Snowflake MCP Server do?
It connects AI assistants and developer tools to a Snowflake database, enabling SQL query execution, schema management, automated insights aggregation, and more through standardized MCP interfaces.
- What resources does the server expose?
It provides `memo://insights` for aggregated data insights and, if prefetch is enabled, `context://table/{table_name}` for per-table schema summaries.
- Which database operations are supported?
You can read (SELECT), write (INSERT/UPDATE/DELETE), create tables, list databases, schemas, and tables, and describe table schemas.
- Can I automate ETL and data engineering workflows?
Yes, using the write and create_table tools, you can automate table creation, data ingestion, transformation, and other engineering workflows programmatically.
- How do I securely configure the server with my credentials?
Store sensitive credentials in environment variables and reference them via the `env` property in your configuration, as shown in the setup examples.
- Is this server open-source?
Yes, it is licensed under GPL-3.0.
- Are prompt templates or sampling supported?
Prompt templates and sampling are not explicitly included in this server’s documentation.
Supercharge Your Data Workflows with Snowflake MCP Server
Experience automated database management, querying, and insights generation in your AI and developer workflows. Try FlowHunt’s Snowflake MCP Server integration today.