AWS Athena MCP Server
Connect your AI agents to AWS Athena for seamless SQL querying and analytics on data in Amazon S3—empowering smarter, data-driven applications with FlowHunt.

What does “aws-athena” MCP Server do?
The aws-athena MCP Server is a Model Context Protocol (MCP) implementation that empowers AI assistants to execute SQL queries directly against AWS Athena databases. By connecting AI-powered workflows to Athena, this server enables developers and AI agents to retrieve and analyze large-scale data stored in Amazon S3 with ease. The server acts as a bridge between conversational AI and enterprise data infrastructure, making it simple to incorporate robust data querying into automated workflows, code generation, and intelligent applications. Typical tasks include executing SQL statements, retrieving query results, and integrating data-driven insights into development processes, thereby streamlining database operations and accelerating data-centric application development.
List of Prompts
No prompt templates are explicitly mentioned in the available documentation or repository files.
List of Resources
No explicit resources are listed in the documentation or repository files.
List of Tools
- run_query
  Execute a SQL query using AWS Athena.
  - Parameters:
    - database: The Athena database to query
    - query: The SQL query string
    - maxRows: Maximum number of rows to return (default: 1000, max: 10000)
  - Returns:
    - The results of the query if it completes within the specified timeout.
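For orientation, here is a hypothetical argument payload an AI agent might pass to run_query. The database, table, and column names are placeholders for illustration only, not values taken from the server's documentation:

{
  "database": "sales_db",
  "query": "SELECT region, SUM(revenue) AS total_revenue FROM orders GROUP BY region ORDER BY total_revenue DESC",
  "maxRows": 100
}

The server submits the statement to Athena and, once the query completes, returns up to maxRows rows of the result set.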
Use Cases of this MCP Server
- Data Analytics for AI Agents
  Allow AI assistants to run analytical SQL queries on large datasets stored in Amazon S3, enabling automated data exploration and reporting.
- Business Intelligence Automation
  Integrate Athena querying into business dashboards or workflow automation tools, providing up-to-date data insights without manual intervention.
- Data-driven Code Generation
  Enable LLMs to generate or refine code based on live database schemas or sample data retrieved via Athena queries.
- ETL and Data Pipeline Integration
  Use the server within data engineering pipelines to validate, transform, or audit data by executing custom SQL queries programmatically (see the sketch after this list).
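As a sketch of the last use case, a pipeline step could call run_query with a validation statement; the database, table, and column names below are illustrative assumptions:

{
  "database": "warehouse_db",
  "query": "SELECT COUNT(*) AS missing_ids FROM orders WHERE order_id IS NULL",
  "maxRows": 10
}

A non-zero count could then be used to fail the pipeline run or trigger an alert downstream.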
How to set it up
Windsurf
- Ensure you have Node.js installed and AWS credentials configured (via CLI, environment variables, or IAM role).
- Locate the Windsurf configuration file.
- Add the aws-athena MCP Server using the following JSON snippet:
{ "mcpServers": { "athena": { "command": "npx", "args": ["-y", "@lishenxydlgzs/aws-athena-mcp"], "env": { "OUTPUT_S3_PATH": "s3://your-bucket/athena-results/" } } } }
- Save and restart Windsurf.
- Verify the setup by attempting a sample query.
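If your credentials come from a named profile or a non-default region, you can pass the standard AWS SDK environment variables through the same env block. This is a minimal sketch, assuming the server resolves credentials via the default AWS SDK credential chain; the profile and region values are placeholders:

{
  "mcpServers": {
    "athena": {
      "command": "npx",
      "args": ["-y", "@lishenxydlgzs/aws-athena-mcp"],
      "env": {
        "OUTPUT_S3_PATH": "s3://your-bucket/athena-results/",
        "AWS_PROFILE": "analytics",
        "AWS_REGION": "us-east-1"
      }
    }
  }
}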
Claude
- Ensure Node.js and AWS credentials are set up.
- Edit the Claude MCP configuration file.
- Insert the server config:
{ "mcpServers": { "athena": { "command": "npx", "args": ["-y", "@lishenxydlgzs/aws-athena-mcp"], "env": { "OUTPUT_S3_PATH": "s3://your-bucket/athena-results/" } } } }
- Save changes and restart Claude.
- Test AWS Athena connectivity via the Claude interface.
Cursor
- Install Node.js and configure AWS credentials.
- Open Cursor’s settings or configuration file.
- Add the following snippet:
{ "mcpServers": { "athena": { "command": "npx", "args": ["-y", "@lishenxydlgzs/aws-athena-mcp"], "env": { "OUTPUT_S3_PATH": "s3://your-bucket/athena-results/" } } } }
- Save and restart Cursor.
- Confirm the server is available in the tool list.
Cline
- Verify Node.js installation and AWS credentials.
- Edit the Cline MCP configuration.
- Insert:
{ "mcpServers": { "athena": { "command": "npx", "args": ["-y", "@lishenxydlgzs/aws-athena-mcp"], "env": { "OUTPUT_S3_PATH": "s3://your-bucket/athena-results/" } } } }
- Save and restart Cline.
- Test the connection by running a sample Athena query.
Securing API Keys
Use environment variables to securely store sensitive AWS credentials.
Example configuration with secrets:
{
  "mcpServers": {
    "athena": {
      "command": "npx",
      "args": ["-y", "@lishenxydlgzs/aws-athena-mcp"],
      "env": {
        "OUTPUT_S3_PATH": "s3://your-bucket/athena-results/",
        "AWS_ACCESS_KEY_ID": "${AWS_ACCESS_KEY_ID}",
        "AWS_SECRET_ACCESS_KEY": "${AWS_SECRET_ACCESS_KEY}"
      }
    }
  }
}
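Beyond keeping keys out of configuration files, you can also scope the credentials themselves. The following IAM policy is a minimal sketch, assuming query results are written to the bucket configured in OUTPUT_S3_PATH and source data lives in the same bucket; adjust the actions, bucket names, and resource ARNs to your own workgroups, catalogs, and data locations:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "athena:StartQueryExecution",
        "athena:GetQueryExecution",
        "athena:GetQueryResults",
        "glue:GetDatabase",
        "glue:GetTable",
        "glue:GetPartitions"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:GetBucketLocation",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::your-bucket",
        "arn:aws:s3:::your-bucket/*"
      ]
    }
  ]
}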
How to use this MCP inside flows
Using MCP in FlowHunt
To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:
{
  "athena": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}
Once configured, the AI agent can use this MCP as a tool with access to all of its functions and capabilities. Remember to change “athena” to the actual name of your MCP server and to replace the URL with your own MCP server URL.
Overview
| Section | Availability | Details/Notes |
|---|---|---|
| Overview | ✅ | Overview and project goals are available |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | No explicit MCP resources listed |
| List of Tools | ✅ | run_query tool described in detail |
| Securing API Keys | ✅ | Environment variable instructions included |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |
Our opinion
This MCP server is focused and production-ready for AWS Athena SQL querying, with clear setup and secure practices. However, it lacks prompt templates and explicit resource primitives, and does not mention sampling or roots support, limiting its score for versatility and advanced MCP features.
MCP Score
| Has a LICENSE | ✅ (MIT) |
|---|---|
| Has at least one tool | ✅ (run_query) |
| Number of Forks | 9 |
| Number of Stars | 25 |
Frequently asked questions
- What does the aws-athena MCP Server enable?
It allows AI assistants and workflows to execute SQL queries directly on Amazon S3 data via AWS Athena, returning results for analytics, reporting, and code generation.
- How do I securely provide AWS credentials?
Store AWS credentials as environment variables, not in plain config files. Reference them in your MCP server configuration using variable substitution.
- What tools are available with this server?
The server provides a 'run_query' tool to execute SQL queries on Athena databases, with options for database selection, query string, and result row limits.
- What are common use cases?
Common use cases include data analytics for AI agents, business intelligence automation, code generation based on live data, and ETL/data pipeline integration.
- Is there any prompt template or resource included?
No prompt templates or explicit resource primitives are included in the current documentation or repository files.
Integrate AWS Athena with FlowHunt
Unleash powerful data-driven AI workflows by connecting AWS Athena to your automation and analytics pipelines with FlowHunt’s streamlined MCP integration.