Google Search Console MCP Server

Published on Jan 14, 2026. Last modified on Jan 14, 2026 at 1:44 pm
Tags: SEO · Search Console · MCP · AI Integration

What does “Google Search Console” MCP Server do?

The Google Search Console MCP Server connects Google Search Console data and functionality to AI assistants and development tools such as Claude, Cursor, and Windsurf through the Model Context Protocol (MCP). Acting as a bridge between MCP clients and the Search Console API, it lets users query search performance data, manage indexing, monitor site health, and analyze SEO metrics in natural language.

The integration unlocks comprehensive search analytics with support for multiple dimensions (queries, pages, countries, devices), detailed indexing management through the Google Indexing API, and automated SEO workflows. AI agents can perform sophisticated SEO analysis, automate reporting, manage sitemaps, submit URLs for indexing, and surface actionable search optimization insights directly inside developer workflows or AI-powered tools, eliminating manual dashboard navigation and accelerating data-driven SEO decisions.

List of Prompts

Search Performance Analysis

Show me the top 20 queries by impressions for the last 30 days
List all pages with more than 1000 impressions but CTR below 2% in the last 14 days
Compare clicks and impressions for the last 7 days versus the previous 7 days
Find queries where we rank in position 4-10 with high impression volume
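
A prompt like the first one maps to a single Search Analytics request. The sketch below builds such a request body using the public Search Console API field names (startDate, endDate, dimensions, rowLimit); the MCP tool's own parameter names may differ, so treat this as an illustration rather than the server's exact interface.

```python
from datetime import date, timedelta

def top_queries_request(days=30, row_limit=20):
    """Request body for the top `row_limit` queries over the last `days` days.

    Field names follow the public Search Console Search Analytics API.
    The API sorts rows by clicks by default, so re-sort by impressions
    client-side if that is the metric you care about.
    """
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query"],
        "rowLimit": row_limit,
    }
```

Passing this dict as the request body to `searchanalytics.query` (or the equivalent tool arguments) returns up to 20 query rows for the trailing 30-day window.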

Indexing Status and Management

Check the indexing status of my homepage and 10 most important landing pages
Submit these 50 new blog post URLs for indexing
Show me all URLs that failed indexing with error verdicts
Verify that the noindex tags were removed from our product pages

URL Inspection and Coverage

Inspect this URL and show me its mobile usability status and structured data issues
Check if these 20 URLs are actually indexed by Google
Find all pages excluded from indexing and show me the reasons

Sitemap Management

List all sitemaps for my website with their error counts and submission dates
Show me detailed information about the main XML sitemap including warnings
Submit this new sitemap URL to Google Search Console

Geographic and Device Performance

Show me search performance by country for the last 90 days
Compare mobile versus desktop performance metrics for all pages
Find which countries drive the most impressions but have the lowest CTR

List of Resources

No explicit resources are listed in the repository.

List of Tools

The Google Search Console MCP Server provides 13 comprehensive tools for SEO monitoring, search analytics, and indexing management:

Search Analytics & Performance

  • search_analytics - Get search analytics data from Google Search Console with comprehensive filtering, pagination, and aggregation support. Query by multiple dimensions (query, page, country, device, date, searchAppearance) and retrieve metrics including clicks, impressions, CTR, and position. Supports advanced filtering with operators (equals, contains, regex), pagination up to 25,000 rows, multiple search types (web, image, video, news, discover, googleNews), and configurable aggregation methods.
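
As a rough sketch of how those options combine, the helper below assembles a request body with an optional regex filter on the query dimension and pagination via startRow. Field and operator names (dimensionFilterGroups, includingRegex) follow the public Search Console API; the MCP tool may expose them under different parameter names.

```python
def filtered_analytics_request(start_date, end_date, *, query_regex=None,
                               start_row=0, row_limit=25000):
    """Search Analytics request with optional regex filtering and pagination.

    `row_limit` is capped at 25,000 rows per request, the API maximum
    mentioned above; page through larger result sets with `start_row`.
    """
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["query", "page"],
        "startRow": start_row,
        "rowLimit": min(row_limit, 25000),
    }
    if query_regex:
        body["dimensionFilterGroups"] = [{
            "filters": [{
                "dimension": "query",
                "operator": "includingRegex",
                "expression": query_regex,
            }]
        }]
    return body
```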

Indexing Management

  • submit_url_for_indexing - Submit a single URL to Google’s Indexing API for faster crawling and indexing of new or updated content. Supports both URL_UPDATED and URL_DELETED action types, enabling immediate notification to Google when content changes.

  • batch_submit_urls_for_indexing - Submit up to 100 URLs to Google’s Indexing API in a single batch request. Ideal for bulk operations like publishing multiple new pages, updating content across several URLs, or removing deleted pages from the index efficiently.

  • get_indexing_status - Get the indexing status of a previously submitted URL through the Indexing API. Check submission history, last notification time, and current indexing state for URLs you’ve actively submitted.
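
The 100-URL batch limit above means bulk submissions must be chunked first. A minimal sketch, assuming plain Python lists of URL strings:

```python
def batch_urls(urls, batch_size=100):
    """Yield chunks of at most `batch_size` URLs, matching the Indexing
    API's 100-URL-per-batch limit described above."""
    for i in range(0, len(urls), batch_size):
        yield urls[i:i + batch_size]
```

Submitting 500 migrated URLs then becomes five calls to batch_submit_urls_for_indexing, one per chunk.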

URL Inspection & Coverage

  • get_url_inspection - Inspect a specific URL for comprehensive indexing status and crawl information. This is the definitive tool to check if a URL is actually indexed by Google. Returns verdict codes (PASS=indexed, FAIL=error, NEUTRAL=excluded), coverage state, last crawl time, canonical URL, mobile usability, structured data issues, and more.
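
The verdict codes lend themselves to a simple triage step after inspecting a set of URLs. A sketch, assuming results have been collected into a URL-to-verdict mapping (the mapping shape is an assumption for illustration):

```python
def triage_by_verdict(results):
    """Group URLs by inspection verdict (PASS / FAIL / NEUTRAL, per the
    tool description); anything unexpected lands in UNKNOWN."""
    buckets = {"PASS": [], "FAIL": [], "NEUTRAL": [], "UNKNOWN": []}
    for url, verdict in results.items():
        key = verdict if verdict in buckets else "UNKNOWN"
        buckets[key].append(url)
    return buckets
```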

Site Management

  • list_sites - List all verified sites in your Google Search Console account. Returns site URLs in the proper format (URL-prefix properties like https://example.com/ or domain properties like sc-domain:example.com) along with permission levels.

  • get_site_info - Get detailed information about a specific verified site, including verification status, permission level, and site URL format.

  • add_site - Add or register a new site in Google Search Console. Supports both URL-prefix properties (https://example.com/) and domain properties (sc-domain:example.com). Note that the site must be verified before full access is granted.

  • delete_site - Remove or unregister a site from your Google Search Console account, cleaning up properties you no longer need to monitor.

Sitemap Management

  • get_sitemaps - List all sitemaps for a specific site. Returns submitted URL counts, errors, warnings, last submitted time, and processing status. Note: the ‘indexed’ field in contents is deprecated by Google and always returns 0; use get_url_inspection to check the actual indexing status of individual URLs.

  • get_sitemap - Get detailed information about a specific sitemap including last submitted and downloaded time, warnings, errors, content breakdown by type, and processing status. Essential for troubleshooting sitemap issues.

  • submit_sitemap - Submit a new sitemap to Google Search Console for crawling. Notifies Google about your site structure and helps prioritize crawling of important pages.

  • delete_sitemap - Delete or remove a sitemap from Google Search Console. Useful for cleaning up outdated or incorrect sitemap submissions.

Use Cases of this MCP Server

When Traffic Dropped 40% and You Need to Identify Which Pages Were Affected

You notice a sudden traffic decline in Google Analytics but can’t pinpoint the cause. Use the Google Search Console MCP Server to query search analytics data, comparing clicks and impressions across all pages for the last 30 days versus the previous period. The AI agent can automatically identify which specific URLs experienced the steepest declines, correlate them with query performance drops, and highlight whether the issue is widespread or isolated to specific page templates or content categories.
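
The comparison step can be sketched as a simple diff over two periods of per-page click counts. The dict shape here is an assumption for illustration; real data would come from two search_analytics calls with different date ranges:

```python
def steepest_declines(current, previous, top_n=5):
    """current / previous: dicts mapping page URL -> clicks for two
    comparable date ranges. Returns the pages with the largest click
    drops, worst first."""
    pages = set(current) | set(previous)
    deltas = {p: current.get(p, 0) - previous.get(p, 0) for p in pages}
    return sorted(deltas.items(), key=lambda kv: kv[1])[:top_n]
```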

When Submitting 500 New Blog Posts for Indexing After Site Migration

Your team just migrated 500 blog articles to a new domain or URL structure, and waiting for Google to naturally discover and index them could take weeks. Using the batch_submit_urls_for_indexing tool, you can programmatically submit all 500 URLs in batches of 100, notifying Google immediately about the new content locations. The AI agent can track submission status for each batch and alert you if any URLs encounter indexing errors, accelerating your migration’s organic visibility recovery.

When Analyzing Which Search Queries Are Driving the Most Impressions But Zero Clicks

You want to identify high-potential keyword opportunities where your pages appear in search results but fail to attract clicks. Query search analytics data filtered by impressions greater than 1,000 and clicks equal to zero, sorted by impression volume. The AI agent reveals queries where your content ranks but underperforms on CTR, suggesting title and meta description optimizations to capture that untapped traffic potential.
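
Because the Search Analytics API filters on dimensions rather than metrics, the impression and click thresholds are applied client-side after fetching the rows. A sketch, assuming rows shaped like the API's response (query, clicks, impressions keys):

```python
def zero_click_opportunities(rows, min_impressions=1000):
    """Return rows with at least `min_impressions` impressions and zero
    clicks, highest impression volume first."""
    hits = [r for r in rows
            if r["impressions"] >= min_impressions and r["clicks"] == 0]
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)
```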

When Checking Indexing Status of Recently Published Product Pages

Your e-commerce team published 50 new product pages yesterday, and stakeholders want confirmation they’re indexed and searchable. Use get_url_inspection to systematically check each product URL’s indexing verdict. The AI agent identifies which pages are indexed (PASS), which have errors (FAIL), and which are excluded (NEUTRAL), providing a prioritized list of URLs requiring immediate attention to maximize product discoverability during peak shopping season.

When Investigating Mobile Usability Issues Flagged by Google

Search Console shows mobile usability warnings affecting dozens of pages, but you need specifics to brief your development team. Use get_url_inspection to inspect affected URLs and extract detailed mobile usability diagnostics, including viewport configuration problems, text size issues, and clickable element spacing violations. The AI agent compiles a structured report with specific technical fixes needed for each page type, accelerating remediation.

When Tracking Core Web Vitals Performance Across Key Landing Pages

Your SEO strategy depends on maintaining excellent Core Web Vitals scores for high-converting landing pages. While the MCP server doesn’t directly expose Core Web Vitals metrics, you can use URL inspection to monitor crawl status and indexing health of these critical pages, ensuring they remain indexed and accessible while coordinating with separate performance monitoring tools for complete visibility.

When Discovering Why Your Homepage Isn’t Ranking for Your Brand Name

Your brand search rankings inexplicably dropped, and your homepage no longer appears for branded queries. Use get_url_inspection to verify your homepage’s indexing status, canonical URL configuration, and crawl accessibility. Query search analytics for your brand name to see if impressions persist but clicks dropped, indicating a ranking issue versus complete de-indexing. The AI agent correlates indexing data with query performance to diagnose whether the problem stems from technical issues, manual actions, or algorithm changes.

When Managing Sitemap Errors Affecting 1,200 URLs

Search Console reports errors for 1,200 URLs in your XML sitemap, but manually investigating each one is impractical. Use get_sitemap to retrieve detailed error breakdowns and warnings, then cross-reference problematic URLs with get_url_inspection to understand specific indexing issues. The AI agent categorizes errors by type (404s, redirect chains, noindex tags, etc.), prioritizes fixes by traffic impact, and generates an actionable remediation plan for your development team.

When Optimizing Content for Voice Search Queries

You want to identify question-based search queries driving impressions to your content and optimize for voice search opportunities. Query search analytics data filtering for queries containing “how”, “what”, “why”, “where”, and “when”, sorted by impressions. The AI agent reveals which question formats resonate with searchers, which pages rank for these queries, and where content gaps exist—guiding your voice search optimization strategy.
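
The question-word filter can be done either with the API's regex operator or client-side on the returned queries; a client-side sketch:

```python
import re

# Matches queries that start with a common question word.
QUESTION_RE = re.compile(r"^(how|what|why|where|when)\b", re.IGNORECASE)

def question_queries(queries):
    """Keep only question-style queries from a list of query strings."""
    return [q for q in queries if QUESTION_RE.match(q)]
```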

When Monitoring International SEO Performance by Country and Language

Your multilingual website serves customers in 15 countries, and you need country-specific search performance insights. Use search analytics with the country dimension to analyze clicks, impressions, CTR, and rankings segmented by geographic market. The AI agent identifies which countries underperform relative to traffic potential, which pages succeed internationally versus locally, and where localization improvements could yield the greatest organic growth.

When Validating That Noindex Tags Were Successfully Removed

Your team removed noindex tags from 200 pages last week after staging content accidentally went live with indexing blocked. Use get_url_inspection to verify that Google recognizes the removal and has re-crawled the pages without indexing restrictions. The AI agent confirms which URLs are now indexable, which remain excluded, and which require re-submission via submit_url_for_indexing to accelerate re-indexing.

When Identifying Content Cannibalization Issues

Multiple pages on your site target similar keywords, potentially competing against each other in search results. Query search analytics by page dimension for specific high-value queries to see which URLs receive impressions and clicks. The AI agent identifies when 3-4 pages split impressions for the same query, indicating cannibalization, and recommends consolidation strategies or content differentiation to strengthen your ranking potential.
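
Cannibalization detection reduces to grouping pages by query. A sketch, assuming rows with query and page keys as returned when both dimensions are requested:

```python
from collections import defaultdict

def cannibalized_queries(rows, min_pages=3):
    """Return queries for which at least `min_pages` distinct URLs
    receive impressions, a possible cannibalization signal."""
    pages_per_query = defaultdict(set)
    for row in rows:
        pages_per_query[row["query"]].add(row["page"])
    return {q: sorted(pages)
            for q, pages in pages_per_query.items() if len(pages) >= min_pages}
```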

When Detecting Sudden Increases in Crawl Errors After a Code Deployment

Your development team deployed a site redesign, and you need to ensure it didn’t introduce crawl accessibility problems. Use list_sites and get_site_info to access your property, then systematically inspect critical page types with get_url_inspection to detect new crawl errors, server errors, or robots.txt blocks introduced by the deployment. The AI agent flags regressions immediately, enabling rapid rollback or hotfixes before organic rankings suffer.

When Preparing Data-Driven SEO Reports for Executive Stakeholders

Your CMO requests monthly SEO performance updates with specific metrics: total clicks, impression trends, top-performing queries, and indexing coverage. Use search analytics to extract comprehensive performance data across dimensions (queries, pages, devices, countries), and combine with sitemap and URL inspection data to show indexing health. The AI agent compiles a natural language executive summary with data visualizations, eliminating hours of manual dashboard extraction and report formatting.

When Launching a New Website Section and Monitoring Index Coverage

You launched a new blog section with 75 articles and need to track how quickly Google indexes and ranks the content. Submit all URLs via batch_submit_urls_for_indexing, then monitor indexing progress with get_indexing_status and get_url_inspection. The AI agent tracks time-to-index for each article, identifies indexing bottlenecks (slow crawl rate, technical errors), and alerts you when all content achieves indexed status—ensuring your content marketing investment delivers organic visibility as quickly as possible.

How to set it up

Windsurf

  1. Ensure Python 3.10+ is installed.
  2. Clone the repository or install via PyPI if available.
  3. Add the Google Search Console MCP server to your mcpServers configuration:
    {
      "mcpServers": {
        "google-search-console-mcp": {
          "command": "python3",
          "args": ["-m", "google_search_console_mcp"]
        }
      }
    }
    
  4. Save the configuration and restart Windsurf.
  5. Verify the MCP server is listed and accessible in Windsurf’s UI.

Claude

  1. Ensure Python 3.10+ is installed.
  2. Use the provided claude-config-template.json as a starting point.
  3. Add or update the mcpServers field in your Claude configuration:
    {
      "mcpServers": {
        "google-search-console-mcp": {
          "command": "python3",
          "args": ["-m", "google_search_console_mcp"]
        }
      }
    }
    
  4. Save the configuration and restart Claude.
  5. Confirm the MCP server connection in Claude’s integrations panel.

Cursor

  1. Install Python 3.10+ and clone or install the MCP server.
  2. Locate Cursor’s configuration file.
  3. Add the MCP server entry:
    {
      "mcpServers": {
        "google-search-console-mcp": {
          "command": "python3",
          "args": ["-m", "google_search_console_mcp"]
        }
      }
    }
    
  4. Save and restart Cursor.
  5. Ensure the server appears under Cursor’s available MCP servers.

Cline

  1. Ensure Python 3.10+ is present.
  2. Download or install the MCP server.
  3. Modify Cline’s configuration to include:
    {
      "mcpServers": {
        "google-search-console-mcp": {
          "command": "python3",
          "args": ["-m", "google_search_console_mcp"]
        }
      }
    }
    
  4. Save, restart Cline, and check MCP server connectivity.

Securing API Keys (using environment variables):

To provide sensitive credentials (like Google Search Console API keys or service account files), use environment variables for security. Example configuration:

{
  "mcpServers": {
    "google-search-console-mcp": {
      "command": "python3",
      "args": ["-m", "google_search_console_mcp"],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/your/credentials.json"
      },
      "inputs": {
        "site_url": "sc-domain:example.com"
      }
    }
  }
}
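
On the server side, Google's client libraries pick up the service-account file from that variable automatically. If you wire credentials up yourself, a small resolver like the sketch below (the function name is illustrative, not part of the server) keeps the failure mode explicit:

```python
import os

def resolve_credentials(env=None):
    """Return the service-account file path from
    GOOGLE_APPLICATION_CREDENTIALS, failing loudly when it is unset."""
    env = dict(os.environ) if env is None else env
    path = env.get("GOOGLE_APPLICATION_CREDENTIALS", "")
    if not path:
        raise RuntimeError(
            "GOOGLE_APPLICATION_CREDENTIALS is not set; "
            "point it at your service-account JSON file")
    return path
```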

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

FlowHunt MCP flow

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "google-search-console-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP server as a tool, with access to all of its functions and capabilities. Replace “google-search-console-mcp” with the actual name of your MCP server and substitute the URL of your own MCP server.


Overview

Section | Availability | Details/Notes
------- | ------------ | -------------
Overview | ✅ | Clear description of purpose and capabilities
List of Prompts | ❌ | No prompt templates found
List of Resources | ❌ | Not explicitly listed
List of Tools | ✅ | 13 comprehensive SEO and indexing tools
Securing API Keys | ✅ | Env variable usage shown in config example
Sampling Support (less important in evaluation) | ❌ | Not documented

Between the documentation and the code, Google Search Console MCP provides a clear overview, comprehensive tool set, and setup instructions, but lacks detailed documentation on prompts and resources. For security, it supports environment variable configuration. Roots and sampling are not referenced.

Our opinion

Based on the tables above, this MCP server scores excellently for tool availability (13 comprehensive tools), overview, and setup instructions. It provides robust SEO and indexing capabilities with detailed tool descriptions. However, it is missing prompt templates and resource definitions. It is best suited for SEO professionals, developers, and marketing teams who understand Search Console concepts and need programmatic access to search performance data and indexing management.

MCP Score

Has a LICENSE | ✅ (MIT)
Has at least one tool | ✅ (13 tools)
Number of Forks | 12
Number of Stars | 89

Contact us to host your MCP Server in FlowHunt

FlowHunt provides an additional security layer between your internal systems and AI tools, giving you granular control over which tools are accessible from your MCP servers. MCP servers hosted in our infrastructure can be seamlessly integrated with FlowHunt's chatbot as well as popular AI platforms like ChatGPT, Claude, and various AI editors.

Frequently asked questions

What is the Google Search Console MCP Server?

It's a bridge between Google Search Console and AI/developer tools via the Model Context Protocol (MCP), enabling natural language access to SEO data, search analytics, indexing management, and seamless workflow integration for search performance optimization.

What are the main use cases?

SEO performance monitoring, automated search analytics reporting, URL indexing management, site health monitoring, keyword performance tracking, crawl error detection, sitemap management, and AI-driven SEO insights.

How do I secure my Google Search Console credentials?

Store sensitive information such as API keys or service account files in environment variables. For example, set 'GOOGLE_APPLICATION_CREDENTIALS' to your credentials file path in the MCP server config.

Can I submit multiple URLs for indexing at once?

Yes, the server includes a batch_submit_urls_for_indexing tool that allows you to submit up to 100 URLs in a single request, making bulk indexing operations efficient.

How do I check if my URLs are actually indexed by Google?

Use the get_url_inspection tool, which provides the definitive indexing status with verdict codes (PASS=indexed, FAIL=error, NEUTRAL=excluded). Note that the 'indexed' field in sitemap data is deprecated and always returns 0.

What search dimensions and metrics are available?

The search_analytics tool supports dimensions like query, page, country, device, date, and searchAppearance, with metrics including clicks, impressions, CTR, and average position. You can filter, aggregate, and paginate results for comprehensive analysis.

How do I use this MCP server inside FlowHunt?

Add the MCP component to your FlowHunt flow, open its configuration, and insert the MCP server details in JSON format. Once configured, your AI agent will have access to Google Search Console data for enhanced SEO capabilities.

Try Google Search Console MCP Server with FlowHunt

Unlock powerful SEO insights in your AI workflows, automate indexing management, and empower your team to optimize search performance directly from your favorite tools.

Learn more

Google Drive MCP Server

Seamlessly integrate Google Drive with AI assistants and developer tools using the Model Context Protocol (MCP). The Google Drive MCP Server enables natural lan...

17 min read · Cloud Storage · File Management +4

Google Ads MCP Server

Transform Google Ads campaign management with AI-powered automation using the Google Ads MCP Server. Seamlessly integrate Google Ads data and operations with AI...

25 min read · Google Ads · PPC +5

Klaviyo MCP Server

Automate email marketing campaigns and customer engagement with the Klaviyo MCP Server. Integrate Klaviyo's powerful marketing automation platform with AI assis...

14 min read · Email Marketing · Marketing Automation +4