AI Protocols and Technical SEO for E-commerce: A Founder's Technical Deep-Dive

Published on Jan 29, 2025 by Arshia Kahani. Last modified on Jan 29, 2025 at 9:00 am
Technical SEO AI Protocols E-commerce Schema.org

Viktor Zeman co-founded Quality Unit over two decades ago and has led the development and global growth of the product suite, including FlowHunt. Since 2024, he’s focused specifically on FlowHunt and helping companies implement practical AI solutions, automation, and modern AI-driven work environments. His E-commerce Mastermind presentation dove deep into three critical areas of technical implementation of AI in e-commerce.

The presentation detailed specific protocols, technical implementations, and content strategies tested across Quality Unit’s products and customer base. What follows is Viktor’s technical roadmap for making e-commerce sites discoverable through AI systems, functional within AI-mediated commerce, and competitive as search shifts from keywords to AI citations.

Viktor Zeman at the E-commerce Mastermind conference

Part 1: AI Commerce Protocols As The New Infrastructure Layer

Viktor laid the groundwork by introducing the standardized protocols that enable AI to interact with e-commerce systems on behalf of users.

Major platforms such as Shopify, Salesforce Commerce Cloud, and BigCommerce have already begun implementing parts of these protocols, as have the payment processors Stripe, PayPal, and Checkout.com. AI commerce compatibility is clearly becoming a competitive baseline.

Universal Commerce Protocol (UCP) enables AI assistants to discover products, compare options, initiate checkout, and complete transactions without users leaving the AI interface. UCP is already implemented in Shopify, Salesforce Commerce Cloud, BigCommerce, and major payment processors.

Agentic Commerce Protocol (ACP), a collaboration between Stripe and OpenAI, focuses specifically on transaction security and simplicity within conversational interfaces. While UCP addresses the broader shopping cycle, ACP specializes in the purchase process itself, enabling checkout right within chat interfaces.

Agent Payment Protocol (AP2), developed by Google, provides the security framework making AI-mediated transactions trustworthy through transaction signing, merchant authentication, and payment authorization. AP2 integrates with UCP to provide the trust layer that makes autonomous AI purchasing a reality.

AI ecommerce protocols

Implementation Requirements

For your e-shop to become compatible with AI-driven shopping, and thus be recommended by AI platforms, you must expose machine-readable data at multiple layers:

Structured Product Data

Products must be described using standards like:

  • schema.org markup: gives AI structured data it can read and understand
  • enriched product feeds: add the native_commerce attribute to product feeds, signaling that products are available through AI commerce protocols
  • clearly defined attributes

This allows AI systems to interpret products without ambiguity, including variants, pricing, availability, and shipping constraints. It’s the foundation AI systems use to understand what you sell.
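To make this concrete, here is a minimal schema.org Product markup in JSON-LD. The product and its values are invented for illustration; in practice you would also mark up variants, shipping details, and ratings:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trailhead Hiking Backpack 40L",
  "sku": "THB-40-GRN",
  "image": "https://example.com/images/thb-40-grn.jpg",
  "description": "40-litre hiking backpack with rain cover and ventilated back panel.",
  "brand": { "@type": "Brand", "name": "Trailhead" },
  "offers": {
    "@type": "Offer",
    "price": "89.90",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock",
    "itemCondition": "https://schema.org/NewCondition"
  }
}
```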

Merchant Metadata

AI agents don’t just evaluate products; they also evaluate whether merchants are trustworthy and a good fit for the user. That’s why key information about your business must be explicit and accessible:

  • return policies,
  • shipping zones,
  • delivery times,
  • supported payment methods.

Much of this data already exists in systems like Google Merchant Center, but it needs to be complete, accurate, and consistently maintained.

Commerce Manifests

One of the less visible but critical components is the commerce manifest. It’s typically a JSON file hosted on the merchant’s domain.

This manifest defines supported protocol versions, available services, payment handlers, and checkout capabilities, helping AI Agents understand how your store works.
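The presentation didn’t include a manifest example, so the sketch below is purely illustrative: the field names are assumptions about the kind of information such a file carries, not the actual UCP specification.

```json
{
  "protocol_versions": { "ucp": "1.0", "acp": "1.0" },
  "services": ["product_search", "price_check", "checkout", "order_status"],
  "payment_handlers": ["stripe", "paypal"],
  "checkout": {
    "sessions_endpoint": "https://merchant.example.com/checkout-sessions",
    "capabilities": ["guest_checkout", "discount_codes", "multi_currency"]
  }
}
```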

API Endpoints for Checkout

Implement three critical endpoints:

  • POST /checkout-sessions - Create new checkout sessions
  • PUT /checkout-sessions/{id} - Update session details
  • POST /checkout-sessions/{id}/complete - Finalize transactions
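A minimal TypeScript sketch of how an AI agent might drive these endpoints. The host, payload shapes, and field names (including session.id) are assumptions for illustration, not part of a published specification:

```ts
// Illustrative agent-side checkout flow against the three endpoints above.
const BASE = "https://merchant.example.com";

async function runCheckout() {
  // 1. Create a new checkout session with the items the agent selected
  const createRes = await fetch(`${BASE}/checkout-sessions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ items: [{ sku: "THB-40-GRN", quantity: 1 }] }),
  });
  const session = await createRes.json();

  // 2. Update the session with shipping details
  await fetch(`${BASE}/checkout-sessions/${session.id}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ shippingAddress: { country: "SK", postalCode: "81101" } }),
  });

  // 3. Finalize the transaction (payment authorization handled via AP2)
  const completeRes = await fetch(`${BASE}/checkout-sessions/${session.id}/complete`, {
    method: "POST",
  });
  console.log(await completeRes.json());
}

runCheckout();
```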

AP2 Integration

Implement the aforementioned Agent Payment Protocol for secure transaction handling.

The Model Context Protocol (MCP) Bridge

For platforms without native UCP support, MCP provides an integration path. Zeman emphasized MCP’s growing importance as the connection layer between AI agents and existing systems.

What MCPs enable:

  • Secure communication between AI agents and APIs
  • Defined tools (API requests) available to AI
  • Resource management and access control
  • Prompt definitions for consistent interactions

Developing a custom MCP server enables you to create precise prompts tailored to your specific use cases and to send isolated API calls with proper rate limiting. This way, you can rest assured your AI implementation will be secure, controlled, and as cheap as possible.

Example: A chatbot integrated with an e-commerce MCP server and a shipping provider (e.g. Chameleon) enables customers not just to query order status, but to track delivery in real time, all within a single conversation.
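For illustration, here is a minimal order-status tool built with the MCP TypeScript SDK. The tool name, internal API URL, and response shape are hypothetical; a real server would add authentication and the rate limiting described above:

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// One MCP server exposing a single, narrowly scoped tool to the AI agent
const server = new McpServer({ name: "eshop-support", version: "1.0.0" });

server.tool(
  "get_order_status",
  { orderId: z.string() }, // input schema the agent must satisfy
  async ({ orderId }) => {
    // Hypothetical internal endpoint — swap in your real order API
    const res = await fetch(`https://api.example.com/orders/${orderId}`);
    const order = await res.json();
    return {
      content: [{ type: "text", text: `Order ${orderId}: ${order.status}` }],
    };
  }
);

// Expose the server over stdio so an AI client can connect to it
await server.connect(new StdioServerTransport());
```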


Part 2: Technical SEO Fundamentals

The second implementation topic Viktor covered was technical SEO. He took great care to emphasize that the cornerstone of SEO isn’t keywords but “an infrastructure both search engines and AI systems can access and trust”, because slow, unreliable sites get abandoned by users and crawlers alike.

Infrastructure: Speed, Security, Scalability

Fast and Secure Infrastructure

  • Quality SSL certificates (not self-signed or expired)
  • CDN implementation for static assets
  • Image optimization: offloading, lazy loading, responsive versions
  • Proper caching strategies

Scalability Requirements

  • Traffic handling capability
  • Storage capacity for content and media
  • Database performance under load
  • MCP integration capability for AI agent access

Robots.txt: The Foundation of Crawl Control

Despite being a 30-year-old standard, robots.txt remains frequently misconfigured. Common issues include:

Non-existent robots.txt: Some sites return error pages or maintenance messages instead of proper robots.txt files, confusing crawlers about what’s allowed.

Blocked AI bots: Blocking AI crawlers prevents your content from being cited in AI responses. While you may want to block some bots, blanket blocking eliminates AI visibility.

Missing sitemap directives: Robots.txt should reference your XML sitemaps, guiding crawlers to complete content discovery.

Syntax errors: Trailing commas in wildcards (Disallow: /?pv=*,) cause parsing failures in some crawlers, creating unintended blocking.

Blocked valuable content: Sometimes sites block content they actually want indexed, usually through overly broad wildcard rules.
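A robots.txt that avoids these pitfalls might look like the sketch below; which AI crawlers to allow and which paths to block are business decisions, so treat the specific rules as illustrative:

```
# Allow major AI crawlers so content can be cited in AI answers
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Default rules for all other crawlers
User-agent: *
Disallow: /cart/
Disallow: /checkout/

# Guide crawlers to complete content discovery
Sitemap: https://example.com/sitemap.xml
```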

Robots.txt Ahrefs report

Sitemaps: Complete, Fast, Accurate

XML sitemaps tell search engines and AI systems what content exists and how it’s organized. Common problems in this area include:

Cache header issues: Incorrect cache headers can prevent sitemaps from updating properly, leaving crawlers with stale content lists.

Incomplete URL coverage: Plugin-generated sitemaps often miss custom post types, taxonomies, or dynamic pages, leaving significant content undiscovered.

Rate limiting problems: Some sites implement aggressive rate limiting that blocks sitemap fetching entirely, returning 429 errors after just 10 URLs.

Links to 404 pages: Sitemaps containing dead links waste crawler budget and signal poor site maintenance.
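For reference, a minimal well-formed sitemap looks like this; the URLs and dates are placeholders, and accurate lastmod values help crawlers skip unchanged pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/products/hiking-backpack-40l</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/how-to-choose-a-backpack</loc>
    <lastmod>2025-01-10</lastmod>
  </url>
</urlset>
```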

Cache Headers: The Performance Multiplier

Proper HTTP cache headers dramatically improve performance for repeat visitors and reduce server load. Yet many sites misconfigure this completely. Cache-Control directives matter:

  • max-age: How long content can be cached
  • public vs private: Whether CDNs can cache content
  • no-cache vs no-store: Validation requirements
  • immutable: Content that never changes

Viktor emphasized checking cache headers for all asset types: HTML, CSS, JavaScript, images, fonts, and API responses.
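As an illustration, the directives might combine per asset class roughly as follows; the exact lifetimes depend on your release process and are assumptions here:

```
# Fingerprinted CSS/JS (hashed filenames): safe to cache for a year
Cache-Control: public, max-age=31536000, immutable

# Product images behind a CDN: cache for a day
Cache-Control: public, max-age=86400

# HTML pages: always revalidate so prices and stock stay fresh
Cache-Control: no-cache

# Authenticated API responses: never store anywhere
Cache-Control: private, no-store
```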

Daily Audits: Continuous Monitoring

Technical problems emerge constantly, which is why our team runs automated daily audits checking:

  • Broken links and 404 errors
  • Missing or misconfigured redirects
  • Duplicate content issues
  • Schema markup errors
  • Performance regressions
  • Security certificate status

Using tools like Ahrefs for automated monitoring ensures problems are caught and fixed before they accumulate into major traffic losses:

Daily ahrefs audit

Core Web Vitals: User Experience Metrics That Rank

Google’s Core Web Vitals directly influence rankings. Two tools matter here: PageSpeed Insights provides lab data showing potential performance under controlled conditions, while the Chrome User Experience Report (CrUX) provides real-world data from actual users visiting your site.

The three critical metrics to watch are:

  • Largest Contentful Paint (LCP): Loading performance (target: <2.5s)
  • Interaction to Next Paint (INP): Responsiveness (target: <200ms); INP replaced First Input Delay (FID) as a Core Web Vital in March 2024
  • Cumulative Layout Shift (CLS): Visual stability (target: <0.1)

Schema.org: Teaching Machines What Your Site Is About

Schema.org markup transforms HTML into machine-readable structured data. Both traditional search engines and AI systems rely on schema to understand content context and relationships.

Essential schema types for e-commerce:

  • Organization: Business identity and structure
  • WebSite: Site-level information
  • WebPage: Page-specific metadata
  • Product: Complete product information
  • Breadcrumb: Navigation hierarchy
  • Offer: Pricing and availability
  • Review: Customer feedback and ratings
  • FAQ: Common questions and answers
  • HowTo: Step-by-step instructions

Common schema implementation mistakes include:

  • Missing required properties
  • Incorrect property types
  • Broken entity relationships
  • Outdated or stale data
  • Missing image assets referenced in schema

Regular validation through Google’s Rich Results Test ensures schema remains properly structured and complete.

Google Search Central
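As a concrete markup example, a minimal FAQ block in JSON-LD might look like this; the question and answer are invented placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is your return policy?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Unused items can be returned within 30 days for a full refund."
      }
    }
  ]
}
```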

Semantic Understanding: Beyond Keywords

Viktor repeatedly emphasized that SEO is no longer about keywords. He detailed semantic analysis tools that reveal how search engines and AI systems understand your content’s meaning rather than just the words.

These tools visualize entity relationships, topic coverage, and semantic connections within your content. AI systems use these relationships to determine when your content answers specific queries, even when exact keywords don’t appear.

While traditional SEO asked, “Does this page contain the right keywords?”, modern SEO asks, “Does this page demonstrate expertise about relevant entities and their relationships?”

“Google doesn’t care if you say you are ‘the fastest phone repair shop in Bratislava’. It asks whether your website shows expertise about the entity ‘phone’ in relation to the entities ‘repair’ and ‘Bratislava’. Do other sources, for example reviews or native ads, validate expertise about these entities? Does user behavior, such as time spent on the site, signal that the content is relevant?” - Viktor Zeman

He further underlined that you should start with quick wins, such as fixing your technical SEO issues; the results should show up quickly. Only once AI and search engines can properly read and understand your site is it time to start creating keyword-rich content.

Semantic scatterplot

Part 3: Internal Linking

Viktor challenged conventional wisdom about link building, arguing that internal link structure deserves far more attention than most sites give it.

PageRank distribution: Google (and increasingly, AI systems) flow authority through links. Your internal link structure determines which pages receive that authority.

Link juice concentration: Links from high-traffic pages carry more value than links from rarely-visited pages. Strategic internal linking amplifies the impact of your most popular content.

Context through anchor text: The words used in links signal topic relationships to both search engines and AI systems.

Placement hierarchy: Links in main content carry more weight than footer or navigation links.

The Case for Automation

On our websites, we don’t tilt at windmills trying to scale and maintain consistent manual internal links. Instead, we have implemented automated internal linking at scale. This automation considers:

  • Semantic similarity between pages (to ensure links are relevant)
  • Topic clustering and relationships (so the links are well placed)
  • Authority distribution goals (to ensure core keywords link to high-priority pages)
  • Anchor text variation
  • Link density (to avoid stuffing the content with links)

The result is a comprehensive internal link structure that would be impossible to maintain manually while ensuring every piece of content connects logically to related topics.
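The article doesn’t show the team’s actual implementation, but a minimal sketch of embedding-based link suggestion, assuming pages already have embedding vectors from a model of your choice, could look like this:

```ts
// Illustrative sketch of semantic-similarity link suggestion.
// The similarity thresholds and limits are assumptions, not a published recipe.
type Page = { url: string; title: string; embedding: number[] };

// Cosine similarity between two embedding vectors
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Suggest up to `max` related pages to link from `source`, skipping
// near-duplicates (score too high) and unrelated pages (score too low).
function suggestLinks(source: Page, pages: Page[], max = 5): Page[] {
  return pages
    .filter((p) => p.url !== source.url)
    .map((p) => ({ page: p, score: cosine(source.embedding, p.embedding) }))
    .filter(({ score }) => score > 0.6 && score < 0.95)
    .sort((a, b) => b.score - a.score)
    .slice(0, max)
    .map(({ page }) => page);
}
```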

Part 4: Content Generation - Structure Before Scale

Viktor’s approach to AI content generation focuses on systematic structure rather than ad-hoc article creation.

Understanding Current AI Behavior

Before generating content, understand how AI systems currently discuss your industry:

Step 1: Generate test prompts - Create 500+ questions representing how users might query AI systems about topics in your domain.

Step 2: Analyze AI responses - Use tools like AmICited.com to see which sources AI systems currently cite when answering these prompts.

This will reveal:

  • Your current citation frequency
  • Competitor citation patterns
  • Topics where no one is being cited (opportunities)
  • The structure and depth of successful answers

Step 3: Identify gaps - Find questions where AI systems provide weak answers or cite poor sources. These represent opportunities to become the authoritative citation.

Why Regenerate Product Descriptions

AI-optimized product descriptions benefit three critical channels:

  • Traditional SEO: Better keyword coverage and semantic richness improve traditional search rankings.
  • GEO (Generative Engine Optimization): Structured, comprehensive descriptions make products more likely to be recommended by AI systems.
  • PPC (Pay-Per-Click): AI-powered ad platforms like Performance Max and AI Max use product descriptions to optimize ad targeting and creative generation.

What makes product descriptions AI-ready:

  • Functional feature descriptions (what it does)
  • Use case explanations (how customers use it)
  • Detailed technical specifications
  • Review insights extraction (from YouTube, customer feedback)
  • Comprehensive FAQ covering questions and error messages

Post Type Specialization

Rather than generic “blog content,” Viktor advocates creating specialized AI agents to generate each distinct post type, each with defined elements and structure: for example, glossaries, checklists, how-to posts, and feature documentation, as well as reusable blog frameworks such as analyses, insights, and industry commentary.

While a general AI agent with a really good prompt might strike gold on the first try, that’s not what you need. You’re looking for scale and repeatable, accurate workflows. Writing new prompts each time, hoping they work, and saving them in a notepad won’t give you that; manually copying the same prompt to get a single output won’t scale.

What you need is a highly specialized AI agent that performs consistently and at scale. Each post type requires a dedicated AI agent configured with specific prompt templates, formatting rules, and structural requirements.

This includes clearly defining each section of the post type. For example, for the title element, Viktor recommends adding this structure to your prompt:

  • Maximum 60 characters
  • SEO-friendly keyword inclusion
  • Clear value proposition
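A prompt fragment encoding those title constraints might look like the following sketch; the {{primary_keyword}} placeholder syntax is our illustration, not a FlowHunt convention:

```
Write the post title.
- Maximum 60 characters.
- Include the primary keyword: {{primary_keyword}}.
- Lead with a clear value proposition (what the reader gains).
- Output the title only, with no quotation marks.
```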

The Content Generation Workflow

Viktor briefly outlined the exact process our team uses:

  1. Generate prompt library: Create 500+ prompts representing user queries across your domain using AmICited.com or similar tools.

  2. Analyze citation patterns: Understand current AI behavior for these prompts and find opportunities. Figure out what’s cited, what’s missing, what’s weak.

  3. Build specialized agents: Create AI agents in FlowHunt (or similar platforms) for each post type with defined elements and constraints.

  4. Generate systematically: Produce content at scale using specialized agents for each post type, maintaining consistent structure and quality.

  5. Implement semantic linking: Use semantic similarity algorithms to automatically suggest and create related article connections.

  6. Monitor and refine: Track which content gets cited by AI systems and refine your approach based on real citation data.

Real Results: HZ-Containers Case Study

HZ-Containers achieved an 18,000% traffic increase (180x) from January to September 2025, delivering 2,000 containers. The approach was built on technical SEO foundations, comprehensive content answering all questions, proper schema markup, structured post types, and automated internal linking, with no keyword stuffing or link schemes.

HZ containers traffic improvements

Part 5: The Shift from SEO to GEO

Viktor emphasized a fundamental transition happening in how people find and evaluate products online. To understand why, consider traditional search first: Google considers hundreds of ranking factors. Here’s only a partial list to illustrate the complexity:

  • Content quality signals
  • Backlink authority
  • Technical performance
  • User experience metrics
  • Mobile optimization
  • Security indicators
  • Schema markup
  • And hundreds more…

The reality is that each improvement contributes only promille-level (thousandths) gains. Achieving meaningful ranking improvements takes months of continuous optimization across dozens of factors simultaneously. Traditional SEO remains important, but it represents a grinding, incremental approach to visibility.

The GEO Alternative: Direct AI Citations

Generative Engine Optimization focuses on being cited by AI systems when users ask questions relevant to your business.

Key differences from traditional SEO:

  • Speed: AI citations can happen within days of publishing new content, not months of waiting for ranking improvements.
  • Control: You directly influence what AI systems know about you through structured content, rather than hoping algorithm changes favor your approach.
  • Comprehensiveness: AI systems reward complete answers over keyword optimization, aligning incentives with actual user value.
  • Attribution: When AI systems cite your content, users see direct source attribution, building credibility more directly than traditional search snippets.

Part 6: Making It Practical - Tools and Workflows

Here’s a quick overview of the key tools and implementation roadmap.

Essential Tools

  • AmICited.com: Track how AI platforms cite your brand across different prompts. Monitor competitors. Identify opportunities where no one is being cited.
  • Ahrefs: Technical audits, backlink analysis, competitor research, rank tracking.
  • Google Search Console: Index status, crawl errors, performance data, Core Web Vitals.
  • PageSpeed Insights & CrUX: Performance monitoring with real user data.
  • Google Rich Results Test: Schema validation and structured data verification.
  • Claude Code / AI Development Tools: Content generation automation, MCP server development, systematic content creation.
  • FlowHunt: Visual AI workflow builder for creating specialized content generation agents, implementing automated processes, and managing complex AI automation.

The Implementation Priority

Viktor recommended a specific sequence for implementation:

Phase 1: Technical Foundation (Weeks 1-4)

  • Infrastructure audit and optimization
  • Robots.txt and sitemap configuration
  • Cache header implementation
  • Core Web Vitals improvement
  • Schema.org markup for existing pages

Phase 2: Content Structure (Weeks 5-8)

  • Define post types and their elements
  • Create specialized AI agents for each type
  • Establish internal linking automation
  • Implement semantic similarity systems

Phase 3: Content Generation (Weeks 9-16)

  • Generate prompt library (500+ prompts)
  • Analyze current citation patterns
  • Begin systematic content creation
  • Monitor AI citation performance
  • Refine based on data

Phase 4: Protocol Implementation (Ongoing)

  • Implement UCP/ACP/AP2 if applicable
  • Develop custom MCP servers for integrations
  • Test AI commerce functionality
  • Expand based on adoption

The Long-Term Mindset

This isn’t a quick-win strategy. Technical SEO, comprehensive content, and AI protocol implementation require sustained investment over months.

However, the results compound. Each piece of properly structured content increases your authority. Each technical improvement enhances the effectiveness of all your content. Each citation by AI systems increases the likelihood of future citations. The question isn’t whether to invest in this infrastructure—it’s whether to lead the transition or follow later when competitors have already established authority in AI-mediated discovery.

The Bottom Line

For e-commerce technical leaders, this framework offers clarity. Start by building proper technical foundations, then implement AI commerce protocols, structure content systematically, and optimize for both traditional search and AI citations simultaneously. The infrastructure you build today determines discoverability when users ask AI systems for recommendations tomorrow.

Connecting the Framework

Viktor’s technical presentation complements the strategic and operational perspectives from earlier in the conference series.

Michal Lichner’s implementation roadmap established where to focus AI implementation and how to prepare content systematically. Zeman’s presentation provides the technical infrastructure that makes that content discoverable and functional.

Jozef Štofira’s support automation talk shows the exact set of tools we use to automate support gruntwork, from filtering and categorization to data enrichment, answer assistance, and human handoff.

Together, these three perspectives form a complete picture: strategic planning, technical infrastructure, and operational execution for e-commerce in an AI-mediated commerce environment.

Frequently asked questions

What are AI commerce protocols and why do they matter for e-commerce?

AI commerce protocols like UCP (Universal Commerce Protocol), ACP (Agentic Commerce Protocol), and AP2 (Agent Payment Protocol) standardize how AI systems interact with e-commerce platforms. They enable AI assistants to browse products, compare options, initiate checkout, and complete transactions on behalf of users, making your store accessible through AI-mediated shopping experiences.

What's the difference between SEO and GEO optimization?

SEO (Search Engine Optimization) focuses on ranking in traditional search engines like Google through keyword optimization and backlinks. GEO (Generative Engine Optimization) focuses on being cited by AI systems like ChatGPT and Perplexity through structured content, clear entity definitions, and comprehensive answers. Modern e-commerce needs both: SEO for current traffic, GEO for future AI-mediated discovery.

What technical SEO fundamentals should e-commerce sites prioritize?

Priority fundamentals include: fast, secure infrastructure with CDN; properly configured robots.txt and sitemaps; correct cache headers; regular technical audits; Core Web Vitals optimization; comprehensive schema.org markup for entities; semantic content structure; and automated internal linking. These create the foundation that both traditional search engines and AI systems need to understand and index your content properly.

How should e-commerce businesses approach AI content generation?

Start by generating 500+ prompts using tools like AmICited.com to understand how AI systems currently discuss your industry. Create specialized AI agents for each content type (glossary, how-to, checklists, product descriptions) with defined elements and formatting rules. Use semantic similarity for related article suggestions. Generate content that answers all potential visitor questions comprehensively rather than targeting narrow keywords.

Arshia is an AI Workflow Engineer at FlowHunt. With a background in computer science and a passion for AI, he specializes in creating efficient workflows that integrate AI tools into everyday tasks, enhancing productivity and creativity.

Arshia Kahani
AI Workflow Engineer

