
Where and How to Begin with AI in E-commerce: A Practical Roadmap

A technical founder’s guide to implementing AI commerce protocols (UCP, ACP, AP2), mastering technical SEO fundamentals, and generating content optimized for both traditional search and AI citations.
Viktor Zeman co-founded Quality Unit over two decades ago and has led the development and global growth of the product suite, including FlowHunt. Since 2024, he’s focused specifically on FlowHunt and helping companies implement practical AI solutions, automation, and modern AI-driven work environments. His E-commerce Mastermind presentation dove deep into three critical areas of technical implementation of AI in e-commerce.
The presentation detailed specific protocols, technical implementations, and content strategies tested across Quality Unit’s products and customer base. What follows is Viktor’s technical roadmap for making e-commerce sites discoverable through AI systems, functional within AI-mediated commerce, and competitive as search shifts from keywords to AI citations.

Viktor laid the groundwork by introducing the standardized protocols that enable AI to interact with e-commerce systems on behalf of users.
Major platforms such as Shopify, Salesforce Commerce Cloud, and BigCommerce have already begun implementing parts of these protocols, as have the payment processors Stripe, PayPal, and Checkout.com. This clearly shows that AI commerce compatibility is becoming a competitive baseline.
Universal Commerce Protocol (UCP) enables AI assistants to discover products, compare options, initiate checkout, and complete transactions without users leaving the AI interface. UCP is already implemented in Shopify, Salesforce Commerce Cloud, BigCommerce, and major payment processors.
Agentic Commerce Protocol (ACP), a collaboration between Stripe and OpenAI, focuses specifically on transaction security and simplicity within conversational interfaces. While UCP addresses the broader shopping cycle, ACP specializes in the purchase process, enabling checkout directly within chat interfaces.
Agent Payment Protocol (AP2), developed by Google, provides the security framework making AI-mediated transactions trustworthy through transaction signing, merchant authentication, and payment authorization. AP2 integrates with UCP to provide the trust layer that makes autonomous AI purchasing a reality.

For your e-shop to become compatible with AI-driven shopping, and thus recommended by AI platforms, you must expose machine-readable data at multiple layers:
Products must be described using standards like:
schema.org: Give AI structured data it can read and understand. This allows AI systems to interpret products without ambiguity, including variants, pricing, availability, and shipping constraints. It’s the foundation AI systems use to understand what you sell.
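As a minimal sketch, a JSON-LD Product block embedded in a product page might look like the following (the product, prices, SKU, and shipping region are purely illustrative values):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Running Shoe",
  "sku": "EX-TRS-042",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "offers": {
    "@type": "Offer",
    "price": "89.90",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock",
    "shippingDetails": {
      "@type": "OfferShippingDetails",
      "shippingDestination": { "@type": "DefinedRegion", "addressCountry": "SK" }
    }
  }
}
```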
AI agents don’t just evaluate products; they also evaluate whether the merchant is trustworthy and a good fit for the user. That’s why key information about your business must be explicit and accessible:
Much of this data already exists in systems like Google Merchant Center, but it needs to be complete, accurate, and consistently maintained.
One of the less visible but critical components is the commerce manifest. It’s typically a JSON file hosted on the merchant’s domain.
This manifest defines supported protocol versions, available services, payment handlers, and checkout capabilities, helping AI Agents understand how your store works.
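The exact manifest schema depends on the protocol version your platform supports. Purely as a hypothetical illustration of the kind of information such a file carries (not the official UCP format), it might look like this:

```json
{
  "protocol_versions": { "ucp": "1.0", "ap2": "1.0" },
  "services": ["product_search", "price_check", "checkout", "order_status"],
  "payment_handlers": ["stripe", "paypal"],
  "checkout": {
    "guest_checkout": true,
    "supported_currencies": ["EUR", "USD"],
    "ships_to": ["SK", "CZ", "DE"]
  }
}
```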
Implement three critical endpoints:
Implement the aforementioned Agent Payment Protocol for secure transaction handling.
For platforms without native UCP support, the Model Context Protocol (MCP) provides an integration path. Zeman emphasized MCP’s growing importance as the connection layer between AI agents and existing systems.
Developing a custom MCP server enables you to create precise prompts tailored to your specific use cases and send isolated API calls with proper rate limiting. This way, you can rest assured your AI implementation will be secure, controlled, and as cost-efficient as possible.
Example: A chatbot integrated with an e-commerce MCP and a shipping provider (e.g. Chameleon) enables customers not only to check order status but also to track delivery in real time, all within a single conversation.
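As a minimal sketch of what such a server can look like, using the official MCP Python SDK (the tool name and returned data are hypothetical, and the shipping-provider integration is omitted):

```python
# Minimal MCP server sketch using the official Python SDK (pip install mcp).
# The tool name, fields, and returned data are hypothetical; a real server would
# call your e-commerce and shipping APIs with authentication and rate limiting.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("eshop-orders")

@mcp.tool()
def get_order_status(order_id: str) -> dict:
    """Return the current status and tracking info for an order."""
    # Placeholder response standing in for a real API lookup
    return {"order_id": order_id, "status": "shipped", "tracking_url": "https://example.com/track/123"}

if __name__ == "__main__":
    mcp.run()  # exposes the tool to AI agents over stdio
```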
The second implementation topic Viktor covered was technical SEO. He took great care to emphasize that the cornerstone of SEO isn’t keywords; it’s “an infrastructure both search engines and AI systems can access and trust,” because slow and unreliable sites get abandoned by both users and crawlers.
Despite being a 30-year-old standard, robots.txt remains frequently misconfigured. The common issues include:
Non-existent robots.txt: Some sites return error pages or maintenance messages instead of proper robots.txt files, confusing crawlers about what’s allowed.
Blocked AI bots: Blocking AI crawlers prevents your content from being cited in AI responses. While you may want to block some bots, blanket blocking eliminates AI visibility.
Missing sitemap directives: Robots.txt should reference your XML sitemaps, guiding crawlers to complete content discovery.
Syntax errors: Trailing commas in wildcards (Disallow: /?pv=*,) cause parsing failures in some crawlers, creating unintended blocking.
Blocked valuable content: Sometimes sites block content they actually want indexed, usually through overly broad wildcard rules.
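A minimal robots.txt that avoids these issues might look like the following (the bot names, paths, and sitemap URL are examples to adjust to your own policy):

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
# no trailing comma in wildcard rules
Disallow: /?pv=*

# explicitly allow the AI crawlers you want citing your content
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```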

XML sitemaps tell search engines and AI systems what content exists and how it’s organized. Common problems in this area include:
Cache header issues: Incorrect cache headers can prevent sitemaps from updating properly, leaving crawlers with stale content lists.
Incomplete URL coverage: Plugin-generated sitemaps often miss custom post types, taxonomies, or dynamic pages, leaving significant content undiscovered.
Rate limiting problems: Some sites implement aggressive rate limiting that blocks sitemap fetching entirely, returning 429 errors after just 10 URLs.
Links to 404 pages: Sitemaps containing dead links waste crawler budget and signal poor site maintenance.
Proper HTTP cache headers dramatically improve performance for repeat visitors and reduce server load, yet many sites misconfigure them entirely. Cache-Control directives matter.
Viktor emphasized checking cache headers for all asset types: HTML, CSS, JavaScript, images, fonts, and API responses.
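As an illustration (the max-age values below are assumptions to tune to your own release cycle, not recommendations from the talk), sensible Cache-Control policies per asset type might look like:

```
HTML pages:            Cache-Control: no-cache
Versioned CSS/JS:      Cache-Control: public, max-age=31536000, immutable
Images and fonts:      Cache-Control: public, max-age=2592000
API responses:         Cache-Control: private, no-store
XML sitemaps:          Cache-Control: public, max-age=3600
```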
Technical problems emerge constantly, which is why our team runs automated daily audits. Using tools like Ahrefs for automated monitoring ensures problems are caught and fixed before they accumulate into major traffic losses.
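A stripped-down sketch of such an audit is shown below; the domain is a placeholder and only two checks appear (robots.txt sanity and sitemap URL health), whereas a real audit covers far more:

```python
# Sketch of a daily technical audit; the domain is a placeholder and only two
# checks are shown (robots.txt sanity and sitemap URL health).
import requests
import xml.etree.ElementTree as ET

SITE = "https://www.example.com"

def audit() -> list[str]:
    issues = []

    # robots.txt should exist and be a plain-text file, not an HTML error page
    robots = requests.get(f"{SITE}/robots.txt", timeout=10)
    if robots.status_code != 200 or "<html" in robots.text.lower():
        issues.append("robots.txt missing or returning an HTML page")

    # every sitemap URL should resolve without 4xx/5xx (including 429 rate limits)
    sitemap = requests.get(f"{SITE}/sitemap.xml", timeout=10)
    root = ET.fromstring(sitemap.content)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    for loc in root.findall(".//sm:loc", ns):
        resp = requests.head(loc.text, timeout=10, allow_redirects=True)
        if resp.status_code >= 400:
            issues.append(f"{loc.text} -> HTTP {resp.status_code}")

    return issues

if __name__ == "__main__":
    for issue in audit():
        print(issue)
```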

Google’s Core Web Vitals directly influence rankings. You should focus mainly on these two critical tools: PageSpeed Insights provides lab data showing potential performance under controlled conditions. Chrome User Experience Report (CrUX) provides real-world data from actual users visiting your site.
The three critical metrics to watch are Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS).
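For example, real-user (CrUX) field data can be pulled programmatically from the PageSpeed Insights API; in this sketch the page URL is a placeholder, and for regular automated use you would add an API key via the "key" parameter:

```python
# Sketch: pull real-user (CrUX) field data from the PageSpeed Insights API.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def field_metrics(url: str) -> dict:
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": "mobile"}, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    # "loadingExperience" holds CrUX field data (present only when Chrome has enough samples);
    # lab data lives under "lighthouseResult"
    return data.get("loadingExperience", {}).get("metrics", {})

if __name__ == "__main__":
    for metric, values in field_metrics("https://www.example.com/").items():
        print(metric, values.get("category"), values.get("percentile"))
```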
Schema.org markup transforms HTML into machine-readable structured data. Both traditional search engines and AI systems rely on schema to understand content context and relationships.
Essential schema types for e-commerce:
Common schema implementation mistakes include:
Regular validation through Google’s Rich Results Test ensures schema remains properly structured and complete.

Viktor repeatedly emphasized that SEO is no longer about keywords. He detailed semantic analysis tools that reveal how search engines and AI systems understand your content’s meaning rather than just the words.
These tools visualize entity relationships, topic coverage, and semantic connections within your content. AI systems use these relationships to determine when your content answers specific queries, even when exact keywords don’t appear.
While traditional SEO asked: “Does this page contain the right keywords?” Modern SEO asks: “Does this page demonstrate expertise about relevant entities and their relationships?”
“Google doesn’t care if you say you are ‘The fastest phone repair shop in Bratislava’. It asks whether your website shows expertise about the entity of ‘phone’ in relation to the entities of ‘repair’ and ‘Bratislava’. Do other sources validate expertise about these entities? For example, reviews or native ads. Does user behavior, such as time spent on the site, signal that the content is relevant?” - Viktor Zeman
He further underlined that you should start with quick wins, such as fixing some of your technical SEO issues; the results should show up in no time. Only once AI and search engines can properly read and understand your site will it be time to start creating keyword-rich content.

Viktor challenged conventional wisdom about link building, arguing that internal link structure deserves far more attention than most sites give it.
PageRank distribution: Both Google and, increasingly, AI systems flow authority through links. Your internal link structure determines which pages receive that authority.
Link juice concentration: Links from high-traffic pages carry more value than links from rarely-visited pages. Strategic internal linking amplifies the impact of your most popular content.
Context through anchor text: The words used in links signal topic relationships to both search engines and AI systems.
Placement hierarchy: Links in main content carry more weight than footer or navigation links.
On our websites, we don’t tilt at windmills trying to scale and maintain consistent manual internal links. Instead, we have implemented automated internal linking at scale. This automation considers:
The result is a comprehensive internal link structure that would be impossible to maintain manually while ensuring every piece of content connects logically to related topics.
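This isn’t Viktor’s exact system, but as a sketch of the underlying idea, embedding-based similarity between articles can drive automated link suggestions; the model choice, sample pages, and the 0.35 threshold below are illustrative assumptions:

```python
# Sketch only: model choice, example pages, and the 0.35 threshold are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# URL -> page text (in practice, pulled from your CMS or a crawl)
articles = {
    "/blog/schema-org-for-products": "Schema.org markup gives AI structured data about products...",
    "/blog/robots-txt-and-ai-crawlers": "How to configure robots.txt so AI bots can find and cite your content...",
    "/blog/core-web-vitals-for-shops": "Improving LCP, INP and CLS on e-commerce category and product pages...",
}

urls = list(articles.keys())
embeddings = model.encode(list(articles.values()), convert_to_tensor=True)
similarity = util.cos_sim(embeddings, embeddings)

# Suggest internal links between semantically related pages
for i, source in enumerate(urls):
    for j, target in enumerate(urls):
        score = similarity[i][j].item()
        if i != j and score > 0.35:
            print(f"Suggest link: {source} -> {target} (similarity {score:.2f})")
```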
Viktor’s approach to AI content generation focuses on systematic structure rather than ad-hoc article creation.
Before generating content, understand how AI systems currently discuss your industry:
Step 1: Generate test prompts - Create 500+ questions representing how users might query AI systems about topics in your domain.
Step 2: Analyze AI responses - Use tools like AmICited.com to see which sources AI systems currently cite when answering these prompts.
This will reveal:
Step 3: Identify gaps - Find questions where AI systems provide weak answers or cite poor sources. These represent opportunities to become the authoritative citation.
AI-optimized product descriptions benefit three critical channels:
Rather than generic “blog content,” Viktor advocates creating specialized AI agents to generate each distinct post type, each with defined elements and structure. For example: glossaries, checklists, how-to posts, and feature documentation, but also reusable blog frameworks such as analyses, insights, and industry commentary.
While a general AI Agent with a really good prompt might strike gold on the first try, that’s not what you need. You’re looking for scale and repeatable, accurate workflows. Writing new prompts for agents each time, hoping they’ll work, and then saving them in a notepad won’t give you that. Manually copying the same prompt to get a single output won’t scale.
What you need is a highly specialized AI Agent that performs flawlessly, consistently, and at scale. Each post type requires a dedicated AI agent configured with specific prompt templates, formatting rules, and structural requirements.
This includes clearly defining each section of the post type. For example, for the title element Viktor recommends adding this structure to your prompt:
Viktor briefly outlined the exact process our team uses:
Generate prompt library: Create 500+ prompts representing user queries across your domain using AmICited.com or similar tools.
Analyze citation patterns: Understand current AI behavior for these prompts and find opportunities. Figure out what’s cited, what’s missing, what’s weak.
Build specialized agents: Create AI agents in FlowHunt (or similar platforms) for each post type with defined elements and constraints.
Generate systematically: Produce content at scale using specialized agents for each post type, maintaining consistent structure and quality.
Implement semantic linking: Use semantic similarity algorithms to automatically suggest and create related article connections.
Monitor and refine: Track which content gets cited by AI systems and refine your approach based on real citation data.
The result: an 18,000% (180x) traffic increase from January to September 2025, with 2,000 containers delivered. The approach is built on technical SEO foundations, comprehensive content answering all questions, proper schema markup, structured post types, and automated internal linking. No keyword stuffing or link schemes.

Viktor emphasized a fundamental transition happening in how people find and evaluate products online. Google considers hundreds of ranking factors. Here’s only a partial list to illustrate the complexity:
The reality is that each improvement contributes only fractions of a percent (per mille) of gain. Achieving meaningful ranking improvements takes months of continuous optimization across dozens of factors simultaneously. Traditional SEO remains important but represents a grinding, incremental approach to visibility.
Generative Engine Optimization focuses on being cited by AI systems when users ask questions relevant to your business.
Key differences from traditional SEO:
Here’s a quick overview of the key tools and implementation roadmap.
Viktor recommended a specific sequence for implementation:
Phase 1: Technical Foundation (Weeks 1-4)
• Infrastructure audit and optimization
• Robots.txt and sitemap configuration
• Cache header implementation
• Core Web Vitals improvement
• Schema.org markup for existing pages
Phase 2: Content Structure (Weeks 5-8)
• Define post types and their elements
• Create specialized AI agents for each type
• Establish internal linking automation
• Implement semantic similarity systems
Phase 3: Content Generation (Weeks 9-16)
• Generate prompt library (500+ prompts)
• Analyze current citation patterns
• Begin systematic content creation
• Monitor AI citation performance
• Refine based on data
Phase 4: Protocol Implementation (Ongoing)
• Implement UCP/ACP/AP2 if applicable
• Develop custom MCP servers for integrations
• Test AI commerce functionality
• Expand based on adoption
This isn’t a quick-win strategy. Technical SEO, comprehensive content, and AI protocol implementation require sustained investment over months.
However, the results compound. Each piece of properly structured content increases your authority. Each technical improvement enhances the effectiveness of all your content. Each citation by AI systems increases the likelihood of future citations. The question isn’t whether to invest in this infrastructure—it’s whether to lead the transition or follow later when competitors have already established authority in AI-mediated discovery.
For e-commerce technical leaders, this framework offers clarity. You start with proper technical foundations, then implement AI commerce protocols, structure content systematically, and optimize for both traditional search and AI citations simultaneously. The infrastructure you build today determines discoverability when users ask AI systems for recommendations tomorrow.
Viktor’s technical presentation complements the strategic and operational perspectives from earlier in the conference series.
Michal Lichner’s implementation roadmap established where to focus AI implementation and how to prepare content systematically. Zeman’s presentation provides the technical infrastructure that makes that content discoverable and functional.
Jozef Štofira’s support automation talk shows the exact set of tools we use to automate support gruntwork, from filtering and categorization to data enrichment, answer assistance, and human handoff.
Together, these three perspectives form a complete picture: strategic planning, technical infrastructure, and operational execution for e-commerce in an AI-mediated commerce environment.
AI commerce protocols like UCP (Universal Commerce Protocol), ACP (Agentic Commerce Protocol), and AP2 (Agent Payment Protocol) standardize how AI systems interact with e-commerce platforms. They enable AI assistants to browse products, compare options, initiate checkout, and complete transactions on behalf of users, making your store accessible through AI-mediated shopping experiences.
SEO (Search Engine Optimization) focuses on ranking in traditional search engines like Google through keyword optimization and backlinks. GEO (Generative Engine Optimization) focuses on being cited by AI systems like ChatGPT and Perplexity through structured content, clear entity definitions, and comprehensive answers. Modern e-commerce needs both: SEO for current traffic, GEO for future AI-mediated discovery.
Priority fundamentals include: fast, secure infrastructure with CDN; properly configured robots.txt and sitemaps; correct cache headers; regular technical audits; Core Web Vitals optimization; comprehensive schema.org markup for entities; semantic content structure; and automated internal linking. These create the foundation that both traditional search engines and AI systems need to understand and index your content properly.
Start by generating 500+ prompts using tools like AmICited.com to understand how AI systems currently discuss your industry. Create specialized AI agents for each content type (glossary, how-to, checklists, product descriptions) with defined elements and formatting rules. Use semantic similarity for related article suggestions. Generate content that answers all potential visitor questions comprehensively rather than targeting narrow keywords.
Arshia is an AI Workflow Engineer at FlowHunt. With a background in computer science and a passion for AI, he specializes in creating efficient workflows that integrate AI tools into everyday tasks, enhancing productivity and creativity.



