Your Shopify store might be completely invisible to ChatGPT right now. Not because of your product data or your SEO — but because of a text file most merchants don't even know exists.
The file is robots.txt. And if it blocks AI crawlers, none of them can see any of your products. Not one.
What robots.txt does
robots.txt is a file that sits at the root of your website (yourstore.com/robots.txt). It tells web crawlers — including AI agents — what they can and can't access on your site.
When an AI agent like GPTBot (ChatGPT's crawler) or ClaudeBot visits your store, it checks robots.txt first. If the file says "Disallow: /", the agent stops right there. It won't crawl a single page.
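You can see this behavior with Python's standard-library `urllib.robotparser`. This is a sketch, not how GPTBot is actually implemented — the store URL and rules below are made up for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks GPTBot from the entire site
rules = """User-agent: GPTBot
Disallow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# GPTBot may not fetch any page, not even a product page
print(parser.can_fetch("GPTBot", "https://yourstore.com/products/example"))  # False

# A crawler with no matching rule is unaffected
print(parser.can_fetch("ClaudeBot", "https://yourstore.com/products/example"))  # True
```

Well-behaved crawlers apply exactly this logic before fetching anything else from your store.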
Shopify's defaults are fine
Shopify's default robots.txt blocks a few internal paths:
- /cart — the shopping cart page
- /checkout — the checkout flow
- /search — the search results page
- /policies/ — policy pages (see our policies guide for more on this)
These are sensible defaults. Cart and checkout pages don't need to be crawled. The important thing is that product pages, collection pages, and your homepage are not blocked by default.
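Simplified, the relevant part of Shopify's default file looks like this (the real file contains more internal paths and a sitemap line — check your own store's robots.txt for the full version):

```
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /search
Disallow: /policies/

Sitemap: https://yourstore.com/sitemap.xml
```

Note that these rules apply to all crawlers equally — there is nothing AI-specific in the defaults.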
The problem: custom blocks on AI crawlers
Some stores — or apps installed on those stores — add custom rules that block specific AI crawlers. The AI-specific user agents to watch for:
- GPTBot — ChatGPT's web crawler
- ChatGPT-User — ChatGPT browsing on behalf of users
- Google-Extended — Google's AI training crawler
- ClaudeBot — Anthropic's Claude crawler
- PerplexityBot — Perplexity AI's crawler
If your robots.txt includes rules like this:
User-agent: GPTBot
Disallow: /
User-agent: ClaudeBot
Disallow: /
Then those AI agents can't access anything on your store. Every product page, every collection, every piece of content — all invisible to them.
These rules are usually added via a robots.txt.liquid template in your Shopify theme. The Liquid template might look like this:
Liquid template (robots.txt.liquid)
{% comment %}
WARNING: This blocks AI crawlers from your entire store.
Remove these rules if you want AI agents to recommend your products.
{% endcomment %}
User-agent: GPTBot
Disallow: /
User-agent: ClaudeBot
Disallow: /
Rendered output
User-agent: GPTBot
Disallow: /
User-agent: ClaudeBot
Disallow: /
The template and the output look nearly identical for robots.txt — the key difference is that the Liquid file is where you make changes. The rendered output is what crawlers actually see at yourstore.com/robots.txt.
How to check right now
- Open your browser and go to yourstore.com/robots.txt
- Look for lines that mention the AI user agents listed above
- If you see Disallow: / under any of those user agents, that AI agent is blocked from your entire store
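The manual check above can also be scripted. Here's a sketch using Python's standard library — the agent list comes from this article, and `blocked_agents` is a hypothetical helper, not a Shopify or crawler API:

```python
from urllib import robotparser

# AI user agents discussed in this article
AI_AGENTS = ["GPTBot", "ChatGPT-User", "Google-Extended", "ClaudeBot", "PerplexityBot"]

def blocked_agents(robots_txt: str,
                   probe_url: str = "https://yourstore.com/products/example"):
    """Return the AI user agents that may not fetch probe_url under these rules."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [agent for agent in AI_AGENTS if not parser.can_fetch(agent, probe_url)]

# Example: a robots.txt with one AI-specific block on top of a default rule
sample = """User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /cart
"""
print(blocked_agents(sample))  # ['GPTBot']
```

To run this against a live store, fetch yourstore.com/robots.txt (for example with urllib.request) and pass the response body to `blocked_agents`.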
If you only see Shopify's default rules (blocking /cart, /checkout, /search, /policies/), you're fine. The problem is when there are additional rules targeting AI-specific crawlers.
How to fix it
If you find AI crawler blocks in your robots.txt, here's how to remove them:
- Go to Online Store > Themes in your Shopify admin
- Click Actions > Edit code (or the three-dot menu, then "Edit code")
- In the Templates section, look for robots.txt.liquid
- If it exists, open it and look for the Disallow rules under AI user agents
- Remove the rules that block AI crawlers, or change Disallow: / to Allow: /
- Save the file
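If you'd rather start clean, you can replace the file's contents with a template that just renders Shopify's defaults. This mirrors the default robots.txt.liquid that Shopify documents — verify the object names (robots.default_groups and friends) against Shopify's current docs before relying on it:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

With this in place, your store serves only Shopify's standard rules and no AI-specific blocks.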
If you don't see a robots.txt.liquid file, the blocks might be coming from a Shopify app. Check your installed apps — some SEO or security apps add robots.txt rules. Look at the app settings for any crawler blocking options.
After making changes
Visit yourstore.com/robots.txt again to verify the AI crawler blocks are gone. The changes take effect immediately — there's no caching delay for robots.txt on Shopify.
Keep in mind that even after removing the blocks, AI crawlers need time to re-crawl your store. The change is instant on your end, but it may take days or weeks for AI agents to fully index your products.
Should you block any AI crawlers?
That depends on your goals. Some store owners have concerns about AI training data or content reuse. That's a valid position.
But if your goal is to have AI agents recommend your products to customers — and especially if you want to participate in Shopify's Agentic Storefronts — blocking AI crawlers works against you. You can't be recommended by agents that can't see your store.
It's worth noting that Google-Extended controls only AI training, not search indexing: blocking it keeps your content out of Google's AI training without affecting your Google Search rankings. The others (GPTBot, ClaudeBot, PerplexityBot) feed the crawling that lets those assistants discover and surface your products, so blocking them affects discovery, not just training.