
GUIDE № 06 · POLICIES

Why Shopify Hides Your Store Policies from AI (and How to Unhide Them)

4 min read · Guide 6 of 17

AI agents check your store policies before they recommend your products. Terms of service, refund policy, shipping policy, privacy policy — these are trust signals. If they're missing, AI agents may skip your store entirely.

Shopify's Agentic Storefronts — the feature that lets AI agents browse and buy from your store on behalf of customers — require published policies. No policies, no eligibility.

The four required policies

Every Shopify store should have these four:

  1. Terms of Service — at /policies/terms-of-service
  2. Refund Policy — at /policies/refund-policy
  3. Shipping Policy — at /policies/shipping-policy
  4. Privacy Policy — at /policies/privacy-policy

AI agents check whether each of these URLs returns a real page. If a policy URL returns a 404 or redirects to your homepage, AI agents treat that policy as missing.
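The check those agents perform can be approximated with a short script. This is a minimal sketch using only Python's standard library; the `policy_status` helper, the `check_store` function, and the `yourstore.com` placeholder are illustrative, not part of any Shopify or AI-agent API:

```python
import urllib.parse
import urllib.request

# The four standard Shopify policy paths.
POLICY_PATHS = [
    "/policies/terms-of-service",
    "/policies/refund-policy",
    "/policies/shipping-policy",
    "/policies/privacy-policy",
]

def policy_status(status_code, final_path, expected_path):
    """Classify a fetched policy page the way an AI agent might:
    it must return 200 and must not have redirected elsewhere."""
    if status_code != 200:
        return "missing"
    if final_path.rstrip("/") != expected_path.rstrip("/"):
        return "missing"  # e.g. a redirect back to the homepage
    return "present"

def check_store(base_url):
    """Fetch each policy URL and report present/missing.
    urllib follows redirects, so resp.geturl() is the final URL."""
    results = {}
    for path in POLICY_PATHS:
        url = base_url.rstrip("/") + path
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                final_path = urllib.parse.urlparse(resp.geturl()).path
                results[path] = policy_status(resp.status, final_path, path)
        except Exception:
            results[path] = "missing"
    return results

if __name__ == "__main__":
    for path, status in check_store("https://yourstore.com").items():
        print(f"{status:8} {path}")
```

Run it against your own domain; any line reporting `missing` points at a policy to publish in the next step.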

How to check and publish policies

  1. Go to Settings > Policies in your Shopify admin
  2. You'll see fields for each policy type
  3. If a field is empty, you need to add content. Shopify offers auto-generated templates — click Create from template for a starting point
  4. Review and customize the template for your store
  5. Click Save

Once saved, Shopify automatically publishes each policy at its standard URL. You can verify by visiting yourstore.com/policies/refund-policy (replace with your actual domain).

This takes about 5 minutes. If you use the templates as a starting point, most of the content is already there — just customize the details for your business.

The catch: robots.txt blocks policy pages

Here's the part most stores miss. Even after you publish all four policies, AI agents might not be able to read them.

Shopify's default robots.txt includes this rule:

Disallow: /policies/

This tells all crawlers — including AI agents — to stay away from your policy pages. The policies exist, but AI agents are told not to look at them.

This is a Shopify default. It's not something you configured. And it affects every Shopify store unless you override it.
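You can reproduce the effect with Python's standard-library robots.txt parser. The fragment below is a simplified stand-in for Shopify's default file (your store's real robots.txt contains many more rules); the point is that with only a catch-all group, an AI crawler like GPTBot is blocked from `/policies/`:

```python
from urllib.robotparser import RobotFileParser

# Simplified fragment resembling Shopify's default robots.txt:
# the catch-all "*" group disallows /policies/ for every crawler.
DEFAULT_ROBOTS = """\
User-agent: *
Disallow: /policies/
Disallow: /checkout
"""

parser = RobotFileParser()
parser.parse(DEFAULT_ROBOTS.splitlines())

# No group names GPTBot, so it falls under "*" and is blocked.
allowed = parser.can_fetch("GPTBot", "https://yourstore.com/policies/refund-policy")
print(allowed)  # False
```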

How to fix the robots.txt block

You can override Shopify's default robots.txt by creating a robots.txt.liquid template in your theme:

  1. Go to Online Store > Themes in your Shopify admin
  2. Click Actions > Edit code (or "Edit code" in the three-dot menu)
  3. In the Templates section, look for robots.txt.liquid
  4. If it doesn't exist, click Add a new template, select robots.txt
  5. Add Allow rules for AI-specific crawlers to access your policy pages

Here's what to add to your robots.txt.liquid template. This preserves Shopify's defaults and adds Allow rules so AI crawlers can read your policy pages. The User-agent list below covers the nine agents StoreAudit checks for in its robots.txt sub-score: GPTBot, ChatGPT-User, Google-Extended, ClaudeBot, PerplexityBot, CCBot, anthropic-ai, Amazonbot, and OAI-SearchBot:

Liquid template (robots.txt.liquid)

{% comment %}
  Allow AI crawlers to access policy pages.
  A crawler follows the most specific User-agent group that
  matches it, so these named groups override the catch-all
  "*" group emitted by the default loop below.
{% endcomment %}
User-agent: GPTBot
Allow: /policies/

User-agent: ChatGPT-User
Allow: /policies/

User-agent: Google-Extended
Allow: /policies/

User-agent: ClaudeBot
Allow: /policies/

User-agent: PerplexityBot
Allow: /policies/

User-agent: CCBot
Allow: /policies/

User-agent: anthropic-ai
Allow: /policies/

User-agent: Amazonbot
Allow: /policies/

User-agent: OAI-SearchBot
Allow: /policies/

{% comment %}
  Shopify default robots.txt rules — emit these unchanged.
  See: https://shopify.dev/docs/storefronts/themes/architecture/templates/robots-txt-liquid
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}

Rendered output

After Shopify renders the template, your robots.txt will include these rules alongside the defaults:

User-agent: GPTBot
Allow: /policies/

User-agent: ChatGPT-User
Allow: /policies/

User-agent: Google-Extended
Allow: /policies/

User-agent: ClaudeBot
Allow: /policies/

User-agent: PerplexityBot
Allow: /policies/

User-agent: CCBot
Allow: /policies/

User-agent: anthropic-ai
Allow: /policies/

User-agent: Amazonbot
Allow: /policies/

User-agent: OAI-SearchBot
Allow: /policies/

Sitemap: https://yourstore.com/sitemap.xml

These are the same AI user-agents StoreAudit checks in its robots.txt sub-score. Because a crawler follows the most specific User-agent group that matches it, each of these agents obeys its own Allow rule rather than the Disallow: /policies/ in the catch-all group. All other crawlers remain blocked by the default rule.
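The same standard-library parser from before confirms the difference. This sketch feeds it a trimmed version of the rendered file (two named groups plus the catch-all; the full file lists all nine agents) and checks access per user-agent:

```python
from urllib.robotparser import RobotFileParser

# Trimmed version of the rendered robots.txt: named AI-crawler
# groups with Allow rules, plus the default catch-all group.
MERGED_ROBOTS = """\
User-agent: GPTBot
Allow: /policies/

User-agent: ClaudeBot
Allow: /policies/

User-agent: *
Disallow: /policies/
"""

parser = RobotFileParser()
parser.parse(MERGED_ROBOTS.splitlines())

url = "https://yourstore.com/policies/refund-policy"
for agent in ("GPTBot", "ClaudeBot", "SomeOtherBot"):
    # Named agents match their own Allow group; everything
    # else falls back to the "*" group and stays blocked.
    print(agent, parser.can_fetch(agent, url))
```

GPTBot and ClaudeBot come back allowed; the unnamed crawler does not.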

How to verify

  1. Visit yourstore.com/robots.txt in your browser
  2. Search for /policies/
  3. If you see Disallow: /policies/ with no corresponding Allow rules for AI agents, the policies are still blocked
  4. After making changes, check again to confirm the Allow rules are in place

For more details on robots.txt and AI crawlers, see our robots.txt guide.