AI agents check your store policies before they recommend your products. Terms of service, refund policy, shipping policy, privacy policy — these are trust signals. If they're missing, AI agents may skip your store entirely.
Shopify's Agentic Storefronts — the feature that lets AI agents browse and buy from your store on behalf of customers — require published policies. No policies, no eligibility.
The four required policies
Every Shopify store should have these four:
- Terms of Service, at /policies/terms-of-service
- Refund Policy, at /policies/refund-policy
- Shipping Policy, at /policies/shipping-policy
- Privacy Policy, at /policies/privacy-policy
AI agents check whether each of these URLs returns a real page. If a policy page is missing or redirects to your homepage, it counts as missing.
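You can run the same check yourself. Here's a minimal Python sketch (yourstore.com is a placeholder for your own domain; the four paths are Shopify's standard policy URLs). It treats a redirect, such as a bounce to the homepage, the same as a missing page:

```python
import urllib.error
import urllib.request
from urllib.parse import urljoin

# Shopify's four standard policy paths
POLICY_PATHS = [
    "/policies/terms-of-service",
    "/policies/refund-policy",
    "/policies/shipping-policy",
    "/policies/privacy-policy",
]

def policy_urls(store_url):
    """Build the four standard policy URLs for a store."""
    return [urljoin(store_url, path) for path in POLICY_PATHS]

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Returning None makes urllib raise HTTPError on any redirect,
    so a policy page that bounces to the homepage fails the check."""
    def redirect_request(self, *args, **kwargs):
        return None

def policy_is_published(url):
    """True only if the URL answers 200 directly, with no redirect."""
    opener = urllib.request.build_opener(_NoRedirect())
    request = urllib.request.Request(url, method="HEAD")
    try:
        with opener.open(request, timeout=10) as resp:
            return resp.status == 200
    except (urllib.error.HTTPError, urllib.error.URLError):
        return False

# for url in policy_urls("https://yourstore.com"):  # use your real domain
#     print(url, policy_is_published(url))
```

The function and helper names here are illustrative, not part of any Shopify tooling.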
How to check and publish policies
- Go to Settings > Policies in your Shopify admin
- You'll see fields for each policy type
- If a field is empty, you need to add content. Shopify offers auto-generated templates — click Create from template for a starting point
- Review and customize the template for your store
- Click Save
Once saved, Shopify automatically publishes each policy at its standard URL. You can verify by visiting yourstore.com/policies/refund-policy (replace with your actual domain).
This takes about 5 minutes. If you use the templates as a starting point, most of the content is already there — just customize the details for your business.
The catch: robots.txt blocks policy pages
Here's the part most stores miss. Even after you publish all four policies, AI agents might not be able to read them.
Shopify's default robots.txt includes this rule:
```
Disallow: /policies/
```
This tells all crawlers — including AI agents — to stay away from your policy pages. The policies exist, but AI agents are told not to look at them.
This is a Shopify default. It's not something you configured. And it affects every Shopify store unless you override it.
How to fix the robots.txt block
You can override Shopify's default robots.txt by creating a robots.txt.liquid template in your theme:
- Go to Online Store > Themes in your Shopify admin
- Click Actions > Edit code (or "Edit code" in the three-dot menu)
- In the Templates section, look for robots.txt.liquid
- If it doesn't exist, click Add a new template and select robots.txt
- Add Allow rules so AI-specific crawlers can access your policy pages
Here's what to add to your robots.txt.liquid template. This preserves Shopify's defaults and adds Allow rules so AI crawlers can read your policy pages:
Liquid template (robots.txt.liquid)

```liquid
{% comment %}
  Render Shopify's default robots.txt rules first (via the robots
  object), then append Allow rules so AI crawlers can read the
  policy pages.
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}

User-agent: GPTBot
Allow: /policies/

User-agent: ChatGPT-User
Allow: /policies/

User-agent: ClaudeBot
Allow: /policies/

User-agent: PerplexityBot
Allow: /policies/
```
Rendered output
After Shopify renders the template, your robots.txt will include these rules alongside the defaults:
```
User-agent: GPTBot
Allow: /policies/

User-agent: ChatGPT-User
Allow: /policies/

User-agent: ClaudeBot
Allow: /policies/

User-agent: PerplexityBot
Allow: /policies/

Sitemap: https://yourstore.com/sitemap.xml
```
This tells those AI agents that they can read your policy pages, even though the default rule blocks /policies/.
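You can see this precedence in action with Python's standard-library robots.txt parser, which applies the same group-matching rule: a crawler with its own User-agent group uses that group instead of the catch-all. The snippet below is a simplified sketch, with a trimmed robots.txt and a made-up "RandomCrawler" name standing in for any bot without its own group:

```python
from urllib.robotparser import RobotFileParser

# A trimmed robots.txt: the default block plus two per-agent
# Allow groups, mirroring the template's rendered output.
ROBOTS_TXT = """\
User-agent: *
Disallow: /policies/

User-agent: GPTBot
Allow: /policies/

User-agent: ClaudeBot
Allow: /policies/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A named agent matches its own group, so Allow wins for it;
# everyone else falls back to the * group and stays blocked.
print(rp.can_fetch("GPTBot", "/policies/refund-policy"))        # True
print(rp.can_fetch("ClaudeBot", "/policies/refund-policy"))     # True
print(rp.can_fetch("RandomCrawler", "/policies/refund-policy")) # False
```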
How to verify
- Visit yourstore.com/robots.txt in your browser
- Search for /policies/
- If you see Disallow: /policies/ with no corresponding Allow rules for AI agents, the policies are still blocked
- After making changes, check again to confirm the Allow rules are in place
For more details on robots.txt and AI crawlers, see our robots.txt guide.