AI crawlers are the bots that index content for AI platforms. Proper configuration ensures your content can be discovered and used by ChatGPT, Perplexity, Claude, and other AI systems.
Major AI Crawlers
GPTBot (OpenAI/ChatGPT)
User-agent: GPTBot
Crawls content used to train OpenAI's models; OpenAI uses separate agents (ChatGPT-User, OAI-SearchBot) for live browsing and search. Allowing GPTBot access increases the likelihood your content informs ChatGPT responses.
PerplexityBot (Perplexity)
User-agent: PerplexityBot
Indexes content for Perplexity's search-first AI. Critical for visibility in Perplexity's citation-heavy responses.
ClaudeBot (Anthropic/Claude)
User-agents: ClaudeBot, anthropic-ai
Crawls for Anthropic's Claude AI. Important for visibility in Claude's responses and enterprise deployments.
Google-Extended (Google AI)
User-agent: Google-Extended
Google's control token for AI training (separate from Googlebot for search). It governs whether your content can be used to train Gemini models; AI Overviews follow standard Googlebot indexing rules instead.
Robots.txt Configuration
Allow All AI Crawlers
To allow all AI crawlers access to your content:
User-agent: GPTBot
Allow: /
User-agent: PerplexityBot
Allow: /
User-agent: ClaudeBot
Allow: /
User-agent: anthropic-ai
Allow: /
User-agent: Google-Extended
Allow: /
Selective Access
Allow AI crawlers but block specific directories:
User-agent: GPTBot
Allow: /
Disallow: /private/
Disallow: /internal/
Repeat the group for each AI crawler user-agent you want to allow.
Block AI Crawlers
If you need to block AI crawlers (not recommended for GEO), give each one its own group:
User-agent: GPTBot
Disallow: /
User-agent: PerplexityBot
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: anthropic-ai
Disallow: /
User-agent: Google-Extended
Disallow: /
Checking Current Configuration
Audit your current robots.txt configuration:
- Navigate to yoursite.com/robots.txt
- Search for AI crawler user-agents
- Check for any Disallow rules affecting AI bots
- Verify Allow rules are in place if using restrictive defaults
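The checklist above can be scripted. A minimal sketch using Python's standard-library robots.txt parser, assuming you have already fetched your robots.txt body as text (the `audit_robots_txt` helper name is ours, not a library API):

```python
from urllib.robotparser import RobotFileParser

# AI crawler user-agents to audit; extend the list as new crawlers appear.
AI_AGENTS = ["GPTBot", "PerplexityBot", "ClaudeBot", "anthropic-ai", "Google-Extended"]

def audit_robots_txt(robots_txt: str, path: str = "/") -> dict:
    """Map each AI crawler user-agent to whether it may fetch `path`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {agent: parser.can_fetch(agent, path) for agent in AI_AGENTS}
```

Fetch yoursite.com/robots.txt (for example with urllib.request), pass its text in, and any False entry flags a Disallow rule affecting that bot.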
Common Issues
Accidentally Blocking AI Crawlers
Some security plugins or hosting configurations block unknown bots by default. Review your security settings and whitelist AI crawler user-agents.
Conflicting Rules
If you have both Allow and Disallow rules, check their specificity rather than their order. Under the longest-match rule of RFC 9309 (followed by Google and most major crawlers), the rule with the longest matching path wins, and Allow wins ties.
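For example, a more specific Allow can carve an exception out of a broader Disallow for crawlers that follow the longest-match rule (the paths below are illustrative):

```
User-agent: GPTBot
Disallow: /private/
Allow: /private/whitepaper.html
```

Here /private/whitepaper.html remains crawlable because the Allow path is longer, and therefore more specific, than the Disallow path.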
CDN/Firewall Blocks
Cloudflare, AWS WAF, and other services may block AI crawlers at the network level, even if robots.txt allows them. Review firewall rules and bot management settings.
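A quick way to detect a block upstream of robots.txt is to request the same page with a browser User-Agent and with a crawler's. A sketch in Python; the UA strings and `status_as` helper below are illustrative assumptions, since what bot-management rules actually match on is the "GPTBot" token:

```python
import urllib.request
import urllib.error

# Illustrative UA strings; real crawler strings vary by version.
GPTBOT_UA = "Mozilla/5.0; compatible; GPTBot/1.0; +https://openai.com/gptbot"
BROWSER_UA = "Mozilla/5.0 (X11; Linux x86_64)"

def status_as(url: str, user_agent: str) -> int:
    """Return the HTTP status code the server gives this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 403/429 etc. are the interesting cases for bot blocks.
        return err.code
```

If the crawler UA gets a 403 while the browser UA gets a 200, something at the network level (WAF, CDN bot rules, a security plugin) is blocking the crawler regardless of what robots.txt says.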
Get Your Technical Configuration Audited
Our audit checks your crawler access configuration and identifies any technical barriers to AI visibility.
Get Free AI Visibility Audit →