AI Bot Access Checker
Paste your robots.txt and instantly see which AI crawlers can access your site. Covers GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and 10 more.
Your robots.txt
Find yours at yourdomain.com/robots.txt
Block all AI bots snippet
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: OAI-SearchBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Claude-Web
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: Meta-ExternalAgent
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: cohere-ai
Disallow: /

User-agent: Diffbot
Disallow: /
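If you want to sanity-check a snippet like the one above before deploying it, Python's standard-library `urllib.robotparser` evaluates robots.txt rules the same way well-behaved crawlers do. This is a minimal sketch, assuming Python 3; the abbreviated two-bot snippet and the example.com URL are illustrative only:

```python
from urllib.robotparser import RobotFileParser

# Abbreviated version of the block-all snippet (illustrative; a real
# check would use your full robots.txt text).
BLOCK_ALL = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(BLOCK_ALL.splitlines())

# can_fetch() answers: may this user agent fetch this URL?
for bot in ("GPTBot", "ClaudeBot"):
    allowed = parser.can_fetch(bot, "https://example.com/")
    print(bot, "allowed" if allowed else "blocked")
```

Both bots print as blocked, since each matches a `Disallow: /` group. Note that a bot with no matching group and no `User-agent: *` group is allowed by default.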
What is this tool for?
The AI Bot Access Checker analyzes your robots.txt file and shows you exactly which AI crawlers are allowed or blocked from accessing your site. It checks 14 major AI bots including GPTBot (OpenAI), ClaudeBot (Anthropic), Google-Extended, PerplexityBot, and more.
Why it matters for SEO
AI companies are crawling the web to train their models and power AI search features. Your robots.txt controls which of these bots can access your content. Understanding who can crawl you is the first step to making informed decisions about AI visibility vs. content protection.
Key SEO & AI elements it impacts
- GPTBot and ChatGPT-User serve different purposes: GPTBot collects training data, while ChatGPT-User fetches pages on demand when a user asks ChatGPT to browse them
- Google-Extended controls Gemini AI training separately from regular Googlebot search indexing
- Blocking a bot with robots.txt is a request, not enforcement. Well-behaved bots honor it, but it's not a security measure
- Some AI search platforms (like Perplexity) may not cite your content in search results if you block their crawler
- The same robots.txt rules that block AI training bots may also block AI search features that send you traffic
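The first bullet's distinction can be made concrete. The sketch below (Python standard library; the selective policy and URLs are hypothetical examples, not a recommendation) shows a robots.txt that blocks the training crawler while still allowing the user-driven browsing agent that can send you referral traffic:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical selective policy: refuse training collection (GPTBot)
# but permit on-demand page fetches (ChatGPT-User).
SELECTIVE = """\
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Allow: /
"""

parser = RobotFileParser()
parser.parse(SELECTIVE.splitlines())

print(parser.can_fetch("GPTBot", "https://example.com/page"))        # False
print(parser.can_fetch("ChatGPT-User", "https://example.com/page"))  # True
```

Each user agent matches only its own group, so the two OpenAI agents get different answers from the same file, which is exactly why checking them separately matters.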
What to expect after fixing it
After checking your robots.txt, you'll know exactly which AI platforms can access your content and get specific recommendations for any inconsistencies. You can then make an informed decision about whether to block, allow, or selectively permit AI crawlers based on your business goals.