Free tool

AI Crawler Checker

See which AI bots can access your content

Check AI crawler access

Tests 20 AI bots

Checks GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and 16 more AI crawlers.

Your robots.txt controls which AI systems can crawl your site — but most site owners don't know which bots they're blocking. This checker tests your domain against 20 known AI crawlers, including GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and more. See at a glance who can access your content and who's shut out.

Instantly see which of 20 major AI crawlers can and cannot access your pages.
Identify accidental blocks that prevent AI systems from discovering your content.
Check for LLMs.txt presence alongside robots.txt for complete AI access coverage.

How it works

Get started in 3 simple steps

1

Enter any public domain or URL to check.

2

The tool fetches your robots.txt and LLMs.txt, then tests 20 AI user-agents.

3

Review per-bot results showing whether each crawler is allowed or blocked, along with the matching robots.txt rule.
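The core of step 2 can be sketched in a few lines of Python using the standard library's robots.txt parser. This is an illustrative sketch, not the tool's actual implementation: the bot list is truncated to four of the twenty crawlers, and the robots.txt content is an inline example rather than a live fetch.

```python
from urllib.robotparser import RobotFileParser

# A few of the AI user-agents the checker tests (the real tool tests 20).
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

# Example robots.txt content; in practice this would be fetched from
# the target domain's /robots.txt.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() applies the same group-matching rules a crawler would use.
results = {bot: parser.can_fetch(bot, "/") for bot in AI_BOTS}
for bot, allowed in results.items():
    print(f"{bot}: {'allowed' if allowed else 'blocked'}")
```

With this example file, GPTBot reports as blocked (its group disallows everything) while the other three fall back to the wildcard group and report as allowed.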

Best use cases

Built for teams that take AI visibility seriously

1

Site owners verifying they haven't accidentally blocked key AI crawlers.

2

AEO teams auditing crawler access policies across domains and subdomains.

3

Agencies reviewing client sites for AI discoverability before running AEO campaigns.

Want continuous monitoring instead of one-off checks?

Start free trial

FAQ

Frequently asked questions

Which AI crawlers does this check?

It tests 20 AI crawlers including GPTBot and ChatGPT-User (OpenAI), ClaudeBot and Claude-Web (Anthropic), PerplexityBot, Google-Extended, Bingbot, Bytespider (ByteDance), CCBot (Common Crawl), Amazonbot, Meta-ExternalAgent, Applebot-Extended, cohere-ai, Diffbot, YouBot, and more.

What if I don't have a robots.txt?

If no robots.txt is found, all crawlers are assumed to have access by default. While that means AI bots can crawl your site, having an explicit robots.txt gives you control over which bots you allow and what paths they can access.
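A minimal explicit robots.txt that keeps AI crawlers allowed while carving out private paths might look like this (a sketch only; the path and bot names are placeholders to adapt to your site):

```
# Allow OpenAI's crawler everywhere except a private section
User-agent: GPTBot
Disallow: /admin/
Allow: /

# Default rule for all other bots
User-agent: *
Allow: /
```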

Should I block or allow AI crawlers?

If you want your content cited by AI platforms like ChatGPT and Perplexity, allow their crawlers. Blocking GPTBot, ClaudeBot, or PerplexityBot means those platforms cannot index your content and will not cite it in AI-generated answers.

What's the difference between robots.txt and LLMs.txt?

robots.txt controls crawl access — which bots can visit which paths. LLMs.txt is a content discovery file that tells AI crawlers which pages are most important and how your content is structured. You need both for complete AEO coverage.
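The proposed LLMs.txt format is a plain Markdown file served at /llms.txt: an H1 title, a blockquote summary, then sections of annotated links pointing AI systems at your most important pages. A minimal sketch with placeholder names and URLs:

```
# Example Co

> Example Co builds AI visibility tooling. The pages below are the
> best entry points to our documentation and product content.

## Docs

- [Getting started](https://example.com/docs/start): setup guide
- [API reference](https://example.com/docs/api): endpoints and auth

## Optional

- [Blog](https://example.com/blog): product updates and AEO guides
```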

How do I fix a blocked AI crawler?

Find the matching Disallow rule in your robots.txt and either remove it or add a specific Allow directive for that bot. For example, adding 'User-agent: GPTBot' followed by 'Allow: /' will grant GPTBot full access to your site.
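For instance, if a wildcard rule is what's blocking the bot, you can leave it in place and add a more specific group: a crawler follows the most specific User-agent group that matches it, so a dedicated GPTBot group overrides the wildcard for that bot. A sketch (paths are placeholders):

```
# Applies to all bots without a more specific group
User-agent: *
Disallow: /

# GPTBot matches this group instead, so it gets full access
User-agent: GPTBot
Allow: /
```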

Start for free

Turn AI visibility insights into growth

Create your workspace to monitor AI visibility and activate optimization workflows.