Allow GPTBot, ClaudeBot, and PerplexityBot — Bolt
AI answer engines only cite sites their bots can fetch. If you do not allow GPTBot, ClaudeBot, and PerplexityBot in robots.txt, you are invisible to ChatGPT and Perplexity no matter how good your content is.
Fixing this in Bolt
Bolt is StackBlitz's in-browser AI app builder, built on WebContainers.
Bolt projects run inside a browser container — file paths are real, but the deploy target varies. Paste the prompt below into your Bolt chat and the fix rolls out across the project in one pass.
The prompt for Bolt
Copy and paste this into your Bolt chat exactly as-is.
```
In Bolt, please update my project with these exact changes:

Allow AI crawlers

1. Open /public/robots.txt (create it if missing).
2. Add four blocks: `User-agent: GPTBot` / `Allow: /`, then ClaudeBot, PerplexityBot, and Google-Extended, following the same pattern.
3. Do NOT add a global `User-agent: * / Disallow: /` — that overrides the bot-specific allows on some parsers.
```
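After Bolt applies the prompt, `/public/robots.txt` should contain roughly the following (the four blocks the prompt names; ChatGPT-User and CCBot, covered below, can follow the same pattern):

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
```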
Why this matters
In 2024–2025, AI search (ChatGPT, Claude, Perplexity, Google AI Overviews) started sending real referral traffic and, more importantly, becoming the first place users ask questions. The old model, rank in Google and get clicks, is being replaced by "get cited in an AI answer, get traffic".
Every major AI company publishes a crawler that respects robots.txt. OpenAI has GPTBot (for training) and ChatGPT-User (for live web search). Anthropic has ClaudeBot. Perplexity has PerplexityBot. Google has Google-Extended. Common Crawl (CCBot) powers many open-source models.
If you do not explicitly allow these bots, they fall back to whatever robots.txt says for `User-agent: *`. Many sites ship with restrictive defaults and accidentally block everything. The fix is a short allow list, one `User-agent` / `Allow: /` block per crawler, and it is the single highest-leverage GEO move you can make.
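The fallback behavior is easy to demonstrate with Python's standard-library robots.txt parser. The file contents below are a hypothetical example, not fetched from any real site: a restrictive catch-all plus one explicit GPTBot block.

```python
from urllib import robotparser

# Hypothetical robots.txt: restrictive default, but an explicit GPTBot group.
rules = """
User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /
""".strip().splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# GPTBot matches its own group, so the Allow applies.
print(rp.can_fetch("GPTBot", "/pricing"))        # True
# Any bot without its own group falls back to the * group's Disallow.
print(rp.can_fetch("SomeOtherBot", "/pricing"))  # False
```

A crawler with no dedicated block gets whatever `User-agent: *` says, which is exactly why a restrictive default silently hides you from every AI engine you did not list.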
How to use this prompt in Bolt
1. Open your Bolt project.
2. Copy the prompt above with the copy button.
3. Paste it into the Bolt chat and send.
4. Review the diff, accept the changes, and redeploy.
5. Verify the fix using the checklist below.
Common mistakes to avoid
- Blocking all bots with `User-agent: *` / `Disallow: /` and forgetting to add allow rules for AI crawlers.
- Allowing only GPTBot and forgetting the others — ChatGPT search and ChatGPT training use different crawlers.
- Blocking AI because of a blog post that said "block AI crawlers" — that advice costs you AI-search visibility.
- Relying on rule order to rescue a catch-all block: a `User-agent: *` / `Disallow: /` group can override bot-specific allows on some parsers, so give each AI crawler its own explicit block instead.
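The first mistake on the list can be sketched with the same stdlib parser, again against a hypothetical file. With only a catch-all `Disallow: /` and no bot-specific groups, every AI crawler is locked out:

```python
from urllib import robotparser

# Hypothetical restrictive default with no AI-crawler allow blocks.
blocked = robotparser.RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])

# Every AI crawler falls into the * group and is refused.
for bot in ("GPTBot", "ClaudeBot", "PerplexityBot"):
    print(bot, blocked.can_fetch(bot, "/"))  # all False
```

None of the bots can fetch anything, so none of the engines behind them can cite the site.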
How to verify the fix worked
- Visit `/robots.txt` and confirm a `User-agent` block with `Allow: /` exists for each AI crawler you allowed.
- Test with https://www.bing.com/webmasters/help/robots-txt-analyzer or a similar robots.txt tester.
- Check server logs for the GPTBot, ClaudeBot, and PerplexityBot user-agents after 1-2 weeks — they should start appearing.
- In ChatGPT with web browsing enabled, ask a query where you would expect to be cited and see whether you appear.
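The first checklist item can be automated with a few lines of stdlib Python. This is a sketch: `example.com` is a placeholder, and the `BOTS` list covers the six crawlers discussed above.

```python
import urllib.request

BOTS = ["GPTBot", "ChatGPT-User", "ClaudeBot",
        "PerplexityBot", "Google-Extended", "CCBot"]

def missing_bots(robots_txt: str) -> list[str]:
    """Return the crawlers from BOTS that have no User-agent line."""
    agents = {
        line.split(":", 1)[1].strip().lower()
        for line in robots_txt.splitlines()
        if line.lower().startswith("user-agent:")
    }
    return [b for b in BOTS if b.lower() not in agents]

# Replace example.com with your deployed site:
# txt = urllib.request.urlopen("https://example.com/robots.txt").read().decode()
txt = "User-agent: GPTBot\nAllow: /\n"   # stand-in for a fetched file
print(missing_bots(txt))
```

An empty list means every crawler you care about has its own block; anything returned is a bot that will fall back to `User-agent: *`.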
Frequently asked questions
Will allowing GPTBot let OpenAI train on my content?
Does allowing AI crawlers hurt my SEO?
Is there a downside to allowing them?
Want all 34 prompts tailored to your Bolt site?
Pantra scans your site in 10 seconds, detects the stack, and generates the exact prompts that apply — only the ones you actually need.
Scan my site

Related Bolt prompts
Add a robots.txt that allows search + AI crawlers — Bolt
Prompt to drop a correct /public/robots.txt for Google, Bing, and AI crawlers like GPTBot and PerplexityBot in Lovable, Cursor, Bolt, v0, Replit, Windsurf, Claude Code, Base44.
AI Search / GEO: Add an llms.txt file — Bolt
Stack-specific prompt to publish llms.txt — a curated guide telling LLMs what your site is about.
AI Search / GEO: Add JSON-LD structured data — Bolt
Prompt to add Organization, Article, and FAQ JSON-LD to <head> — tells Google and AI crawlers exactly what your page represents.
AI Search / GEO: Add server-side rendering — Bolt
Stack-specific prompt to ensure key content is in the HTML response, not rendered only via JS.