Allow GPTBot, ClaudeBot, and PerplexityBot — Lovable
AI answer engines only cite sites their bots can fetch. If you do not allow GPTBot, ClaudeBot, and PerplexityBot in robots.txt, you are invisible to ChatGPT and Perplexity no matter how good your content is.
Fixing this in Lovable
AI full-stack app builder (React + Vite + Supabase)
Lovable apps ship fast but skip most SEO and security basics out of the box. Paste the prompt below into your Lovable chat and the fix rolls out across the project in one pass.
Using a different tool? Pick your stack:
The prompt for Lovable
Copy and paste this into your Lovable chat exactly as-is.
Fix my Lovable app — please make these exact changes in the Lovable editor: Allow AI crawlers. 1. Open /public/robots.txt (create it if missing). 2. Add six blocks: `User-agent: GPTBot` / `Allow: /`, then the same pattern for ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended, and CCBot. 3. Do NOT add a global `User-agent: *` / `Disallow: /` — that overrides the bot-specific allows on some parsers.
Why this matters
In 2024-2025, AI search (ChatGPT, Claude, Perplexity, Google AI Overviews) started sending real referral traffic and, more importantly, became the first place many users ask questions. The old model — rank in Google, get clicks — is being replaced by "get cited in an AI answer, get traffic".
Every major AI company publishes a crawler that respects robots.txt. OpenAI has GPTBot (for training) and ChatGPT-User (for live web search). Anthropic has ClaudeBot. Perplexity has PerplexityBot. Google has Google-Extended. Common Crawl (CCBot) powers many open-source models.
If you do not explicitly allow these bots, they fall back to whatever robots.txt says for `User-agent: *`. Many sites ship with restrictive defaults and accidentally block everything. The fix is a short allow list — one two-line block per crawler — the single highest-leverage GEO move you can make.
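Putting those pieces together — the six crawlers named above, each with its own block — the finished /public/robots.txt looks like this:

```
# /public/robots.txt — allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: CCBot
Allow: /
```

Note there is no `User-agent: *` group at all: existing search crawlers are unaffected (no rules means no restrictions), and the bot-specific allows cannot be shadowed.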
How to use this prompt in Lovable
1. Open your Lovable project.
2. Copy the prompt above with the copy button.
3. Paste it into the Lovable chat and send.
4. Review the diff, accept the changes, and redeploy.
5. Verify the fix using the checklist below.
Common mistakes to avoid
- Blocking all bots with `User-agent: *` / `Disallow: /` and forgetting to add allow rules for the AI crawlers.
- Allowing only GPTBot and forgetting the others — ChatGPT search (ChatGPT-User) and ChatGPT training (GPTBot) use different crawlers.
- Blocking AI because of a blog post that said "block AI crawlers" — that advice costs you AI-search visibility.
- Relying on rule order — putting `User-agent: *` / `Disallow: /` first and the specific allows after it still trips up some parsers; omit the global disallow entirely.
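To see why the global disallow is risky, here is a minimal sketch using Python's standard-library parser, `urllib.robotparser`. This particular parser resolves the conflict by matching the most specific `User-agent` group, but not every crawler's parser is guaranteed to behave the same way — which is why dropping the global disallow is the safe move:

```python
import urllib.robotparser

# A robots.txt that blocks everyone by default but allows GPTBot explicitly.
ROBOTS = """\
User-agent: *
Disallow: /

User-agent: GPTBot
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# Python matches the most specific user-agent group first, so GPTBot is
# allowed while any other bot falls through to the blanket Disallow.
print(rp.can_fetch("GPTBot", "https://example.com/page"))     # True
print(rp.can_fetch("RandomBot", "https://example.com/page"))  # False
```

A parser that applied the `*` group less carefully could read the same file as "block everything", so the prompt above tells Lovable not to emit the global disallow at all.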
How to verify the fix worked
- Visit `/robots.txt` and confirm the six User-agent blocks exist, each with `Allow: /`.
- Test with https://www.bing.com/webmasters/help/robots-txt-analyzer or a similar robots.txt tester.
- Check server logs for the GPTBot, ClaudeBot, and PerplexityBot user-agents after 1-2 weeks — they should appear.
- In ChatGPT with web browsing, ask a question where you would expect your site to be cited and see if it appears.
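The first checklist item can also be scripted. Here is a small Python sketch (the `blocked_bots` helper and example URL are illustrative, not part of the prompt) that parses a robots.txt body and reports which of the six crawlers it would turn away — feed it the body of your deployed file:

```python
import urllib.robotparser

AI_BOTS = ["GPTBot", "ChatGPT-User", "ClaudeBot",
           "PerplexityBot", "Google-Extended", "CCBot"]

def blocked_bots(robots_txt: str, url: str = "https://example.com/") -> list[str]:
    """Return the AI crawlers this robots.txt would refuse."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not rp.can_fetch(bot, url)]

# A correct file blocks nobody on this list...
good = "\n\n".join(f"User-agent: {bot}\nAllow: /" for bot in AI_BOTS)
print(blocked_bots(good))                          # []

# ...while a blanket disallow blocks all six.
print(blocked_bots("User-agent: *\nDisallow: /"))
```

An empty list means every AI crawler on the list can fetch the page; any names printed are the blocks you still need to add.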
Frequently asked questions
Will allowing GPTBot let OpenAI train on my content?
Does allowing AI crawlers hurt my SEO?
Is there a downside to allowing them?
Want all 34 prompts tailored to your Lovable site?
Pantra scans your site in 10 seconds, detects the stack, and generates the exact prompts that apply — only the ones you actually need.
Scan my site
Related Lovable prompts
Add a robots.txt that allows search + AI crawlers — Lovable
Prompt to drop a correct /public/robots.txt for Google, Bing, and AI crawlers like GPTBot and PerplexityBot in Lovable, Cursor, Bolt, v0, Replit, Windsurf, Claude Code, Base44.
AI Search / GEO
Add an llms.txt file — Lovable
Stack-specific prompt to publish llms.txt — a curated guide telling LLMs what your site is about.
AI Search / GEO
Add JSON-LD structured data — Lovable
Prompt to add Organization, Article, and FAQ JSON-LD to <head> — tells Google and AI crawlers exactly what your page represents.
AI Search / GEO
Add server-side rendering — Lovable
Stack-specific prompt to ensure key content is in the HTML response, not rendered only via JS.