Generate a sitemap.xml covering every route — Windsurf
Without a sitemap, Google can only discover pages by following internal links. On JS-heavy AI-built apps, that discovery is slow and unreliable — new pages can wait weeks to be indexed.
Fixing this in Windsurf
Windsurf is Codeium's agentic AI IDE, built around the Cascade agent.
Windsurf Cascade can walk your router file and generate the sitemap file in one pass.
Using a different tool? Pick your stack:
The prompt for Windsurf
Copy and paste this into your Windsurf chat exactly as-is.
Using Windsurf Cascade, apply these edits across the project in one pass:

Add sitemap.xml
1. Create /public/sitemap.xml listing every public route.
2. Wrap each URL in a <url><loc>...</loc></url> entry under <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">.
3. Update /public/robots.txt to add a "Sitemap:" line pointing to the new file.
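For reference, here is the shape of the file the prompt should produce for a hypothetical three-route app (example.com and the routes are placeholders, not part of the prompt):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/pricing</loc></url>
  <url><loc>https://example.com/blog</loc></url>
</urlset>
```

And the matching line the prompt adds to /public/robots.txt:

```
Sitemap: https://example.com/sitemap.xml
```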
Why this matters
Googlebot discovers pages in two ways: following links from already-known pages, and reading sitemaps. If your app has few external backlinks (as most new AI-built apps do), link discovery is slow. A sitemap can cut time-to-first-indexing from weeks to days.
Modern SPAs have another discovery problem: routes often only exist after a client-side router runs, which Googlebot may or may not execute. A static `sitemap.xml` bypasses that entirely — every URL is declared upfront in plain XML.
Submitting your sitemap in Google Search Console also unlocks the "Pages" report, which shows exactly which URLs are indexed, which are crawled-but-not-indexed, and why. You cannot debug indexation without this report, and the report does not populate without a submitted sitemap.
How to use this prompt in Windsurf
1. Open your Windsurf project.
2. Copy the prompt above with the copy button.
3. Paste into the Windsurf chat and send.
4. Review the diff, accept the changes, redeploy.
5. Verify the fix using the checklist below.
Common mistakes to avoid
- Listing hash-based routes (`/app#dashboard`) — Google ignores fragments in sitemaps.
- Including URLs that return 404 or 301.
- Forgetting to add a `Sitemap: https://…` line to robots.txt.
- Shipping a static sitemap that gets stale because nobody regenerates it on deploy.
- Listing login, admin, and checkout URLs you do not want indexed.
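The stale-sitemap mistake is the easiest to prevent: wire regeneration into the build itself. Assuming you keep a generation script at `generate-sitemap.js` (a placeholder name) and your build command is `vite build` (swap in your own), a `prebuild` hook in package.json regenerates the file on every deploy:

```json
{
  "scripts": {
    "prebuild": "node generate-sitemap.js",
    "build": "vite build"
  }
}
```

npm runs `prebuild` automatically before `build`, so no one has to remember the extra step.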
How to verify the fix worked
- Visit `https://yoursite.com/sitemap.xml` in a browser — must return XML, not 404.
- Run `curl -s https://yoursite.com/sitemap.xml | xmllint --noout -` — must parse.
- Submit in Google Search Console → Sitemaps — status "Success" within 24 hours.
- Confirm `robots.txt` ends with `Sitemap: https://yoursite.com/sitemap.xml`.
Frequently asked questions
- Do I need a sitemap if my site has fewer than 20 pages?
- How often should I regenerate the sitemap?
- Can a sitemap include URLs from a different domain?
- Does my sitemap need <lastmod>, <changefreq>, or <priority>?
Want all 34 prompts tailored to your Windsurf site?
Pantra scans your site in 10 seconds, detects the stack, and generates the exact prompts that apply — only the ones you actually need.
Scan my site
Related Windsurf prompts
Add a robots.txt that allows search + AI crawlers — Windsurf
Prompt to drop a correct /public/robots.txt for Google, Bing, and AI crawlers like GPTBot and PerplexityBot in Lovable, Cursor, Bolt, v0, Replit, Windsurf, Claude Code, Base44.
AI Search / GEO: Allow GPTBot, ClaudeBot, and PerplexityBot — Windsurf
Prompt to whitelist AI crawlers so ChatGPT, Claude, and Perplexity can cite your pages. Works in any AI-coded stack.
Technical: Fix broken internal links — Windsurf
Prompt to scan for 404s on internal links and fix or redirect them — SEO and UX win.
SEO: Add a unique <title> tag to every page — Windsurf
Copy-paste prompt to add a unique, keyword-rich <title> tag to every page in Lovable, Cursor, Bolt, v0, Replit, Windsurf, Claude Code, or Base44.