TL;DR

We ran a full GEO audit on georaiser.com and found six issues: a broken sitemap (404), broken llms.txt URLs, a missing canonical tag, missing social meta tags, schema markup using the wrong domain, and a robots.txt protocol error. All were fixed in 2 hours at $0 cost. GEO Score went from 62 to 78, a 26% improvement in one session.


The Problem

AI search is changing the rules. ChatGPT, Perplexity, Google AI Overviews, and Claude now answer questions directly — pulling from a small set of sources they trust. If your site isn’t optimized for AI engines, you’re invisible to the fastest-growing traffic source on the internet.

We built GEORaiser’s GEO audit toolkit to solve this exact problem. But were we eating our own cooking? We ran a full audit on our own live site and found six issues, one of them critical.

Audit Findings: What Was Broken

1. Broken Sitemap (Critical — Score: 0/100)

/sitemap.xml returned a 404. The route didn’t exist. Every search engine and AI crawler that tried to discover our pages hit a dead end. Worse: our robots.txt pointed crawlers directly to this broken URL, actively advertising the failure.
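The fix is a route that assembles a valid urlset document from the site's page list. A minimal sketch, assuming a simple static list of paths (the paths below are illustrative, not GEORaiser's actual routes):

```python
from xml.sax.saxutils import escape

BASE_URL = "https://georaiser.com"           # live domain, not the old icmarketing.io
PAGES = ["/", "/audit", "/pricing", "/blog"]  # hypothetical page list

def build_sitemap(base_url: str, paths: list[str]) -> str:
    """Return a valid sitemap urlset XML document for the given paths."""
    urls = "\n".join(
        f"  <url><loc>{escape(base_url + p)}</loc></url>" for p in paths
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n"
        "</urlset>"
    )

xml = build_sitemap(BASE_URL, PAGES)
```

Serve this string at /sitemap.xml with a Content-Type of application/xml, and point robots.txt at the full URL.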

2. Broken llms.txt URLs (High — Score: 80/100)

We had an llms.txt file — the AI-specific navigation standard — but all URLs inside referenced icmarketing.io instead of our live domain georaiser.com. AI crawlers following those links would land on a domain we don’t control.
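For reference, a corrected llms.txt follows the llmstxt.org convention: an H1 title, a one-line blockquote summary, and sections of annotated links. The page paths and descriptions below are illustrative placeholders, not our actual content map:

```text
# GEORaiser

> GEO audit toolkit that scores and fixes your site's visibility to AI search engines.

## Pages

<!-- hypothetical paths for illustration -->
- [GEO Audit](https://georaiser.com/audit): run a free GEO score on any URL
- [Blog](https://georaiser.com/blog): case studies and GEO guides
```

The key point of the fix: every URL now lives on georaiser.com, the domain we actually control.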

3. Missing Canonical & Social Meta Tags (High — Score: 30/100)

No canonical link tag. No og:url. No og:image. No Twitter Card tags. This meant duplicate content risk, broken social sharing previews, and missing structured data signals AI engines use for content authority.
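The missing tags look roughly like this in the page head. The URLs here are placeholder values; in practice each is emitted per-page from the template:

```html
<!-- illustrative values; generate per-page in your template -->
<link rel="canonical" href="https://georaiser.com/blog/geo-audit-case-study" />
<meta property="og:url" content="https://georaiser.com/blog/geo-audit-case-study" />
<meta property="og:image" content="https://georaiser.com/og/geo-audit.png" />
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:image" content="https://georaiser.com/og/geo-audit.png" />
```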

4. Schema Markup Using Wrong Domain (Medium — Score: 70/100)

Our Organization schema had the URL hardcoded to https://icmarketing.io — a domain that wasn’t live. AI engines use schema markup to build knowledge graphs about your business. We were teaching them we live at the wrong address.
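The corrected schema looks roughly like this (the logo path is a placeholder); in our codebase the url value now comes from a config variable instead of a hardcoded string:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "GEORaiser",
  "url": "https://georaiser.com",
  "logo": "https://georaiser.com/logo.png"
}
</script>
```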

5. robots.txt Sitemap URL Protocol Error (Low — Score: 95/100)

The sitemap URL in robots.txt was missing the https:// prefix — a subtle formatting error that can confuse crawler parsers.
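The broken and fixed lines side by side (directives like User-agent omitted for brevity):

```text
# Before: no protocol, which some crawler parsers reject
Sitemap: georaiser.com/sitemap.xml

# After: fully qualified URL
Sitemap: https://georaiser.com/sitemap.xml
```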

What We Fixed

All fixes were shipped in a single commit:

| Fix | Technical change | GEO impact |
| --- | --- | --- |
| Added /sitemap.xml route | Server route returning valid XML with all pages | All crawlers can now discover content |
| Fixed llms.txt | Updated all URLs to georaiser.com | AI crawlers navigate full content graph |
| Added canonical tag | `<link rel="canonical">` in head | Prevents duplicate content penalties |
| Added social meta tags | og:url, og:image, Twitter Card | Social sharing + AI social graph context |
| Fixed Organization schema | Switched hardcoded URL to config variable | Correct entity for LLM knowledge graphs |
| Fixed robots.txt | Added https:// prefix to sitemap URL | Clean crawler discovery |

Time to implement: ~2 hours  |  Cost: $0 in tools

Results

| Category | Before | After | Change |
| --- | --- | --- | --- |
| Overall GEO Score | 62/100 | 78/100 | +26% |
| Sitemap | 0/100 | 90/100 | +90 pts |
| Canonical & Social Meta | 30/100 | 90/100 | +200% |
| llms.txt | 80/100 | 95/100 | +19% |
| Structured Data / Schema | 70/100 | 80/100 | +14% |
| AI Crawler Access | 95/100 | 95/100 | No change |
| Content Citability | 78/100 | 78/100 | No change |

What This Means for Your Site

Most sites have the same categories of problems — and they go unfixed because they’re invisible to the naked eye. Your site looks fine in a browser. Your analytics show traffic. But AI engines are quietly skipping you.

The GEO audit surfaces exactly what’s broken and exactly how to fix it. In our case: one session, six fixes, 16-point score improvement.

Your site likely has similar wins waiting. The difference between a 62 and a 78 is the difference between being cited in AI answers and being invisible to them.

Get Your Own Free GEO Score

Enter your URL. Get your score. See exactly what to fix. No email required. No sales call. Just a clear, actionable report showing where your AI visibility is leaking.