Eating Our Own Dog Food

We Audited Our Own Site
Before Charging Anyone Else

Before launching our paid audit service, we ran the full GEO audit process on ourselves — found 6 critical gaps, fixed all of them, and went from 62 to 78 in one session.

Get Your Free GEO Audit →
Internal Benchmark

GEORaiser — We Audited Our Own Site

Before
62
GEO Score
+16 pts
After
78
GEO Score
Context: Before charging a single client, we ran our full GEO audit process on GEORaiser.com itself. We found 6 critical gaps — a broken sitemap, missing canonical tags, bad llms.txt URLs, wrong schema URLs, incomplete robots.txt, and no structured data on key pages. We fixed all six in one session. Score went from 62 to 78. This is the same process we run for every paid audit.

6 Fixes Applied in One Session

1

Sitemap Rebuilt and Registered

The /sitemap.xml route returned a 404 — every AI crawler that tried to discover pages hit a dead end. We rebuilt the sitemap, registered it in robots.txt, and submitted it to Google Search Console. This single fix unblocked crawl discovery for all pages.

Impact: +6 GEO points
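For reference, a minimal sitemap.xml looks like the sketch below — the page paths and dates are illustrative, not our actual sitemap:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per indexable page -->
  <url>
    <loc>https://georaiser.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://georaiser.com/blog</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>
```

Once the file resolves with a 200, a single Sitemap: line in robots.txt tells crawlers where to find it.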
2

Schema Markup Added Across Key Pages

Zero structured data meant AI engines had no machine-readable signals for content type, authorship, or organizational identity. We added Organization, WebSite, BreadcrumbList, and FAQPage JSON-LD schema to all primary pages — giving AI engines the structured signals they extract citations from.

Impact: +5 GEO points
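A minimal Organization block gives a sense of what AI engines extract — the values here are illustrative, not our production schema:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "@id": "https://georaiser.com/#organization",
  "name": "GEORaiser",
  "url": "https://georaiser.com/",
  "logo": "https://georaiser.com/logo.png"
}
</script>
```

WebSite, BreadcrumbList, and FAQPage blocks follow the same pattern: a JSON-LD script tag per type, with @id and url anchored to the live domain.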
3

llms.txt URLs Corrected and File Restructured

The llms.txt file existed but had wrong URLs throughout — pointing to localhost instead of the live domain. AI models ingesting it were getting broken links. We fixed every URL and restructured the file to follow the emerging standard, giving LLMs a clean, curated index of our content.

Impact: +2 GEO points
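The emerging llms.txt convention is a plain-markdown index served at the site root. A sketch of the structure — section names and links here are illustrative:

```txt
# GEORaiser

> GEO audits that show how visible your site is to AI search engines.

## Pages

- [Home](https://georaiser.com/): What GEORaiser does
- [Blog](https://georaiser.com/blog): Guides on GEO and AI search visibility
```

The key fix was mechanical: every link must resolve on the live domain, because LLMs ingest the file verbatim.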
4

Schema URLs Corrected

Existing JSON-LD schema had @id and url fields pointing to http://localhost:3000 — making every structured data block technically invalid in production. We fixed all schema URLs to use the live domain across all page types.

Impact: +1 GEO point
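The change itself was a find-and-replace across every schema block, along these lines:

```diff
- "@id": "http://localhost:3000/#organization",
- "url": "http://localhost:3000/",
+ "@id": "https://georaiser.com/#organization",
+ "url": "https://georaiser.com/",
```

Dev-server URLs like this usually leak in when schema is generated from an environment variable that was never set in production.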
5

Canonical Tags Added to All Pages

No <link rel="canonical"> tags meant search engines and AI crawlers couldn't resolve duplicate content signals. Every page now has a canonical URL, reducing ambiguity about which version to index and cite.

Impact: +1 GEO point
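A canonical tag is one line in the page head — the path below is illustrative:

```html
<head>
  <!-- Declares the single authoritative URL for this page -->
  <link rel="canonical" href="https://georaiser.com/blog/geo-basics" />
</head>
```

Without it, crawlers that reach the same page via trailing-slash, query-string, or www variants have to guess which version to index and cite.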
6

robots.txt Completed

The robots.txt file was missing a Sitemap: directive — AI crawlers checking robots.txt had no pointer to the sitemap. We added the directive and verified all major AI crawlers (GPTBot, ClaudeBot, PerplexityBot) are explicitly allowed.

Impact: +1 GEO point
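The finished file follows this shape — a sketch, not our exact robots.txt:

```txt
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Everyone else
User-agent: *
Allow: /

# Pointer to the sitemap for crawl discovery
Sitemap: https://georaiser.com/sitemap.xml
```

The Sitemap: line is the piece that was missing; everything above it only matters if the crawler can then find your pages.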

Is Your Site Invisible to AI Search?

Most sites score between 40 and 65 without deliberate GEO work. Find out where you stand — and what to fix — in 48 hours.

Get My Free GEO Audit →