Case Study
Feb 20, 2026 · 6 min read

Why Some Tech Homepages Score Under 50 on AI Readability

Some of the most sophisticated software companies in the world have homepages that AI can barely read. We ran the numbers.

There's a particular irony in running a sophisticated tech company's homepage through an AI readability analyzer. These companies build the tools that help decide which websites get cited and which get ignored.

48

A common AI visibility score for JS-heavy tech homepages

Out of 100 — comfortably in the 'weak' range

This isn't a gotcha. It's a case study in how decisions that impress humans can make you nearly invisible to AI.

01

What our analyzer actually saw

⚠️

Partial render detected. Fewer than 600 words received from initial HTML. Score may be conservative.

  • Structure: 8.2 / 30 — almost no structural signals in served HTML
  • Content depth: 4.1 / 22 — fewer than 600 words to analyze
  • Semantic quality: 12.3 / 16 — what little existed was well-written
  • Value prop: 6.8 / 13 — meta description carried almost everything
  • Hierarchy: 1.4 / 5 — heading structure minimal in raw HTML
48.39

Final score

Saved from a lower score only by one strong meta description

02

The JavaScript rendering gap

Many tech homepages are React frontends with heavy client-side rendering. In a browser, JavaScript executes, components mount, and the full page appears. AI web crawlers receive the same HTML but don't run JavaScript: they analyze only the initial document, which for a heavily client-side site contains almost nothing.
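To make the gap concrete, here's a minimal sketch of what a non-JS crawler has to work with. The HTML shell and the extraction logic are illustrative assumptions, not our analyzer's actual pipeline: we strip script and style content and count the remaining words, exactly as a crawler that never executes the bundle would.

```python
from html.parser import HTMLParser

class VisibleTextExtractor(HTMLParser):
    """Collects the text a non-JS crawler can see: everything
    except the contents of <script> and <style> tags."""
    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.words.extend(data.split())

def crawler_word_count(html: str) -> int:
    parser = VisibleTextExtractor()
    parser.feed(html)
    return len(parser.words)

# A typical client-rendered shell (hypothetical): the crawler
# receives an empty root div and a script tag. Everything the
# user eventually sees lives inside bundle.js.
CSR_SHELL = """
<html><head><title>Acme</title></head>
<body><div id="root"></div><script src="/bundle.js"></script></body>
</html>
"""

print(crawler_word_count(CSR_SHELL))  # → 1 (just "Acme" from the title)
```

One word. That's the entire corpus a crawler has for scoring content depth on a page like this.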

The site looks beautiful in a browser. It's nearly invisible to AI.

03

What one meta description can do

💬

A clear, specific meta description — one that identifies what you are, who you serve, and why it matters — can be the difference between a score of 48 and a score of ~30.

Meta tags are static HTML — they render regardless of JavaScript execution. For any JS-heavy site, your meta description and title become the only content AI reliably sees. They're doing all the work.
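This is also why the meta description survives when everything else vanishes: it sits in the static head, so even a parser that ignores JavaScript entirely can pull it out. A small sketch (the helper name and sample HTML are assumptions for illustration):

```python
from html.parser import HTMLParser

class MetaDescriptionFinder(HTMLParser):
    """Pulls the description meta tag straight from raw HTML.
    No JavaScript execution required: meta tags are static."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if attr.get("name") == "description":
                self.description = attr.get("content")

def meta_description(html: str):
    finder = MetaDescriptionFinder()
    finder.feed(html)
    return finder.description

# Hypothetical client-rendered shell: body is empty, head is not.
SHELL = """<html><head>
<meta name="description" content="Dashboards for data teams.">
</head><body><div id="root"></div></body></html>"""

print(meta_description(SHELL))  # → Dashboards for data teams.
```

The body contributes nothing, yet the description comes through intact, which is exactly how one tag ends up carrying the whole score.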

04

Three ways to fix it

  • Server-side rendering (SSR) — most robust, but requires engineering. Next.js and Nuxt offer it out of the box.
  • Static generation (SSG) — render your homepage at build time. Crawlers get full HTML; users still get the fast JS app.
  • Meta layer only — if SSR isn't feasible, ensure title, meta, Open Graph, and schema.org are fully populated in initial HTML.
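The static-generation option can be sketched in a few lines. This is a toy build step, not Next.js or Nuxt (the template, function names, and sample content are all hypothetical): at build time, the real content is baked into the HTML, so crawlers receive the full text before any JavaScript runs.

```python
# Toy static-generation step: render the homepage to complete HTML
# at build time. The JS bundle can still hydrate it in the browser,
# but a crawler that never runs JS already has everything.
PAGE_TEMPLATE = """<html>
<head>
  <title>{title}</title>
  <meta name="description" content="{description}">
</head>
<body>
  <h1>{title}</h1>
  <p>{body}</p>
  <script src="/bundle.js"></script>
</body>
</html>"""

def build_homepage(title: str, description: str, body: str) -> str:
    """Bake content into static HTML at build time."""
    return PAGE_TEMPLATE.format(title=title, description=description, body=body)

html = build_homepage(
    "Acme Analytics",
    "Dashboards for data teams.",
    "Acme turns raw events into dashboards your whole team can read.",
)
# The served HTML now contains the headline and copy as plain text.
assert "<h1>Acme Analytics</h1>" in html
```

The same page still loads its bundle for interactivity; the difference is that the initial document is no longer empty.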
~2×

Score improvement possible

Estimated with proper SSR implementation — days of engineering work

Key takeaway

Many tech homepages score under 50 on AI readability because heavy client-side rendering means AI crawlers see fewer than 600 words. A strong meta description can save the score from going lower. SSR or static generation would fix it in days.

See how your site scores

Free AI visibility analysis — takes 10 seconds.

Analyze my site →