How the AI Exposure Score Is Calculated
The AI Exposure Score (0-100) measures how visible, understandable, and recommendable your product is to AI assistants like ChatGPT, Claude, Perplexity, and Gemini. It is calculated across 6 categories and 25+ individual signals.
AI Crawl Access
AI crawlers need permission to index your site and a structured way to understand it. Blocking AI crawlers in robots.txt means ChatGPT, Claude, and Perplexity cannot index your content. llms.txt files act like a product brief for LLMs — giving them exactly what they need to describe you accurately.
Signals checked (5)
- sitemap.xml present and parseable (+5 pts)
- sitemap.xml contains 3+ URLs (+2 pts)
- robots.txt allows AI crawlers (GPTBot, ClaudeBot, PerplexityBot) (+5 pts)
- llms.txt file exists at /llms.txt (+5 pts)
- llms-full.txt file exists at /llms-full.txt (+3 pts)
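As a sketch, a robots.txt that would pass the crawler-access check could look like the following (example.com is a placeholder; the user-agent tokens are the ones the signal names):

```text
# Allow the major AI crawlers to index the whole site
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Note that an empty robots.txt (or no robots.txt at all) also allows crawling by default; the check fails only when these agents are explicitly disallowed.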
Content Quality
AI systems need text they can actually read. JavaScript-heavy sites with minimal crawlable HTML score poorly here because LLMs typically parse the raw source, not rendered JavaScript. High text-to-HTML ratios signal that your pages have real content rather than empty shells.
Signals checked (4)
- Homepage has 500+ words of crawlable text (+5 pts)
- Text-to-HTML ratio above 15% (+5 pts)
- 3+ pages reachable from homepage (+5 pts)
- Navigation links to key sections (+5 pts)
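The text-to-HTML ratio can be approximated with Python's standard library. This is a sketch, not the scanner's exact extraction logic: it strips script and style contents, then compares visible-text length to raw-source length against the 15% threshold named above.

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)


def text_to_html_ratio(html: str) -> float:
    """Ratio of visible-text length to raw HTML source length."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / max(len(html), 1)


page = ("<html><head><script>var x=1;</script></head>"
        "<body><h1>Acme Analytics</h1>"
        "<p>Dashboards for small teams.</p></body></html>")
print(text_to_html_ratio(page) > 0.15)  # this toy page clears the 15% bar
```

A JavaScript-heavy single-page app often ships a near-empty body plus large script bundles, which drives this ratio toward zero even when the rendered page looks content-rich.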
Product Clarity
AI assistants need to understand what your product does before they can recommend it. A missing or vague H1, no features page, and no pricing information all reduce confidence. When an AI is asked 'what is X?', it needs a clear, unambiguous answer from your site's content.
Signals checked (3)
- Clear H1 headline present (+5 pts)
- Feature keywords or features page found (+5 pts)
- Pricing page or pricing information found (+5 pts)
Structured Data & Meta
Structured data (schema.org JSON-LD) is the highest-fidelity signal you can give AI systems. A SoftwareApplication schema tells LLMs your product name, category, description, pricing, and URL in a machine-readable format they can extract and reference directly. OpenGraph tags improve how your product appears in AI-generated summaries.
Signals checked (5)
- OpenGraph tags present (og:title, og:description, og:image) (+5 pts)
- Page title is set and descriptive (+3 pts)
- Meta description is present (150-160 chars) (+3 pts)
- JSON-LD structured data (SoftwareApplication, Product, or Organization) (+7 pts)
- Canonical URL set (+2 pts)
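A minimal SoftwareApplication block of the kind the JSON-LD signal looks for might look like this (the vocabulary is schema.org's; every value below is a placeholder to swap for your own):

```json
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Example Product",
  "applicationCategory": "BusinessApplication",
  "description": "One-sentence description of what the product does and for whom.",
  "url": "https://example.com",
  "offers": {
    "@type": "Offer",
    "price": "29.00",
    "priceCurrency": "USD"
  }
}
```

This goes inside a single script tag with type application/ld+json in the page head.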
Agent Readiness
Agent readiness measures whether your site is prepared for autonomous AI agents — tools like Claude, ChatGPT with browsing, and Perplexity that browse your site on behalf of users. llms.txt was proposed by Jeremy Howard (fast.ai) as an AI-native equivalent of robots.txt — a plain-text briefing document designed specifically for LLMs.
Signals checked (2)
- llms.txt file exists and is readable (+5 pts)
- llms-full.txt file exists and is readable (+5 pts)
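Following the llms.txt proposal's markdown conventions (an H1 with the name, a blockquote summary, H2 sections of links), a minimal file might look like this; all names and URLs below are placeholders:

```text
# Example Product

> One-line description of what the product does and who it is for.

## Key pages

- [Features](https://example.com/features): what the product can do
- [Pricing](https://example.com/pricing): plans and costs
- [Docs](https://example.com/docs): setup and integration guides
```

llms-full.txt follows the same idea but inlines the full content of those pages rather than linking to them.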
Scoring Model and Aggregation
The AI Exposure Score is a weighted additive model. Each signal is checked independently; points are awarded for passing signals and withheld for failing ones. Partial credit is not awarded — signals are binary pass/fail where possible, with a small number of graduated signals (e.g., word count checked at multiple thresholds).
The total possible score is 100 points. The final score represents the percentage of points earned — a score of 72 means the site earned 72 of 100 possible points across all checked signals.
Scans are run against the live version of your site at the time of the audit. Dynamic JavaScript content is not rendered; the score reflects what AI crawlers and LLMs see when they fetch your site's HTML source directly — the same view that ChatGPT, Claude, and Perplexity use when indexing content.
All scans are passive and read-only. No authentication credentials are required or used. The scanner does not modify your site in any way.
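The weighted additive model described above can be sketched in a few lines of Python. The signal table here is a hypothetical subset for illustration, not the scanner's full 25+ signal list:

```python
# Each signal is (name, points, passed). Binary pass/fail: full points
# for a pass, zero for a fail, no partial credit.
SIGNALS = [
    ("robots.txt allows AI crawlers", 5, True),
    ("sitemap.xml present and parseable", 5, True),
    ("llms.txt exists at /llms.txt", 5, False),
    ("JSON-LD structured data", 7, True),
]


def exposure_points(signals):
    """Weighted additive model: sum the points of every passing signal."""
    return sum(points for _, points, passed in signals if passed)


print(exposure_points(SIGNALS))  # 17 of the 22 possible points in this toy table
```

In the real model the signal weights sum to 100, so the total earned is also the final 0-100 score.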
How to Improve Your Score
The highest-ROI improvements, ranked by points per hour of implementation effort:
1. Allow AI crawlers in robots.txt (up to 5 pts, ~5 min)
   Add User-agent entries for GPTBot, ClaudeBot, PerplexityBot, OAI-SearchBot, and Bytespider.
2. Add a sitemap.xml with 3+ URLs (up to 7 pts, ~10 min)
   Most frameworks generate this automatically. Verify it's accessible at /sitemap.xml.
3. Add JSON-LD SoftwareApplication schema (up to 7 pts, ~15 min)
   One <script type="application/ld+json"> block in your page head. AIExposureTool generates this for you.
4. Create a /llms.txt file (up to 10 pts, ~20 min)
   Plain text file with product name, description, target users, use cases, and key URLs. AIExposureTool generates this for you.
5. Write a clear H1 headline (up to 5 pts, ~5 min)
   State what your product does and who it's for in one sentence. Avoid generic phrases like "Welcome".
6. Add OpenGraph tags (up to 5 pts, ~10 min)
   og:title, og:description, og:image. Most meta tag libraries handle this automatically.
7. Add testimonials and quantifiable metrics (up to 10 pts, ~30 min)
   Real testimonials with names and specific outcomes. Include at least one metric (user count, scans, companies).
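The OpenGraph tags the score checks for are three meta elements in the page head. A minimal sketch, with placeholder values to replace with your own:

```html
<!-- Placeholder values; swap in your real title, description, and image URL -->
<meta property="og:title" content="Example Product" />
<meta property="og:description" content="One-sentence summary of the product." />
<meta property="og:image" content="https://example.com/og-image.png" />
```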
See your AI Exposure Score
Scan your site for free and see exactly which signals you're passing, which you're missing, and what to fix first.