Strong AI visibility, with 9 of 10 criteria passing. Biggest gap: `robots.txt` rules for AI crawlers.
Verdict
Intercom has strong technical foundations for AEO, including a robust `llms.txt`, HTTPS-first delivery, and extensive Schema.org coverage on core entity pages. The site also publishes substantial Q&A-style help content and broad internal content hubs, but it lacks dedicated FAQ endpoints at `/faq` and `/frequently-asked-questions` (both return 404). The main gap is AI-crawler specificity in `robots.txt`, which currently has only a generic `User-agent: *` policy. Strengthening bot-specific directives and adding FAQ schema on Q&A pages would likely improve AI discoverability significantly.
Scoreboard
Top Opportunities
Improve Your Score
Guides for the criteria with the most room for improvement
Most sites run a default platform `robots.txt` with zero AI-specific rules. That's not a strategy; it's an accident. Explicit Allow rules for GPTBot, ClaudeBot, and PerplexityBot signal that your content is open for citation, as sketched below.
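A minimal sketch, assuming a site that wants to open everything to these crawlers; the bot tokens are the published user agents for OpenAI, Anthropic, and Perplexity, and the Sitemap URL is a placeholder:

```txt
# Explicit allow rules for the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Fallback policy for everything else
User-agent: *
Allow: /

# Placeholder URL; point this at your real sitemap
Sitemap: https://www.example.com/sitemap.xml
```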
Our site runs 87 FAQ items across 9 categories with FAQPage schema on every one. That's not excessive; it's how we hit 88/100. Each Q&A pair is a citation opportunity AI can extract in seconds.
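For illustration, a minimal FAQPage JSON-LD sketch; the question and answer text are invented placeholders, not items from a real FAQ:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is answer engine optimization (AEO)?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "AEO is the practice of structuring content so AI assistants can find, extract, and cite it as an answer."
      }
    }
  ]
}
</script>
```

Each additional Q&A pair becomes another `Question` entry in the `mainEntity` array.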
AI assistants are question-answering machines. When your content is already shaped as questions and answers, you're handing AI a pre-formatted citation. Sites that do this right get extracted; sites that don't get skipped.
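One common shape, sketched with an invented placeholder Q&A rather than real product documentation: phrase the question as a heading and put a direct, self-contained answer immediately below it, so the pair survives extraction intact.

```html
<!-- Question as heading, concise answer directly below (placeholder text) -->
<section>
  <h2>What file formats does the export support?</h2>
  <p>CSV and JSON. Exports include every field shown in the dashboard.</p>
</section>
```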
Most AI crawlers don't run JavaScript. If your content loads after page render (behind accordions, SPAs, or API calls), you're invisible. We've seen entire FAQ sections vanish from AI's perspective because of one accordion widget.
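One way to keep accordion content crawlable, sketched under the assumption that the page is server-rendered: native `<details>`/`<summary>` elements ship the answer in the initial HTML payload even while it renders collapsed, so a crawler needs no JavaScript to read it.

```html
<!-- Collapsed accordion whose answer lives in the server-rendered markup,
     readable by non-JS crawlers; the Q&A text is a placeholder. -->
<details>
  <summary>Is my data encrypted at rest?</summary>
  <p>Yes. This answer is present in the initial HTML rather than
     injected on click, so non-JS crawlers can extract it.</p>
</details>
```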
Want us to improve your score?
We build citation-ready content that AI engines choose as the answer.