Strong AI visibility: 10 of 10 criteria passing.
Verdict
novo.co is well-prepared for AI Engine Optimization, with a live and detailed `llms.txt`, strong structured data implementation, and a dedicated FAQ experience backed by `FAQPage` JSON-LD. Core crawl infrastructure is solid, including accessible `robots.txt` and a large sitemap index. The biggest gaps are limited AI-bot specificity in `robots.txt` and a modest footprint of publishable original research assets.
Improve Your Score
Guides for the criteria with the most room for improvement
Most sites run their platform's default `robots.txt` with zero AI-specific rules. That's not a strategy; it's an accident. Explicit Allow rules for GPTBot, ClaudeBot, and PerplexityBot signal that your content is open for citation.
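A minimal sketch of what that could look like (the Sitemap URL is a placeholder, not novo.co's actual sitemap):

```
# Explicitly invite the major AI crawlers to index and cite your content
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Everyone else follows the default rules
User-agent: *
Allow: /

# Placeholder sitemap location
Sitemap: https://example.com/sitemap.xml
```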
AI has a trust hierarchy for sources. At the top: proprietary data and first-hand expert analysis. At the bottom: rewritten Wikipedia articles. We've watched AI preferentially cite sites with original benchmarks, even over bigger competitors.
A page built with `<div>` everywhere looks the same to AI as a page with no structure at all. Semantic elements such as `<main>`, `<article>`, `<section>`, and `<time>` are the markup that tells AI where your content starts, what it means, and how it's organized.
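A minimal sketch of that structure, with placeholder headings and dates:

```html
<main>
  <article>
    <h1>Placeholder article title</h1>
    <!-- Machine-readable publication date -->
    <time datetime="2025-06-01">June 1, 2025</time>
    <section>
      <h2>Placeholder section heading</h2>
      <p>Body copy lives inside clearly labeled sections,
         so a crawler can tell content from chrome.</p>
    </section>
  </article>
</main>
```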
AI systems don't cite websites; they cite entities: a verifiable business with an address, named authors, and social proof. Our self-audit (88/100) still loses points here because we lack a physical address. That's how strict this criterion is.
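A minimal sketch of entity markup that addresses this, using schema.org's Organization type; every value below is a placeholder, not Novo's real data:

```html
<!-- Placeholder values throughout; swap in the real business details -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://example.com",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "100 Placeholder Ave",
    "addressLocality": "Anytown",
    "addressRegion": "CA",
    "postalCode": "00000",
    "addressCountry": "US"
  },
  "sameAs": [
    "https://www.linkedin.com/company/example-co",
    "https://github.com/example-co"
  ]
}
</script>
```

The `address` and `sameAs` fields carry the most weight here: they are the machine-readable equivalents of the physical address and social proof this criterion asks for.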
Want us to improve your score?
We build citation-ready content that AI engines choose as the answer.