Strong AI visibility with 9 of 10 criteria passing. Biggest gap: robots.txt rules for AI crawlers.
Verdict
mailerlite.com has a strong AEO foundation with a robust llms.txt, a working sitemap index, and a dedicated FAQ hub that includes FAQPage JSON-LD. The largest performance gap is policy control for AI crawlers: robots.txt has no explicit GPTBot/ClaudeBot rules. Schema coverage appears uneven across templates, so extending Organization/Product/Breadcrumb schema beyond FAQ pages is the main opportunity to raise answer-engine visibility.
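A minimal sketch of one of those missing types, assuming a BreadcrumbList JSON-LD block embedded in the page; URLs, names, and paths are placeholders, not mailerlite.com's actual site structure:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Features", "item": "https://www.example.com/features" },
        { "@type": "ListItem", "position": 3, "name": "Email automation", "item": "https://www.example.com/features/automation" }
      ]
    }
    </script>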
Scoreboard
Top Opportunities
Improve Your Score
Guides for the criteria with the most room for improvement
Most sites run a default platform robots.txt with zero AI-specific rules. That's not a strategy; it's an accident. Explicit Allow rules for GPTBot, ClaudeBot, and PerplexityBot signal that your content is open for citation.
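A minimal sketch of what explicit AI-crawler rules could look like in robots.txt. The user-agent tokens are the ones OpenAI, Anthropic, and Perplexity document for their crawlers; the sitemap URL is a placeholder:

    User-agent: GPTBot
    Allow: /

    User-agent: ClaudeBot
    Allow: /

    User-agent: PerplexityBot
    Allow: /

    Sitemap: https://www.example.com/sitemap_index.xml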
Tidio runs 4 JSON-LD schema types. Crisp runs zero. That's not a coincidence; it's the difference between a 63 and a 34. Structured data is the machine-readable layer AI trusts most.
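As a hedged example of one additional schema type, a Product JSON-LD block could look like the sketch below; the plan name, brand, and price are illustrative placeholders, not real pricing:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Plan",
      "description": "Email marketing plan with automation and landing pages.",
      "brand": { "@type": "Brand", "name": "Example Co" },
      "offers": {
        "@type": "Offer",
        "price": "9.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>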
AI systems don't cite websites; they cite entities: a verifiable business with an address, named authors, and social proof. Our self-audit (88/100) still loses points here because we lack a physical address. That's how strict this criterion is.
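One way to make the entity machine-verifiable is an Organization JSON-LD block with a postal address and sameAs links to official profiles. All values below are placeholders; named authors would typically be marked up separately as Person objects in article-level schema:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Co",
      "url": "https://www.example.com",
      "logo": "https://www.example.com/logo.png",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Example City",
        "addressCountry": "US"
      },
      "sameAs": [
        "https://www.linkedin.com/company/example",
        "https://x.com/example"
      ]
    }
    </script>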
A page built with <div> everywhere looks the same to AI as a page with no structure at all. Semantic elements such as <main>, <article>, <section>, and <time> are the markup that tells AI where your content starts, what it means, and how it's organized.
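A minimal sketch of the same content carried by semantic elements instead of bare <div>s; headings, dates, and copy are illustrative only:

    <main>
      <article>
        <h1>How to set up email automation</h1>
        <p>Published <time datetime="2024-05-01">May 1, 2024</time></p>
        <section>
          <h2>Step 1: Choose a trigger</h2>
          <p>Start the workflow when a subscriber joins a group.</p>
        </section>
      </article>
    </main>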
Want us to improve your score?
We build citation-ready content that AI engines choose as the answer.