Weak AI visibility with 9 of 22 criteria passing. Biggest gap: RSS/Atom feed.
Verdict
runconverge.com has a mixed AEO profile with an overall score of 44/100: core technical hygiene exists, but answer extractability is weak. The site is strong on machine guidance and authority basics (llms.txt 10/10, canonical strategy 10/10, fact/data density 10/10 with 134 quantitative points), and sitemap/indexation signals are decent (7/10). However, multiple zero-score areas are suppressing AI answer pickup: direct answer paragraphs (0/10), definition patterns (0/10), list/table extractability (0/10), RSS/Atom feed (0/10), AI permissions/licensing (0/10), and author schema (0/10). With structured content formatting, deeper schema markup, and stronger freshness and author signals, the site can move from partially visible to reliably citable in AI-generated answers.
Scoreboard
Fix It With AI
Copy-paste these prompts into Claude Code or Cursor to fix each criterion.
These prompts are designed for projects where you have direct access to the codebase (Next.js, React, static HTML, WordPress, etc.). If your site runs on a hosted platform like Webflow, switch to the "Webflow" tab for platform-specific instructions. Using a different hosted platform? Contact us for custom guidance.
Top Opportunities
Improve Your Score
Guides for the criteria with the most room for improvement
Sitemaps tell crawlers what exists. RSS feeds tell them what changed. If you don't have one, your new content waits days, sometimes weeks, to be discovered.
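A minimal Atom feed is enough to signal what changed. This is a sketch only; the feed path, titles, and post URLs below are placeholders, not the site's actual content:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Converge Blog</title>
  <!-- rel="self" should point at wherever the feed is actually served -->
  <link href="https://runconverge.com/atom.xml" rel="self"/>
  <link href="https://runconverge.com/"/>
  <id>https://runconverge.com/</id>
  <updated>2025-01-15T00:00:00Z</updated>
  <entry>
    <title>Example post title</title>
    <link href="https://runconverge.com/blog/example-post"/>
    <id>https://runconverge.com/blog/example-post</id>
    <updated>2025-01-15T00:00:00Z</updated>
    <summary>One-sentence summary of the post.</summary>
  </entry>
</feed>
```

Link the feed from every page with `<link rel="alternate" type="application/atom+xml" href="/atom.xml">` so crawlers can discover it.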
Your sitemap says 500 pages exist. Our crawl finds 700. Those 200 missing URLs? AI crawlers will never know they exist.
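Closing that gap means every indexable URL appears in the sitemap. A minimal entry looks like this; the URL and date are placeholders for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> block per indexable page; <lastmod> helps crawlers prioritize -->
  <url>
    <loc>https://runconverge.com/blog/example-post</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>
```

Generate the sitemap from the same source of truth as your routes so pages can't silently fall out of it.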
Same content, three URLs, zero canonical tags. Congratulations, you just split your authority three ways and gave AI crawlers a headache.
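The fix is one line in the `<head>` of every duplicate variant, all pointing at a single preferred URL. The URL here is a hypothetical example:

```html
<!-- Emit the same canonical on https://, http://, www, and query-string variants -->
<link rel="canonical" href="https://runconverge.com/pricing" />
```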
You want AI engines to cite your content. But have you actually told them they're allowed to? Most sites haven't, and AI systems default to conservative behavior.
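One way to grant that permission explicitly is in robots.txt. The user-agent names below are the published identifiers for major AI crawlers; the blanket `Allow: /` is a sketch, and you may want narrower paths:

```text
# robots.txt — explicitly permit AI crawlers instead of relying on defaults
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
```

Pair this with an llms.txt file (which this site already scores 10/10 on) so both the permission signal and the content guide are in place.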
Want us to improve your score?
We build citation-ready content that AI engines choose as the answer.