Moderate AI visibility with 6 of 10 criteria passing. Biggest gap: llms.txt file.
Verdict
maketheroadny.org has a technically crawlable HTTPS WordPress setup with Yoast schema and a working XML sitemap index, but it lacks AI-specific governance signals. The site returns 404 for /llms.txt, and robots.txt contains only a generic User-agent rule with a crawl delay and no GPTBot or ClaudeBot directives. FAQ intent exists via /faq redirecting to a legacy article, yet there is no dedicated FAQ hub and no FAQPage/Question/Answer schema to help AI engines extract answers reliably.
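The robots.txt gap described above can be checked programmatically. A minimal sketch, assuming you already have the robots.txt body as a string (the bot list is illustrative and can be extended as new AI crawlers appear):

```python
# Check a robots.txt body for explicit AI-crawler User-agent rules.
# AI_BOTS is an illustrative list, not an exhaustive registry.
AI_BOTS = {"GPTBot", "ClaudeBot", "PerplexityBot"}

def ai_bot_rules(robots_txt: str) -> set[str]:
    """Return the AI user-agents that this robots.txt addresses explicitly."""
    found = set()
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if line.lower().startswith("user-agent:"):
            agent = line.split(":", 1)[1].strip()
            if agent in AI_BOTS:
                found.add(agent)
    return found

# A generic rule like the one on maketheroadny.org matches no AI bots:
generic = "User-agent: *\nCrawl-delay: 10\nDisallow: /wp-admin/\n"
explicit = generic + "\nUser-agent: GPTBot\nAllow: /\n"

print(ai_bot_rules(generic))   # no explicit AI rules: the gap noted above
print(ai_bot_rules(explicit))
```

A site passes this check only when each AI crawler is named in its own User-agent group; a wildcard rule alone does not count as an AI-specific directive.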
Scoreboard
Top Opportunities
Improve Your Score
Guides for the criteria with the most room for improvement
Tidio has a 251-line llms.txt. Crisp has zero. The score gap: +29 points. This single file tells AI assistants exactly what your site does, and without it, they're guessing.
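A minimal sketch of what an llms.txt could look like, following the emerging llms.txt convention (an H1 title, a one-line blockquote summary, then H2 sections of annotated links). The summary text, section names, and page URLs below are illustrative assumptions, not content pulled from the live site:

```markdown
# Make the Road New York

> One-sentence plain-language summary of what the organization does.

## Key Pages

- [About](https://maketheroadny.org/about/): who we are and what we do
- [FAQ](https://maketheroadny.org/faq/): answers to common questions

## Resources

- [Sitemap](https://maketheroadny.org/sitemap_index.xml): full index of site content
```

The file lives at the site root (/llms.txt) so AI crawlers can fetch it at a predictable path.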
Most sites run default platform robots.txt with zero AI-specific rules. That's not a strategy; it's an accident. Explicit Allow rules for GPTBot, ClaudeBot, and PerplexityBot signal that your content is open for citation.
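A sketch of the explicit rules described above, layered on top of the existing generic rule; the Sitemap URL assumes the default Yoast sitemap path:

```text
# Existing generic rule stays in place
User-agent: *
Crawl-delay: 10

# Explicit opt-in for AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://maketheroadny.org/sitemap_index.xml
```

Each crawler gets its own User-agent group, so the Allow applies to that bot specifically rather than relying on the wildcard rule.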
Our site runs 87 FAQ items across 9 categories with FAQPage schema on every one. That's not excessive; it's how we hit 88/100. Each Q&A pair is a citation opportunity AI can extract in seconds.
AI assistants are question-answering machines. When your content is already shaped as questions and answers, you're handing AI a pre-formatted citation. Sites that do this right get extracted; sites that don't get skipped.
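Marking up Q&A pairs uses the schema.org FAQPage type, embedded as JSON-LD in the page head. A minimal sketch with one placeholder entry (the question and answer text are illustrative, not from the site):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Example question goes here?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A concise, self-contained answer that an AI engine can quote directly."
      }
    }
  ]
}
```

Each item in mainEntity is one Question/Answer pair; keeping the Answer text short and self-contained is what makes it extractable without surrounding context.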
Want us to improve your score?
We build citation-ready content that AI engines choose as the answer.