Moderate AI visibility, with 6 of 10 criteria passing. Biggest gap: the missing `llms.txt` file.
Verdict
nymemorycenter.org has strong human-readable caregiver and FAQ content but weak machine-readable AI optimization. The crawl shows no `llms.txt`, minimal structured data (only a `WebSite` block on the homepage), and no AI-crawler-specific directives in `robots.txt`. Core crawlability is decent, with a valid sitemap index and server-rendered content, so the fastest gains are adding an `llms.txt`, FAQ and Organization schema, and an explicit GPTBot/ClaudeBot policy in `robots.txt`.
Scoreboard
Top Opportunities
Improve Your Score
Guides for the criteria with the most room for improvement
Tidio has a 251-line `llms.txt`. Crisp has none. The score gap: +29 points. This single file tells AI assistants exactly what your site does; without it, they're guessing.
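A minimal sketch of what an `llms.txt` at the site root could look like, following the llms.txt proposal's layout (an H1 title, a one-line summary blockquote, then H2 sections of annotated links). The section names, paths, and descriptions below are illustrative placeholders, not pages confirmed by the crawl:

```markdown
# New York Memory Center

> Memory-care programs, caregiver support, and dementia FAQs for families in New York.

<!-- The paths below are placeholders; replace them with real URLs from the sitemap. -->

## Programs
- [Adult day program](https://nymemorycenter.org/programs/): overview of services and eligibility

## Caregiver resources
- [Caregiver FAQ](https://nymemorycenter.org/faq/): answers to common questions about memory loss and care planning

## Contact
- [Contact and intake](https://nymemorycenter.org/contact/): how to reach staff and schedule a visit
```

Serve it as plain text at /llms.txt so assistants can fetch a site summary without parsing every page.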
Tidio runs 4 JSON-LD schema types. Crisp runs zero. That's not a coincidence; it's the difference between a 63 and a 34. Structured data is the machine-readable layer AI trusts most.
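As a sketch of that missing layer, the existing FAQ content could be marked up as a FAQPage block alongside an Organization block; the question and answer text here are placeholders to swap for the site's visible FAQ copy:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "name": "New York Memory Center",
      "url": "https://nymemorycenter.org/"
    },
    {
      "@type": "FAQPage",
      "mainEntity": [
        {
          "@type": "Question",
          "name": "Placeholder caregiver question?",
          "acceptedAnswer": {
            "@type": "Answer",
            "text": "Placeholder answer, copied verbatim from the FAQ entry shown on the page."
          }
        }
      ]
    }
  ]
}
</script>
```

Keep the JSON-LD in sync with the visible page text; markup that doesn't match the page risks being ignored.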
Most sites run a default platform `robots.txt` with zero AI-specific rules. That's not a strategy; it's an accident. Explicit Allow rules for GPTBot, ClaudeBot, and PerplexityBot signal that your content is open for citation.
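A sketch of the AI-crawler section that could be appended to the existing `robots.txt`; the bot names are the crawlers' published user-agent tokens, and the sitemap path is an assumption to replace with the site's actual index URL:

```txt
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Assumed default path; point this at the real sitemap index
Sitemap: https://nymemorycenter.org/sitemap.xml
```

Leave any existing Disallow rules intact; this block only adds explicit policy for the AI user agents.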
AI has a trust hierarchy for sources. At the top: proprietary data and first-hand expert analysis. At the bottom: rewritten Wikipedia articles. We've watched AI preferentially cite sites with original benchmarks, even over bigger competitors.
Want us to improve your score?
We build citation-ready content that AI engines choose as the answer.