Content Freshness Signals
No date on your page? AI engines treat it like a rumor: undated and deprioritized. Here's how we audit whether your timestamps are actually machine-readable.
Questions this article answers
- How do I add machine-readable dates to my pages for AI engines?
- Does missing dateModified hurt my chances of being cited by Perplexity or ChatGPT?
- What is the best way to add timestamps so AI crawlers can read them?
Quick Answer
Content freshness measures whether pages include datePublished and dateModified in both JSON-LD and HTML time elements. Pages without timestamps are treated as undated by AI engines, and for time-sensitive queries, undated means invisible.
Before & After
Before - No machine-readable dates
<article>
<p>Published January 2026</p>
<p>How to improve your AEO score...</p>
</article>
After - Dates in JSON-LD and HTML <time>
<article>
<time datetime="2026-01-15">January 15, 2026</time>
<p>How to improve your AEO score...</p>
</article>
<script type="application/ld+json">
{ "@type": "Article", "datePublished": "2026-01-15", "dateModified": "2026-02-01" }
</script>
What This Actually Measures
We're measuring whether your pages carry machine-readable dates that AI engines can parse and trust. Not "does a date appear on the page?" but "can a machine extract it?" Three distinct timestamp layers get examined: JSON-LD datePublished and dateModified within Article, BlogPosting, or WebPage schemas; HTML <time> elements with valid datetime attributes; and HTTP Last-Modified response headers.
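The first two layers appear in the Before & After example above; the third lives in the server response. Here is a minimal sketch of checking it, assuming Python with the requests library and a placeholder URL:

from email.utils import parsedate_to_datetime

import requests

# A HEAD request is enough to read the Last-Modified header; the URL is a placeholder.
resp = requests.head("https://example.com/blog/aeo-guide", allow_redirects=True)
last_modified = resp.headers.get("Last-Modified")

if last_modified:
    # Last-Modified uses the RFC 7231 date format, not ISO 8601.
    print("Server reports:", parsedate_to_datetime(last_modified).isoformat())
else:
    print("No Last-Modified header: this freshness layer is missing.")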
The core metric is the "freshness coverage ratio": the percentage of content pages (excluding static pages like contact and privacy) that include at least one machine-readable timestamp. A secondary metric, "freshness accuracy," checks whether the timestamps are plausible. A dateModified that precedes datePublished? Future publication dates? A lastmod untouched for 3+ years on clearly updated content? All count as inaccurate signals.
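As a rough illustration, assuming a crawl result shaped as a list of page records (the structure here is hypothetical), the coverage ratio reduces to a simple proportion:

# Freshness coverage ratio: share of content pages carrying at least one
# machine-readable timestamp. `pages` is a hypothetical crawl result.
pages = [
    {"url": "/blog/aeo-guide", "type": "content", "timestamps": ["2026-01-15"]},
    {"url": "/contact", "type": "static", "timestamps": []},
    {"url": "/blog/old-post", "type": "content", "timestamps": []},
]

content_pages = [p for p in pages if p["type"] == "content"]
dated = [p for p in content_pages if p["timestamps"]]
coverage = len(dated) / len(content_pages) if content_pages else 0.0
print(f"Freshness coverage: {coverage:.0%}")  # 50% in this toy example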
We also measure "timestamp consistency" across the three layers. When JSON-LD says the article was modified January 15 but the HTTP header says December 3, AI engines get conflicting signals. The consistency score tracks how often the sources agree (within a 24-hour tolerance).
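A sketch of that 24-hour tolerance check, assuming the three layers have already been parsed into datetime objects (the values below are illustrative):

from datetime import datetime, timedelta

# Timestamps from the three layers for one page.
sources = {
    "json_ld": datetime(2026, 1, 15, 9, 0),
    "html_time": datetime(2026, 1, 15, 0, 0),
    "http_header": datetime(2025, 12, 3, 14, 30),  # the conflicting signal
}

values = list(sources.values())
consistent = max(values) - min(values) <= timedelta(hours=24)
print("Consistent within 24h:", consistent)  # False: the header disagrees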
This is critical for industries where information changes fast: healthcare, finance, tech, legal. Perplexity explicitly shows "last updated" in citations. Pages without dates get deprioritized or flagged with a warning that the information may be outdated.
Why Missing Dates Hurt Your Whole Domain
AI answer engines use recency as a ranking signal when multiple sources answer the same query. For time-sensitive queries like "best live chat software 2026" or "current patient advocacy regulations," undated content gets systematically deprioritized. If your competitors publish dated articles updated within the last 90 days and your equivalent content has no dates at all, the AI picks them.
The penalty compounds at the domain level. When AI engines observe that a domain consistently lacks timestamps, they lower the domain's overall freshness trust score. Even your genuinely current content gets disadvantaged because the domain pattern suggests poor date hygiene.
Freshness signals hit Google AI Overviews directly. Google's systems use dateModified to decide whether to include a source. Pages with recent modification dates are weighted higher for queries where recency matters. A site where 80% of pages carry no modification date is telling Google: "I can't confirm when any of this was last verified."
There's a deceptive practice risk too: sites that update dateModified without making real content changes are "timestamp farming." We cross-reference dateModified against actual content changes between crawl snapshots to catch this. Honest freshness signals, where the date reflects meaningful updates, build long-term trust.
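One way to catch this, sketched under the assumption that hashed copies of page bodies are kept between crawls (the snapshot records below are hypothetical):

import hashlib

def body_hash(html_body: str) -> str:
    # Hash the page body so content changes can be compared across crawls.
    return hashlib.sha256(html_body.encode("utf-8")).hexdigest()

previous = {"date_modified": "2026-01-01", "hash": body_hash("<p>Same text</p>")}
current = {"date_modified": "2026-02-01", "hash": body_hash("<p>Same text</p>")}

if current["date_modified"] != previous["date_modified"] and current["hash"] == previous["hash"]:
    print("Suspicious: dateModified bumped without a real content change.")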
How We Check This
Every crawled page gets classified into one of three categories: content pages (articles, blog posts, guides, product pages), structural pages (category listings, tag archives, search results), and static pages (about, contact, privacy, terms). Only content pages are included in the freshness calculation -static pages aren't expected to carry publication dates.
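A simple path-based heuristic gives the flavor of this classification; the patterns below are illustrative, not the audit's actual rules:

# Rough page classification by URL path; these patterns are illustrative heuristics.
STATIC = {"/about", "/contact", "/privacy", "/terms"}
STRUCTURAL = ("/category/", "/tag/", "/search")

def classify(path: str) -> str:
    if path.rstrip("/") in STATIC:
        return "static"
    if any(path.startswith(prefix) for prefix in STRUCTURAL):
        return "structural"
    return "content"

print(classify("/blog/aeo-guide"))  # content
print(classify("/tag/freshness"))   # structural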
For each content page, timestamps get extracted from all available sources: JSON-LD properties from Article, BlogPosting, NewsArticle, and WebPage schemas (specifically datePublished, dateModified, and dateCreated); HTML <time> elements with their datetime attributes; the HTTP Last-Modified header from the server response; and meta tags like <meta property="article:published_time"> and <meta property="article:modified_time">.
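A condensed extraction sketch for the HTML-side sources, assuming BeautifulSoup; a production crawler needs far more defensive parsing than this:

import json

from bs4 import BeautifulSoup

html = """
<article><time datetime="2026-01-15">January 15, 2026</time></article>
<meta property="article:modified_time" content="2026-02-01T08:00:00+00:00">
<script type="application/ld+json">
{"@type": "Article", "datePublished": "2026-01-15", "dateModified": "2026-02-01"}
</script>
"""

soup = BeautifulSoup(html, "html.parser")
found = []

# JSON-LD: datePublished, dateModified, dateCreated.
for script in soup.find_all("script", type="application/ld+json"):
    data = json.loads(script.string or "{}")
    for key in ("datePublished", "dateModified", "dateCreated"):
        if key in data:
            found.append((f"json-ld:{key}", data[key]))

# HTML <time> elements with a datetime attribute.
for t in soup.find_all("time", datetime=True):
    found.append(("html:time", t["datetime"]))

# Open Graph-style article meta tags.
for meta in soup.find_all("meta", property=True):
    if meta["property"] in ("article:published_time", "article:modified_time"):
        found.append((f"meta:{meta['property']}", meta.get("content")))

print(found)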
Each timestamp is validated for format (ISO 8601), plausibility (not in the future, not before the domain's registration date), and cross-source consistency. We flag specific issues: missing dateModified when datePublished is present (suggests the content was never updated), dateModified identical to datePublished across all pages (that's a template default, not actual tracking), and timestamps differing by more than 24 hours across sources.
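A minimal validation pass, assuming Python's datetime module; the domain registration date is a placeholder value:

from datetime import datetime, timezone

DOMAIN_REGISTERED = datetime(2015, 6, 1, tzinfo=timezone.utc)  # placeholder

def parse_iso(value: str) -> datetime | None:
    # Accept ISO 8601 dates and datetimes; treat naive values as UTC.
    try:
        dt = datetime.fromisoformat(value)
    except ValueError:
        return None
    return dt if dt.tzinfo else dt.replace(tzinfo=timezone.utc)

def timestamp_issues(published: str, modified: str | None) -> list[str]:
    problems = []
    pub = parse_iso(published)
    if pub is None:
        return ["datePublished is not valid ISO 8601"]
    now = datetime.now(timezone.utc)
    if pub > now:
        problems.append("datePublished is in the future")
    if pub < DOMAIN_REGISTERED:
        problems.append("datePublished precedes domain registration")
    if modified is None:
        problems.append("dateModified missing (content never updated?)")
    else:
        mod = parse_iso(modified)
        if mod is None:
            problems.append("dateModified is not valid ISO 8601")
        elif mod < pub:
            problems.append("dateModified precedes datePublished")
    return problems

print(timestamp_issues("2026-02-01", "2026-01-15"))
# flags dateModified preceding datePublished (plus a future-date flag if run before 2026)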
The results include a histogram showing the age distribution of your content: how many pages were last modified within 30 days, 90 days, 6 months, 1 year, and beyond. This reveals whether your site maintains active content or has a massive "stale tail" of aging pages that haven't been touched.
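The bucketing itself is straightforward; a sketch with illustrative cutoffs (6 months approximated as 182 days) and placeholder crawl data:

from collections import Counter
from datetime import datetime, timezone

def age_bucket(modified: datetime, now: datetime) -> str:
    days = (now - modified).days
    if days <= 30:
        return "<= 30 days"
    if days <= 90:
        return "<= 90 days"
    if days <= 182:
        return "<= 6 months"
    if days <= 365:
        return "<= 1 year"
    return "older than 1 year"

now = datetime.now(timezone.utc)
last_modified = [  # placeholder crawl data
    datetime(2026, 2, 1, tzinfo=timezone.utc),
    datetime(2024, 5, 10, tzinfo=timezone.utc),
]

print(Counter(age_bucket(d, now) for d in last_modified))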
How We Score It
Freshness scoring combines three sub-metrics:
1. Freshness coverage (40% of score):
- 90-100% of content pages have at least one valid timestamp: 4/4 points
- 70-89%: 3/4 points
- 50-69%: 2/4 points
- 30-49%: 1/4 points
- Below 30%: 0/4 points
2. Timestamp accuracy and plausibility (30% of score):
- All timestamps pass validation: 3/3 points
- Less than 10% have issues (future dates, impossible sequences): 2/3 points
- 10-25% have issues: 1/3 points
- More than 25% have issues: 0/3 points
3. Cross-source consistency (30% of score):
- JSON-LD, HTML, and HTTP timestamps agree within 24 hours on 90%+ of pages: 3/3 points
- Agreement on 70-89% of pages: 2/3 points
- Agreement on 50-69% of pages: 1/3 points
- Below 50% agreement, or only one timestamp source present: 0/3 points
Deductions:
- -1 point if more than 50% of pages have dateModified identical to datePublished (no real update tracking)
- -1 point if no pages use the HTML <time> element (missing the visible date layer)
Total normalized to 0-10. Sites relying exclusively on CMS-generated dates without structured data integration typically land between 2 and 4.
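Putting the rubric together, a sketch of the arithmetic (thresholds copied from the bands above; the two deductions are reduced to booleans for brevity):

def band(value: float, bands: list[tuple[float, int]]) -> int:
    # Return the points for the first threshold the value meets or exceeds.
    for threshold, points in bands:
        if value >= threshold:
            return points
    return 0

def freshness_score(coverage: float, issue_rate: float, consistency: float,
                    template_default: bool, no_time_element: bool) -> int:
    score = band(coverage, [(0.90, 4), (0.70, 3), (0.50, 2), (0.30, 1)])
    if issue_rate == 0:
        score += 3
    elif issue_rate < 0.10:
        score += 2
    elif issue_rate <= 0.25:
        score += 1
    score += band(consistency, [(0.90, 3), (0.70, 2), (0.50, 1)])
    if template_default:   # >50% of pages with dateModified identical to datePublished
        score -= 1
    if no_time_element:    # no HTML <time> elements anywhere on the site
        score -= 1
    return max(0, min(10, score))

print(freshness_score(0.75, 0.05, 0.60, template_default=True, no_time_element=False))
# 3 (coverage) + 2 (accuracy) + 1 (consistency) - 1 (deduction) = 5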
Key Takeaways
- Add datePublished and dateModified in JSON-LD Article schema on every content page.
- Use HTML <time> elements with datetime attributes for visible dates so both humans and machines can parse them.
- Keep timestamps consistent across JSON-LD, HTML, and HTTP headers; conflicting dates erode trust.
- Update dateModified only when you make real content changes; timestamp farming backfires.
How does your site score on this criterion?
Get a free AEO audit and see where you stand across all 10 criteria.