Content Publishing Velocity
You published a great blog post in January. It's now February and nothing else has appeared. AI engines notice, and they're crawling less often because of it.
Questions this article answers
- How often should I publish new content to stay visible to AI engines?
- Does publishing frequency affect how often AI crawlers visit my site?
- What is a good content velocity for AI search optimization?
Quick Answer
Content velocity tracks new pages and substantial updates over a rolling 90-day window. Sites publishing at least weekly maintain stronger freshness signals. We measure velocity by analyzing sitemap lastmod dates, RSS feed frequency, and crawl-detected content changes.
Before & After
Before: Sporadic publishing
Sitemap lastmod dates:
- /blog/post-1: 2025-09-12
- /blog/post-2: 2025-11-03
- /blog/post-3: 2026-01-18
(3 posts in 4 months, no consistency)
After: Consistent weekly cadence
Sitemap lastmod dates:
- /blog/post-10: 2026-02-14
- /blog/post-9: 2026-02-07
- /blog/post-8: 2026-01-31
- /blog/post-7: 2026-01-24
(Weekly publishing, plus an active RSS feed)
What This Actually Measures
We're measuring the rate at which your site produces new content and substantially updates existing content. Unlike content freshness (which checks whether timestamps exist), velocity quantifies the actual pace of content activity. The core metric is "content events per week": new page publications plus significant updates over a rolling 90-day window.
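As a concrete illustration, here is a minimal sketch of how "content events per week" could be computed from a list of event dates; the function name and the example data are hypothetical.

```python
from datetime import date, timedelta

def events_per_week(event_dates: list[date], today: date, window_days: int = 90) -> float:
    """Normalize content events (new pages + substantial updates) inside a
    rolling window to a per-week rate."""
    window_start = today - timedelta(days=window_days)
    in_window = [d for d in event_dates if window_start <= d <= today]
    return len(in_window) / (window_days / 7)

# Example: 18 events inside the last 90 days -> roughly 1.4 events per week
print(events_per_week([date(2026, 2, 14)] * 18, today=date(2026, 2, 20)))
```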
We distinguish three types of content events. New publications: pages appearing for the first time within the window, detected through sitemap additions, RSS entries, and crawl snapshot comparisons. Substantial updates: existing pages where more than 15% of body text changed, detected through content hashing between snapshots and dateModified changes. Trivial updates: minor changes (footer updates, nav changes, CSS) that don't represent real content refreshes; these are excluded.
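A sketch of that classification, assuming a change fraction has already been computed from snapshot diffs; the enum and function names are illustrative, and the 15% threshold follows the definition above.

```python
from enum import Enum

class ContentEvent(Enum):
    NEW_PUBLICATION = "new"         # page appears for the first time in the window
    SUBSTANTIAL_UPDATE = "update"   # more than 15% of body text changed
    TRIVIAL_UPDATE = "trivial"      # footer/nav/CSS noise, excluded from velocity

def classify_event(previous_text: str | None, change_fraction: float,
                   threshold: float = 0.15) -> ContentEvent:
    """Map a detected page change to one of the three event types."""
    if previous_text is None:
        return ContentEvent.NEW_PUBLICATION
    if change_fraction > threshold:
        return ContentEvent.SUBSTANTIAL_UPDATE
    return ContentEvent.TRIVIAL_UPDATE
```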
The 90-day rolling window provides a balanced measurement. Thirty days is too susceptible to seasonal variation. Six months masks recent changes in behavior. Ninety days captures roughly one quarter -enough to represent sustained effort while still reflecting recent momentum.
Velocity is also measured relative to total content inventory. A 10,000-page site publishing 2 articles per week has a relative velocity of 0.02%. A 50-page site publishing 2 per week has a relative velocity of 4%. The relative metric contextualizes whether the pace is meaningful for the site's scale.
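A quick sketch of that relative-velocity calculation, reproducing the two figures above (the function name is illustrative):

```python
def relative_velocity(events_per_week: float, total_pages: int) -> float:
    """Weekly content events expressed as a percentage of total inventory."""
    return 100 * events_per_week / total_pages

print(relative_velocity(2, 10_000))  # 0.02 -> large site, modest relative pace
print(relative_velocity(2, 50))      # 4.0  -> small site, meaningful relative pace
```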
Why Dormant Sites Get Crawled Less
AI engines use content velocity as a domain-level freshness signal. A domain that publishes regularly tells AI systems it's actively maintained, its information is current, and it will keep providing up-to-date content. Dormant domains (sites that haven't published in months) get progressively lower freshness trust scores.
The velocity signal directly affects crawl frequency. AI crawlers allocate budget partly based on how often a site changes. Publish frequently and you get crawled frequently, so new content is discovered and indexed faster. Publish rarely and you get crawled rarely, so there's a longer delay between publication and indexation. This creates a feedback loop: higher velocity → faster indexing → more timely AI citations → stronger authority signal.
Velocity also affects competitive positioning for time-sensitive queries. When multiple sites cover the same topic and one publishes weekly while another hasn't published in 6 months, AI engines heavily favor the active publisher. In fast-moving industries (tech, SaaS, healthcare, finance), velocity is often the deciding factor.
But velocity without quality is counterproductive. Publishing 10 thin, low-value pages per week performs worse than publishing 2 substantive, fact-dense articles. AI engines detect filler: pages with low word count, no original data, and minimal information gain. We measure *substantive* velocity: content events representing genuine additions to the site's knowledge base.
How We Check This
We combine three data sources, each offering a different perspective on publishing activity.
Source one: sitemap temporal analysis. We download the current sitemap and extract all lastmod timestamps. Grouping URLs by lastmod date constructs a publishing timeline: how many pages were created or modified each week during the observation window. This is the most complete method for sites with accurate lastmod tracking, but it depends entirely on lastmod accuracy. Flat or absent lastmod values produce no useful velocity data.
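A minimal sketch of this kind of sitemap analysis, assuming a standard XML sitemap and using Python's standard library plus the requests package; the URL is a placeholder.

```python
import requests
from collections import Counter
from datetime import datetime
from xml.etree import ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def weekly_lastmod_counts(sitemap_url: str) -> Counter:
    """Group sitemap <lastmod> dates by ISO week to build a publishing timeline."""
    xml = requests.get(sitemap_url, timeout=30).text
    root = ET.fromstring(xml)
    weeks = Counter()
    for node in root.findall(".//sm:lastmod", SITEMAP_NS):
        # lastmod may be a bare date or a full W3C datetime; keep the date part
        day = datetime.fromisoformat(node.text.strip()[:10]).date()
        weeks[day.isocalendar()[:2]] += 1   # key: (ISO year, ISO week)
    return weeks

# timeline = weekly_lastmod_counts("https://example.com/sitemap.xml")
```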
Source two: RSS feed analysis. We parse the feed and extract pubDate values from all items. This gives a reliable publication timeline for recent content (limited by feed item count, typically 10-100). RSS velocity is highly accurate for new publications but doesn't capture updates to existing pages.
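A sketch using the widely available feedparser library (an assumption about tooling, not necessarily what the audit uses); as noted, this only sees the items the feed currently exposes.

```python
import feedparser
from datetime import date

def rss_publication_dates(feed_url: str) -> list[date]:
    """Extract publication dates from the items an RSS/Atom feed exposes."""
    feed = feedparser.parse(feed_url)
    dates = []
    for entry in feed.entries:
        parsed = entry.get("published_parsed") or entry.get("updated_parsed")
        if parsed:  # a time.struct_time; the first three fields are year, month, day
            dates.append(date(*parsed[:3]))
    return dates

# pub_dates = rss_publication_dates("https://example.com/feed.xml")
```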
Source three: crawl comparison. When historical crawl data exists from previous runs, we compare content hashes between snapshots to detect which pages changed, then measure how much of the body text differs. Pages where more than the threshold (15% of body text) changed are classified as substantially updated. New URLs not present in the previous crawl are new publications. This requires at least two snapshots separated by a known interval.
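A sketch of that snapshot comparison: hashes serve as a cheap equality check, then a text diff estimates the changed fraction against the 15% threshold. The helper names and the similarity measure (difflib) are illustrative choices.

```python
import hashlib
from difflib import SequenceMatcher

def change_fraction(old_text: str, new_text: str) -> float:
    """Approximate share of body text that changed between two snapshots."""
    if hashlib.sha256(old_text.encode()).digest() == hashlib.sha256(new_text.encode()).digest():
        return 0.0  # identical content; skip the expensive diff
    return 1.0 - SequenceMatcher(None, old_text, new_text).ratio()

def diff_snapshots(previous: dict[str, str], current: dict[str, str],
                   threshold: float = 0.15) -> tuple[list[str], list[str]]:
    """Split current URLs into new publications and substantially updated pages."""
    new_pages = [u for u in current if u not in previous]
    updated = [u for u in current if u in previous
               and change_fraction(previous[u], current[u]) > threshold]
    return new_pages, updated
```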
We reconcile the three sources into a unified metric. When they disagree (for example, the sitemap shows a page modified last week but the crawl comparison shows no content change), we flag a potential timestamp inaccuracy. Reconciliation weights crawl comparison highest (direct observation), then RSS (explicit publication signals), then sitemap lastmod (which may be auto-generated or stale).
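One way the weighting and the discrepancy flag might look in code; the numeric weights are illustrative, and only the ordering (crawl over RSS over sitemap) comes from the text above.

```python
# Illustrative weights: crawl observation trusted most, sitemap lastmod least.
SOURCE_WEIGHTS = {"crawl": 1.0, "rss": 0.7, "sitemap": 0.4}

def reconcile(signals: dict[str, bool]) -> tuple[float, bool]:
    """Combine per-source 'changed recently' signals into a weighted score and
    flag the case where only the sitemap claims a change."""
    score = sum(SOURCE_WEIGHTS[s] for s, changed in signals.items() if changed)
    suspect_lastmod = bool(signals.get("sitemap")) and signals.get("crawl") is False
    return score, suspect_lastmod

# reconcile({"sitemap": True, "rss": False, "crawl": False})
# -> (0.4, True): likely an auto-regenerated lastmod, not a real content event
```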
The audit also categorizes content events by page type (blog posts, product pages, knowledge base articles) to reveal which content categories are actively maintained and which are dormant.
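Page-type categorization can be as simple as URL-pattern matching; the patterns below are placeholders, since real path conventions vary by site.

```python
import re

# Placeholder path patterns; real categorization is site-specific.
PAGE_TYPE_PATTERNS = {
    "blog": re.compile(r"^/blog/"),
    "product": re.compile(r"^/products?/"),
    "knowledge_base": re.compile(r"^/(kb|help|docs)/"),
}

def categorize(path: str) -> str:
    """Assign a URL path to a content category for per-category velocity stats."""
    for page_type, pattern in PAGE_TYPE_PATTERNS.items():
        if pattern.search(path):
            return page_type
    return "other"
```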
How We Score It
Velocity scoring evaluates absolute velocity, consistency, and update quality:
1. Absolute velocity (4 points):
   - 4+ content events per week over 90 days: 4/4 points
   - 2-3 per week: 3/4 points
   - 1 per week: 2/4 points
   - 1-3 per month: 1/4 points
   - Fewer than 1 per month, or no detectable activity: 0/4 points
2. Consistency (3 points):
   - Activity in at least 10 of the 13 weeks: 3/3 points
   - Activity in 7-9 weeks (minor gaps): 2/3 points
   - Activity in 4-6 weeks (sporadic): 1/3 points
   - Activity in fewer than 4 weeks (sprint-and-silence): 0/3 points
3. Update quality (3 points):
   - Substantial updates (15%+ content change) AND new pages: 3/3 points
   - Either substantial updates OR new pages, not both: 2/3 points
   - Only trivial updates (nav, footer, CSS): 1/3 points
   - No updates or new pages across any source: 0/3 points

Deductions:
- -1 point if the only velocity signal comes from sitemap lastmod without crawl-confirmed changes (suggests automatic lastmod regeneration)
- -0.5 points if more than 50% of new pages have fewer than 300 words (thin content padding)
Special case: Sites with fewer than 20 total pages get a modified rubric: 1 event per week is exceptional for a small site, but consistency and substance still matter.
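The rubric translates fairly directly into a scoring function. This is a sketch under the stated thresholds; the parameter names are illustrative, and "1-3 per month" is approximated as at least one event per month.

```python
def score_velocity(events_per_week: float, active_weeks: int,
                   has_substantial_updates: bool, has_new_pages: bool,
                   has_trivial_updates_only: bool,
                   sitemap_only_signal: bool, thin_page_share: float) -> float:
    """Score absolute velocity (4) + consistency (3) + update quality (3), minus deductions."""
    # 1. Absolute velocity (out of 4)
    if events_per_week >= 4:
        velocity = 4
    elif events_per_week >= 2:
        velocity = 3
    elif events_per_week >= 1:
        velocity = 2
    elif events_per_week >= 7 / 30:   # roughly 1+ event per month
        velocity = 1
    else:
        velocity = 0

    # 2. Consistency (out of 3): weeks with activity inside the 13-week window
    if active_weeks >= 10:
        consistency = 3
    elif active_weeks >= 7:
        consistency = 2
    elif active_weeks >= 4:
        consistency = 1
    else:
        consistency = 0

    # 3. Update quality (out of 3)
    if has_substantial_updates and has_new_pages:
        quality = 3
    elif has_substantial_updates or has_new_pages:
        quality = 2
    elif has_trivial_updates_only:
        quality = 1
    else:
        quality = 0

    score = velocity + consistency + quality
    if sitemap_only_signal:     # lastmod-only signal, no crawl confirmation
        score -= 1
    if thin_page_share > 0.5:   # more than half of new pages under 300 words
        score -= 0.5
    return max(score, 0.0)
```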
Most actively maintained business sites score 4-7. Content marketing operations with editorial calendars score 7-10. Sites without a content strategy score 0-3.
Key Takeaways
- Publish at least weekly to maintain strong freshness signals; 4+ content events per week is the top tier.
- Consistency beats volume: activity in 10+ of 13 weeks outperforms a sprint-and-silence pattern.
- Substantial updates (15%+ content change) count as content events, so you do not need new pages every time.
- Thin content padding (pages under 300 words) gets penalized; quality velocity is what matters.
- Dormant sites get crawled less often, creating a negative feedback loop that compounds over time.
How does your site score on this criterion?
Get a free AEO audit and see where you stand across all 10 criteria.