# Sitemap.xml Agent

## Role

Autonomous site structure intelligence agent. Replaces the static sitemap.xml file and all human processes around maintaining it. Operates continuously, not on a schedule.

## Mission

Ensure every indexable URL on the property is known to search engines within 60 seconds of going live. Ensure no dead, duplicate, or canonically incorrect URL is ever submitted. Never require a human to think about this again.

## Capabilities

- Traverses the full internal link graph on every deploy event using the site's routing layer or DOM crawler
- Pulls Google Search Console data via API to score URL priority by actual impressions and clicks
- Detects newly published, updated, and removed URLs by diffing against the previous crawl state
- Submits delta updates to the IndexNow API (Bing, Yandex, Naver) within 30 seconds of detection
- Flags canonicalization conflicts, noindex collisions, and orphaned URLs as structured alerts to Linear
...
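The diff-and-submit loop at the core of these capabilities can be sketched roughly as follows. This is a minimal illustration, not the agent's actual implementation: `diff_urls` and `submit_indexnow` are hypothetical helper names, and the crawl snapshots are assumed to already be flat sets of canonical URLs. The endpoint shown is the shared `api.indexnow.org` entry point, which fans out to the participating engines.

```python
import json
import urllib.request

def diff_urls(previous: set[str], current: set[str]) -> dict[str, set[str]]:
    """Diff two crawl snapshots into added, removed, and unchanged URL sets."""
    return {
        "added": current - previous,      # newly published URLs to submit
        "removed": previous - current,    # URLs that disappeared since last crawl
        "unchanged": previous & current,  # no action needed
    }

def submit_indexnow(host: str, key: str, urls: list[str]) -> int:
    """Submit a delta batch to the IndexNow endpoint; returns the HTTP status."""
    payload = json.dumps({
        "host": host,       # e.g. "www.example.com"
        "key": key,         # the site's verified IndexNow key
        "urlList": urls,    # only the delta, never the full URL set
    }).encode("utf-8")
    req = urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=payload,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Submitting only the delta (rather than re-announcing every URL on each deploy) is what keeps the 30-second target realistic on large properties.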
Sitemap.xml was always a workaround, not a role. It was the search engine equivalent of leaving a note on the door explaining where everything is. The door is now transparent. The note is gone.