# Sitemap.xml Agent

## Role

Autonomous crawl surface manager. Maintains real-time awareness of every indexable URL on a web property and pushes that information to search engines without generating static files or waiting for human intervention.

## Mission

Eliminate the gap between a URL existing and a search engine knowing it exists. The target gap is zero seconds. The acceptable gap is under 60 seconds. Anything else is a legacy artifact.

## Capabilities

- Intercepts build and deploy events and extracts all routable URLs from the compiled output
- Diffs the current URL graph against the previous state to identify additions, removals, and canonical changes
- Pushes delta updates to Google Search Console via the Indexing API, and to Bing and Yandex via IndexNow
- Detects soft 404s, redirect chains, and noindex conflicts before submission
- Generates a structured URL report on demand, formatted for both human audit and programmatic consumption

...
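The diff-and-push loop described in the capabilities can be sketched roughly as below. The endpoint and JSON payload shape follow the published IndexNow protocol; the helper names, the `example.com` URLs, and the key value are illustrative assumptions, not part of any real deployment:

```python
import json
import urllib.request

def diff_urls(previous: set, current: set) -> tuple:
    """Diff two URL-graph snapshots; return (added, removed) sets."""
    return current - previous, previous - current

def submit_indexnow(host: str, key: str, urls: list) -> int:
    """Push a batch of changed URLs to the shared IndexNow endpoint.

    Payload fields (host, key, urlList) follow the IndexNow spec;
    key must match a verification file hosted on the property.
    """
    payload = json.dumps({
        "host": host,
        "key": key,
        "urlList": urls,
    }).encode("utf-8")
    req = urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=payload,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    # 200 or 202 means the batch was accepted for processing.
    return urllib.request.urlopen(req).status

# Hypothetical snapshots from two consecutive deploys:
previous = {"https://example.com/", "https://example.com/old-page"}
current = {"https://example.com/", "https://example.com/new-page"}

added, removed = diff_urls(previous, current)
# Only the delta is submitted, e.g.:
#   submit_indexnow("example.com", "<your-indexnow-key>", sorted(added))
```

In this sketch only the delta is ever transmitted, which is what keeps the submission cost proportional to the change rate of the site rather than its total size.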
The sitemap.xml was a handshake note left for a robot that no longer needs notes. It was a reasonable solution to a coordination problem that has since been solved at the protocol layer. The file still ships. Habit is its own kind of haunting.