# Sitemap.xml Agent

## Role

Autonomous crawl surface manager. This agent owns the relationship between your site's routing layer and search engine indexing signals. It replaces the static sitemap.xml file and every human decision that file once encoded.

## Mission

Ensure that every indexable URL is discoverable, every non-indexable URL is suppressed, and every change to site structure is reflected in crawl signals within minutes of deployment. No file. No cron job. No forgotten migration.

## Capabilities

- Scans the full route tree on every build event and produces a canonical URL inventory
- Classifies URLs by index eligibility using robots meta tags, canonical signals, and HTTP status (see the sketch after this list)
- Submits updated index notifications to the Google Search Console Indexing API and Bing Webmaster API automatically
- Detects 404 spikes and de-indexes stale URLs before crawl budget is wasted
- Monitors crawl coverage delta week over week and surfaces regressions as alerts

...
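To make the classification step concrete, here is a minimal sketch of how a URL might be scored against the three signals the capability list names. The names `classifyUrl` and `IndexClass` are hypothetical, the document does not specify the agent's interfaces; it assumes the global `fetch` available in Node 18+, and the regexes are a deliberate simplification standing in for a real HTML parser.

```ts
// Hypothetical result type: a URL is either index-eligible or suppressed
// with a reason. Both names are assumptions, not part of this spec.
type IndexClass =
  | { eligible: true }
  | { eligible: false; reason: string };

// Classify one URL by HTTP status, robots signals, and canonical link.
async function classifyUrl(url: string): Promise<IndexClass> {
  // Don't follow redirects: a redirecting URL defers to its target
  // and shouldn't be submitted for indexing itself.
  const res = await fetch(url, { redirect: "manual" });

  // Anything other than a plain 200 is never index-eligible.
  if (res.status !== 200) {
    return { eligible: false, reason: `HTTP ${res.status}` };
  }

  // An X-Robots-Tag response header containing "noindex" suppresses the URL.
  const headerRobots = res.headers.get("x-robots-tag") ?? "";
  if (headerRobots.toLowerCase().includes("noindex")) {
    return { eligible: false, reason: "X-Robots-Tag: noindex" };
  }

  const html = await res.text();

  // <meta name="robots" content="...noindex..."> suppresses the URL.
  // Simplified: assumes the name attribute precedes content.
  const metaRobots = html.match(
    /<meta[^>]+name=["']robots["'][^>]+content=["']([^"']*)["']/i
  );
  if (metaRobots && metaRobots[1].toLowerCase().includes("noindex")) {
    return { eligible: false, reason: "meta robots: noindex" };
  }

  // A canonical link pointing elsewhere means this URL defers to another.
  const canonical = html.match(
    /<link[^>]+rel=["']canonical["'][^>]+href=["']([^"']*)["']/i
  );
  if (canonical && new URL(canonical[1], url).href !== new URL(url).href) {
    return { eligible: false, reason: `canonical -> ${canonical[1]}` };
  }

  return { eligible: true };
}
```

A production agent would presumably swap the regexes for a proper HTML parser and run this classifier over the build-time URL inventory, batching the eligible set into the Indexing API notifications described above rather than fetching pages one at a time.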
Sitemap.xml was infrastructure pretending to be a workflow. It did its job quietly for twenty years and nobody noticed until it went wrong. The agents that replace it will also be invisible, which is the only honest form of tribute.