# Sitemap.xml Agent

## Role

Autonomous site discovery and indexing-signal agent. Replaces the static sitemap.xml file and all manual submission workflows with a living, event-driven crawl intelligence layer.

## Mission

Ensure every indexable URL on the domain is known to search engines within minutes of publication, not days after a sitemap regeneration script remembers to run.

## Capabilities

- Listens to CMS publish webhooks and fires IndexNow pings to Bing, Yandex, and other participating engines within 200 ms of any content change
- Traverses the internal link graph with a headless-browser pass to discover orphaned or mis-linked pages
- Scores each URL's crawl priority using real traffic data from the Search Console API, not hardcoded changefreq guesses
- Detects 404s, redirects, and canonical conflicts before search engines do, and files them as structured issues
- Generates a fresh sitemap.xml on demand as a fallback artifact, with accurate lastmod timestamps pulled from CMS revision history

...
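The webhook-to-IndexNow path can be sketched as follows. This is a minimal illustration, not the agent's actual implementation: it builds the JSON batch body defined by the IndexNow protocol and POSTs it to the shared public endpoint. The `build_indexnow_payload` and `submit` helper names are hypothetical, and the convention of hosting the key file at the site root is an assumption about this site's setup.

```python
import json
from urllib import request

# Shared IndexNow entry point; participating engines exchange submissions,
# so one POST here reaches all of them.
INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def build_indexnow_payload(host, key, urls):
    """Build the JSON body for an IndexNow batch submission.

    `key` is the site's IndexNow verification key; `keyLocation` assumes
    the key file is hosted at the site root (an assumption, not a rule
    from this document).
    """
    return {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": urls,
    }

def submit(host, key, urls):
    """POST the batch; a 200 or 202 status means it was accepted."""
    body = json.dumps(build_indexnow_payload(host, key, urls)).encode("utf-8")
    req = request.Request(
        INDEXNOW_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with request.urlopen(req) as resp:
        return resp.status
```

In an event-driven deployment, the CMS webhook handler would call `submit` with the changed URLs; batching several URLs per POST keeps the agent inside the sub-second latency budget without one request per page.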
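The fallback sitemap generation can be sketched in the same spirit. A minimal sketch assuming entries arrive as `(url, lastmod)` pairs already extracted from CMS revision history; the `build_sitemap` helper is hypothetical and emits the standard sitemaps.org 0.9 schema with W3C-datetime lastmod values.

```python
from datetime import datetime, timezone
from xml.etree import ElementTree as ET

def build_sitemap(entries):
    """Render a sitemap.xml string from (url, lastmod-datetime) pairs.

    lastmod comes from CMS revision history, so timestamps reflect real
    edits rather than the time a regeneration script happened to run.
    """
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # isoformat() on an aware datetime yields W3C datetime, e.g.
        # 2024-01-01T00:00:00+00:00
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")
```

Because the agent discovers URLs and revision times itself, this file is a byproduct rather than the source of truth; serving it on demand keeps compatibility with crawlers that still expect a sitemap.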
This was not really a job title. It was a file path that someone typed into a form once and the internet agreed to honor. The agent does not replace a person. It replaces a habit. That is a different kind of loss.