Do you use the Web Crawler to automatically add pages from your website to your AI Agent’s knowledge? Then it’s important to know how to handle sitemap changes correctly, especially if your sitemap is updated regularly.
In this article, you’ll learn what to do when your sitemap is updated and how to prevent your AI Agent from using outdated information.
The Web Crawler Does Not Detect Changes in Your Sitemap
When you update your website’s sitemap, it’s important to know that the Web Crawler does not remember or compare previous versions. The tool treats each newly added sitemap as a completely new set of URLs.
What happens when you re-add an updated sitemap?
- All URLs currently listed in the sitemap will be added to the AI Agent again.
- URLs that are no longer in the sitemap will not be automatically removed from the AI Agent’s knowledge; you have to identify and delete those yourself.
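Because the crawler keeps no history, you can compare the old and new sitemaps yourself to see exactly which URLs need manual removal. Below is a minimal Python sketch of such a comparison. The file name old-sitemap.xml (a saved copy of the previous sitemap) and the example.com address are hypothetical stand-ins for illustration; they are not part of the Web Crawler itself.

```python
# Minimal sketch: diff an old and a new sitemap to find which URLs
# disappeared (and so must be removed from the Web Crawler by hand)
# and which URLs are new. Sitemap locations below are hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap namespace, in ElementTree's Clark notation.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(source: str) -> set[str]:
    """Return the set of <loc> URLs in a sitemap (local path or http URL)."""
    if source.startswith("http"):
        with urllib.request.urlopen(source) as resp:
            tree = ET.parse(resp)
    else:
        tree = ET.parse(source)
    return {loc.text.strip() for loc in tree.iter(f"{SITEMAP_NS}loc") if loc.text}

# Hypothetical inputs: a saved copy of the previous sitemap and the live one.
old_urls = sitemap_urls("old-sitemap.xml")
new_urls = sitemap_urls("https://www.example.com/sitemap.xml")

print("Remove these URLs from the Web Crawler manually:")
for url in sorted(old_urls - new_urls):
    print("  -", url)

print("New URLs that will be crawled:")
for url in sorted(new_urls - old_urls):
    print("  +", url)
```

Saving a copy of the sitemap each time you crawl is what makes this comparison possible the next time it changes.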
What should you do when your sitemap changes?
- First, manually remove from the Web Crawler all existing URLs you no longer want to use.
- Then add the new (updated) sitemap.
- Finally, crawl the desired pages again. If you want to check in advance exactly which pages that includes, see the sketch after these steps.
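Before you start the re-crawl, it can help to see exactly which pages the updated sitemap contains, especially when your site publishes a sitemap index (a sitemap whose entries point to other sitemaps, as defined by the sitemaps.org protocol). The self-contained sketch below expands such an index into a flat list of page URLs; the index address is again a hypothetical example.

```python
# Sketch: expand a sitemap index into the full list of page URLs so you
# can review what will be re-crawled before starting the crawl.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def expand(source: str) -> set[str]:
    """Return all page URLs, following <sitemapindex> nesting recursively."""
    with urllib.request.urlopen(source) as resp:
        tree = ET.parse(resp)
    locs = {loc.text.strip() for loc in tree.iter(f"{SITEMAP_NS}loc") if loc.text}
    if tree.getroot().tag == f"{SITEMAP_NS}sitemapindex":
        # Each <loc> is itself a sitemap; collect the page URLs inside each one.
        pages = set()
        for sub_sitemap in locs:
            pages |= expand(sub_sitemap)
        return pages
    return locs  # a plain <urlset> sitemap: these are the page URLs

# Hypothetical index URL for illustration.
for url in sorted(expand("https://www.example.com/sitemap_index.xml")):
    print(url)
```

Review the printed list before crawling, so you know exactly which pages the AI Agent will receive.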
Note: During this process, your AI Agent will temporarily have no knowledge from the Web Crawler. It’s best to perform these steps during a period of low traffic.
Want to learn more about setting up and using the Web Crawler in general? Read the article How to Use the Web Crawler.