Google’s John Mueller Says There is No Ideal Size for Sitemaps via @MattGSouthern
Google’s John Mueller recently stated there is no ideal sitemap size when it comes to optimizing a website’s crawl speed.
The size of sitemap files generally won’t affect crawling, Mueller said in a Reddit thread. The original poster, who mentioned having a 5 MB sitemap with 30,000 URLs, was asking whether large sitemaps can slow down crawling.
Here is Mueller’s response:
“The size & number of sitemap files generally won’t affect the crawling, unless your server is so bogged down that even fetching a handful of sitemap files would slow it down (in which case, the sitemap files won’t be the problem you need to focus on anyway).”
Mueller Recommends Multiple Sitemaps
When it comes to managing sitemaps, Mueller recommends separating them according to the various sections of your website. This will allow you to monitor each section individually in Search Console.
“I generally recommend splitting a sitemap file into logical parts of your site so that you can monitor those parts individually (eg, category pages vs detail pages vs whatever else you have).”
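A split like Mueller describes is typically expressed as a sitemap index file that points at one sitemap per section of the site. Here is an illustrative sketch — the domain and file names are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One sitemap file per logical section of the site -->
  <sitemap><loc>https://example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-blog.xml</loc></sitemap>
</sitemapindex>
```

Each of those files can then be filtered on individually in Search Console’s reports, which is the monitoring benefit Mueller refers to.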
This advice, Mueller notes in a follow-up comment, is strictly to help site owners keep better track of their data. Using multiple sitemaps does not have any impact on crawling or indexing.
“… you won’t see any effect in crawlers understanding your site better. It’s purely for you to be able to monitor it better. For example, if you filter the Search Console reports to your “category pages” sitemap file, then you can see clearer how well they perform, or if there are any problems with those pages. It doesn’t change how Google crawls & indexes them, it’s really just so that you can track them better on your side.”
It’s worth noting that the maximum size for a single sitemap file is 50,000 URLs (or 50 MB uncompressed). So if your site has more URLs than that, you’ll have to split your sitemap into multiple files.
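As a rough sketch of what splitting looks like in practice, the following Python chunks a URL list into sitemap files of at most 50,000 URLs each and generates a sitemap index that lists them. The `example.com` domain and file-naming scheme are assumptions for illustration only:

```python
# Sketch: split a URL list into sitemap files (max 50,000 URLs each,
# per the sitemaps.org protocol) plus a sitemap index listing them.
# The base URL and file names below are hypothetical.
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000

def build_sitemap(urls):
    """Return one <urlset> sitemap document as a string."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

def build_index(sitemap_urls):
    """Return a <sitemapindex> document pointing at each sitemap file."""
    entries = "\n".join(
        f"  <sitemap><loc>{escape(u)}</loc></sitemap>" for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>"
    )

def split_sitemaps(urls, base="https://example.com/sitemap"):
    """Chunk urls into sitemaps; return (index_doc, list_of_sitemap_docs)."""
    chunks = [
        urls[i : i + MAX_URLS_PER_SITEMAP]
        for i in range(0, len(urls), MAX_URLS_PER_SITEMAP)
    ]
    docs = [build_sitemap(chunk) for chunk in chunks]
    index = build_index(f"{base}-{n}.xml" for n in range(1, len(docs) + 1))
    return index, docs
```

For example, a site with 120,000 URLs would end up with three sitemap files plus one index file to submit to Search Console.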