Google’s Mueller on Crawl Rate for Big and Small Sites
Google’s John Mueller asked the SEO community why people are missing the URL submission tool. One person said that small sites are disadvantaged because bigger sites are crawled more frequently. John Mueller shared insights into how Google crawls sites.
Are Big Sites Crawled More Frequently?
It’s possible that popular and more frequently linked websites are crawled more often. Links are part of the crawling process because Googlebot discovers pages by following links from page to page.
So it’s not unreasonable to assume that popular sites are crawled more frequently than less popular sites.
Here is John Mueller’s original question:
“I’ve seen folks looking forward to the URL submission tool being back. I don’t have any new news, but I’d love to find out more about why you’re missing it.
Let me know which URLs you’re missing the tool for, and how you generally used it. Thanks!”
And the publisher answered:
“@JohnMu,
you know that crawlers don’t visit small website as frequent as big ones. So for any update on key pages and indexing faster we depend on the URL submission tool. By removing the access to the tool, you are favoring big websites and hitting hard on small ones.”
John Mueller responded that crawling is independent of the size of a website.
“Crawling is independent of website size. Some sites have a gazillion (useless) URLs and luckily we don’t crawl much from them. If you have an example from your site where you’re seeing issues, feel free to add it to the form.”
The person asking the question stated that some publishing platforms don’t automatically update their sitemaps.
John Mueller suggested that upgrading the platform so that it automatically updates the sitemap is the simpler solution.
“There are still sites that don’t use sitemaps? Seems like a much simpler fix than to manually submit every new or updated URL …
…Sounds like something to fix in that case :). Manual submissions are never scalable, make sure it works automatically.
….Making a sitemap file automatically seems like a minimal baseline for any serious website, imo.”
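Mueller’s point is that sitemap generation should be automatic rather than something a publisher does by hand. As a rough illustration only (not anything Google or Mueller prescribes), here is a minimal Python sketch of regenerating a sitemap file whenever content changes; the example.com URLs and the page list are hypothetical placeholders for whatever a real CMS or database would supply.

```python
# Minimal sketch: rebuild sitemap.xml on every publish or update,
# so new and changed URLs are discoverable without manual submission.
from datetime import date
from xml.etree.ElementTree import Element, SubElement, ElementTree

def build_sitemap(pages, output_path="sitemap.xml"):
    """Write a sitemaps.org-style XML file from (url, last_modified) pairs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url, last_modified in pages:
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = url
        SubElement(entry, "lastmod").text = last_modified.isoformat()
    ElementTree(urlset).write(output_path, encoding="utf-8", xml_declaration=True)

# Placeholder data; a real site would pull this list from its CMS or database.
pages = [
    ("https://www.example.com/", date(2020, 11, 9)),
    ("https://www.example.com/blog/new-post", date(2020, 11, 9)),
]
build_sitemap(pages)
```

Hooking a step like this into the publishing workflow is what makes the sitemap “work automatically,” as Mueller put it, instead of relying on manual URL submissions.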
Large Site and Popular Site
I think what was overlooked in the above exchange is what the publisher meant when they said a “big site” was crawled more often. John Mueller answered the question literally in terms of how many pages a site contained.
It’s not unreasonable to assume that what the publisher meant was that a more popular site has an advantage over a smaller site that does not have as many links.
Yet Mueller’s reference to larger (and sometimes more popular) sites containing useless URLs is a fair point. It implies that smaller sites are easier and more efficient to crawl, and indeed, if you are able to view crawl logs, Google does seem to visit smaller and less popular sites fairly frequently.
If days or weeks go by without Google discovering some of your pages, that’s something that could be helped by a sitemap. But it could also be indicative of deeper problems with regard to the quality of the content or the links.