Google May Reduce Refresh Crawls of Webpages to Make Crawling More Sustainable
Source: https://www.searchenginejournal.com
This topic is discussed by Google’s Search Relations team, made up of John Mueller, Martin Splitt, and Gary Illyes. They explain that web crawling can be made more sustainable by cutting down on refresh crawls. Googlebot crawling comes in two types: crawling to discover new content and crawling to refresh existing content. Google is considering scaling back the latter.
A website like the Wall Street Journal constantly updates its homepage with new content, so it warrants frequent refresh crawls.
However, WSJ is unlikely to update its About page nearly as often, so Google doesn’t need to keep refresh-crawling pages like that.
For more details on how Google plans to pull this off, listen to the full discussion in the podcast.