Google has confirmed three key factors that can boost Googlebot crawling. Here’s how you can leverage this information. During a discussion, Google’s Gary Illyes and Lizzi Sassman highlighted three elements that prompt more frequent Googlebot visits.
While they emphasized that constant crawling isn’t always necessary, they acknowledged that there are ways to encourage Googlebot to revisit your website.
How High-Quality Content Influences Crawling Frequency
One of the key points they discussed was the importance of website quality. Many people encounter the “discovered but not indexed” issue, often due to outdated SEO practices that are mistakenly believed to be effective. Having worked in SEO for 25 years, I’ve observed that industry best practices are frequently several years behind what Google is actually doing. It can be difficult to identify what’s wrong when someone is convinced they’re following all the right steps.
At the 4:42 mark, Gary Illyes explained that one factor that can lead to increased crawling is when Google’s algorithms detect signals of high quality. However, Google doesn’t explicitly define which signals of quality and helpfulness prompt it to crawl more often.
In the absence of clear guidelines, we can make educated guesses. For example, there are patents related to branded searches which suggest that branded search queries could be treated by Google as implied links. Contrary to popular belief, these “implied links” are not simply brand mentions; unlinked mentions are not what the patent covers.
Another relevant patent is the Navboost patent, which has been around since 2004. Some people mistakenly associate it with click-through rates (CTR), but if you read the original document, you’ll find that it actually refers to user interaction signals. Research from the early 2000s focused heavily on clicks, but it’s clear from the patents and studies that ranking is not as simple as “a user clicks a site in the SERPs and that site gets rewarded with a higher ranking.”
In general, I believe that signals indicating people find a site helpful can improve its ranking. This often means providing users with what they expect to see, even if the quality isn’t the best.
Site owners sometimes complain that Google is ranking low-quality content, and when I check, I can see what they mean—the sites are indeed subpar. However, the content is still meeting users’ expectations because many people can’t differentiate between what they expect to see and truly high-quality content.
The Froot Loops algorithm refers to Google’s reliance on user satisfaction signals to determine whether its search results are pleasing users. For example, there’s a popular recipe site (which I won’t name) that posts easy-to-make recipes that lack authenticity and rely on shortcuts like canned cream of mushroom soup. As someone with kitchen experience, I cringe at those recipes. But many people I know love the site because they just want a simple recipe and don’t know any better.
Boosted Publishing Frequency
Illyes and Sassman also mentioned that an increase in publishing frequency could prompt Googlebot to crawl a site more often. For example, if a site suddenly starts publishing a larger number of pages, this could trigger more frequent crawling. However, Illyes discussed this in the context of a hacked site that begins rapidly publishing new content, which would naturally lead to increased crawling by Googlebot.
Taking a broader view of that statement, it’s not the fact that the site was hacked that causes the increased crawling; it’s the surge in newly published content that triggers it.
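If you want to see whether that pattern holds on your own site, one rough way is to chart Googlebot hits per day from your server access logs and line them up against your publishing calendar. The sketch below is a minimal example, assuming a combined-format access log at a hypothetical path (access.log) and filtering by user-agent only; a production check would also verify the crawler’s IP via reverse DNS, as Google recommends for confirming genuine Googlebot traffic.

```python
import re
from collections import Counter
from datetime import datetime

# Hypothetical path to a combined-format (Apache/Nginx) access log; an assumption for this sketch.
LOG_PATH = "access.log"

# Example line: 66.249.66.1 - - [10/Oct/2024:13:55:36 +0000] "GET /page HTTP/1.1" 200 512 "-" "Googlebot/2.1 ..."
LINE_RE = re.compile(
    r'\[(?P<ts>[^\]]+)\] "(?:GET|HEAD) \S+[^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

googlebot_hits_per_day = Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        # User-agent filter only; verifying the IP by reverse DNS is the reliable check.
        if "Googlebot" not in match.group("ua"):
            continue
        day = datetime.strptime(match.group("ts"), "%d/%b/%Y:%H:%M:%S %z").date()
        googlebot_hits_per_day[day] += 1

# Print daily crawl counts so they can be compared with how many pages you published each day.
for day in sorted(googlebot_hits_per_day):
    print(day, googlebot_hits_per_day[day])
```

If crawl demand really does follow publishing activity, a burst of new pages should show up as a rise in daily Googlebot requests within days, which is easy to spot in this kind of per-day tally.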
Maintaining Consistent Content Quality
Gary Illyes mentioned that Google may reassess the overall quality of a site, which could lead to a decrease in crawl frequency. What he means by Google “rethinking the quality of the site” is that if some parts of the site fall below the original quality standard, it can affect the entire site’s reputation. Over time, accumulating low-quality content can overshadow the good content and drag down the overall site quality.
When people report a “content cannibalization” issue, it often turns out to be a problem with low-quality content elsewhere on the site.
Lizzi Sassman inquired around the 6-minute mark about the impact of having static content that neither improves nor deteriorates but remains unchanged. Gary didn’t provide a definitive answer but suggested that Googlebot might slow down its crawling if there are no updates. However, he admitted he wasn’t certain.
An important but unstated point related to content quality is that if the topic evolves and the content remains static, it may become less relevant and lose rankings. Regular content audits are advisable to ensure that your content stays current and continues to meet user expectations and industry standards.
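As a starting point for that kind of audit, you could flag pages whose sitemap lastmod dates are older than some threshold and review those first. The snippet below is a rough sketch, assuming a hypothetical sitemap URL and an arbitrary twelve-month cutoff; neither value comes from Google, and a real audit would also weigh rankings, traffic, and how quickly the topic changes.

```python
from datetime import datetime, timedelta, timezone
from urllib.request import urlopen
from xml.etree import ElementTree

# Hypothetical sitemap URL and staleness cutoff (illustrative assumptions, not Google guidance).
SITEMAP_URL = "https://www.example.com/sitemap.xml"
CUTOFF = datetime.now(timezone.utc) - timedelta(days=365)

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_URL) as response:
    tree = ElementTree.parse(response)

stale_pages = []
for url in tree.getroot().findall("sm:url", NS):
    loc = url.findtext("sm:loc", namespaces=NS)
    lastmod = url.findtext("sm:lastmod", namespaces=NS)
    if not loc or not lastmod:
        continue
    # lastmod may be a bare date ("2024-01-15") or a full W3C datetime; handle both.
    try:
        modified = datetime.fromisoformat(lastmod.replace("Z", "+00:00"))
    except ValueError:
        continue
    if modified.tzinfo is None:
        modified = modified.replace(tzinfo=timezone.utc)
    if modified < CUTOFF:
        stale_pages.append((loc, lastmod))

# Pages untouched for over a year are candidates for review, oldest first.
for loc, lastmod in sorted(stale_pages, key=lambda item: item[1]):
    print(lastmod, loc)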