" Mueller responded the following: "Usually we do not talk about how many algorithms we use.
We publicly state that we have 200 factors when it comes to scanning, indexing and ranking.
Most of the other reports that come to us is just information that we collect and can use to improve our algorithms in the future.At the same time, he noted that small reports about violations of one page scale are less prioritized for Google.This information was reported by Jennifer Slagg in the The SEMPost blog.Since Google Penguin was modified into real-time update and started ignoring spam links instead of imposing sanctions on websites, this has led to a decrease of the value of auditing external links.In general, the difficult part is that Googlebot is not a browser, so it does not get the same speed effects that are observed within a browser when implementing HTTP / 2.
We can cache data and make requests in a different way than a regular browser.
These companies have different opinions on the reason why they reject links.
I don't think that helding too many audits makes sense, because, as you noted, we successfully ignore the links, and if we see that the links are of an organic nature, it is highly unlikely that we will apply manual sanctions to a website.
This question was put to the John Mueller, the company’s employee during the last video conference with webmasters.
The question was: "When you mention Google's quality algorithm, how many algorithms do you use?
Oct 08/2017 During the last video conference with webmasters Google rep called John Mueller said that Googlebot still refrains to scan HTTP.
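For site owners wondering whether HTTP/2 is even in play for their server, a quick way to see which protocol version a server negotiates is curl's `--http2` flag together with the `%{http_version}` write-out variable. This is a generic sketch, not anything Google documents; `https://example.com` is a stand-in for your own URL, and it requires a curl build with HTTP/2 (nghttp2) support.

```shell
# Ask for HTTP/2 and print the HTTP version the server actually negotiated.
# -s  silent, -I  HEAD request only, -o /dev/null  discard headers.
# Prints "2" when HTTP/2 is negotiated, "1.1" otherwise.
curl -sI --http2 -o /dev/null -w '%{http_version}\n' https://example.com
```

Note that, per Mueller's comment above, a server supporting HTTP/2 mainly benefits real browsers; it does not change how Googlebot fetches pages.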