Fall of the Machine Part II: Google's "Disavow Links" Tool

by Aaron Bradley on October 16, 2012

In May of 2009 I wrote a post on Google's paid link reporting mechanism, opining that the introduction of a manual link review process undermined Google's traditional and successful reliance on machine-based algorithms to rank web pages.

Fast forward to 16 October 2012, when suddenly the airwaves are buzzing with Matt Cutts' announcement of Google's "disavow links tool" at PubCon.

To a certain extent this is a déjà vu moment. Like the paid links reporting tool it is at once an admission that Google's mighty machines aren't up to the task of assessing the native value of hyperlinks without some degree of human intervention. And as with paid link reporting, it also "formally makes the SEO industry necessary" as an auxiliary force to police Google's webmaster guidelines.

However, the possible necessity of link disavowal seems more nefarious to me in two respects.

First, while paid link reporting ultimately put the onus on those abusing Google's reliance on PageRank to clean up their acts (at least those identified through the paid link reporting tool), link disavowal puts webmasters on notice that they're now on the hook for the behavior of others over which they have no control. To use a legal analogy, the presumption of innocence is no more.

And a mechanism for reporting "bad links" does not place webmasters in control, except insofar as it's an explicit admission that competitors can influence your search visibility (simultaneously placing a nail in the coffin of the "is there such a thing as 'negative SEO'?" debate – case closed). Yes, a webmaster can now combat a competitor's malfeasance – if they opt into Google Webmaster Tools so they may receive "unnatural link" notifications, and if they have the time, knowledge and motivation to disavow linkspam.
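For those who do end up using it, the tool works by uploading a plain text file listing the links to be disavowed. Per Google's announcement, the file accepts one URL or domain per line, `#` lines as comments, and a `domain:` prefix to disavow every link from an entire domain. The domains below are placeholders, not real offenders:

```
# Links from this domain were part of a paid link network;
# a removal request to the site owner on 10/1/2012 went unanswered.
domain:spammy-directory.example.com

# Individual pages can also be disavowed one URL at a time.
http://linkfarm.example.org/widgets-page.html
http://linkfarm.example.org/more-widgets.html
```

The `domain:` form is the blunt instrument; listing individual URLs lets a webmaster keep any legitimate links from the same site in play.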

(Google's announcement sports a marvelously ambiguous adverb in its second sentence. "If you haven't gotten this ['unnatural links'] notification, this tool generally isn't something you need to worry about." Emphasis mine.)

In short, where one could previously garner something like a competitive advantage by optionally ratting out nefarious competitors, now one is seemingly required to protect one's reputation for purity by formally distancing oneself from slanderers.

Which leads me to my second point – namely that this sort of esoteric search engine optimization activity favors sites with bigger budgets and well-informed marketers. A small business with a site that offers "great content" to visitors ("great content" having become almost a sacred mantra in Matt Cutts' missives) may now, apparently, fall prey to the evil intentions of unscrupulous competitors unless they employ professional help to keep the vampires at bay.

To a large extent much of the concern over "negative SEO" is probably a tempest in a teapot, and really most webmasters shouldn't worry their pretty little heads over "unnatural links." As the announcement post says, the "[v]ast, vast majority of sites do not need to use this tool," whose "primary purpose is to help clean up if you've hired a bad SEO or made mistakes in your own link-building," and one "[t]ypically" does not have to clean up links that a webmaster didn't create. Still, the profusion of qualifiers employed (emphasis, again, mine) leaves Google a lot of wiggle room to disavow their own advice.

The fact of the matter is that bad links, like good content, ultimately have a human behind the wheel. And one thing I've always considered a primary reason for Google's success and strategic brilliance is its removal of humans from the process of ranking websites (keep in mind, for example, that the primary purpose of Google's "quality raters" is not to circumvent the algorithm, but to help hone it). There are probably in the neighborhood of 40 to 50 billion pages in Google's index: any significant human intercessory process to determine the relative value of those pages obviously doesn't scale.

So – to me at least – further crowd-sourcing of Google quality guidelines enforcement further undermines the foundation of their algorithmic success. All of which may simply be to say that the age of the link-as-an-impartial-vote may soon come to an end (link fraud, unlike voter fraud, is an actual problem). And what rises from the ashes may once again allow Google to return to its mechanical roots, at least until pesky humans again upset the applecart.

Ready to disavow your social connections, y'all?
