Google algorithm now punishes content pirates
Google has adjusted its search algorithm to penalise sites that publish copyrighted material without permission, with the changes taking effect today.
In an Inside Search blog post published on Friday, Google’s senior vice president of engineering Amit Singhal revealed that a new signal would be incorporated into the algorithm that ranks sites in search results. From today, domains or URLs with a high number of copyright removal notices will be demoted in rankings. Singhal said the changes “should help users find legitimate, quality sources of content more easily”.
Copyright removal notices can be sent to Google by anyone; over 4.3 million URLs were processed in the last 30 days alone. Unlike the manual penalties given to sites for black hat SEO practices, the ‘pirate penalty’ is being described by Google as an algorithm signal like Panda or Penguin, applied across all search results. Singhal suggested this is because “only courts can decide if a copyright has been infringed; Google cannot determine whether a particular webpage does or does not violate copyright law”.
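Google has not published how the signal is calculated, but the mechanism described above — demoting domains in proportion to the volume of valid removal notices they attract — can be sketched roughly as follows. Every name, threshold, and formula here is an illustrative assumption, not Google’s actual implementation:

```python
def pirate_demotion(notice_count: int, indexed_urls: int) -> float:
    """Hypothetical demotion multiplier for a domain's ranking score.

    notice_count: valid copyright removal notices received for the domain
    indexed_urls: total URLs indexed for the domain

    Returns a multiplier in (0, 1]; lower values mean a stronger demotion.
    All thresholds below are invented for illustration.
    """
    if indexed_urls == 0:
        return 1.0
    notice_ratio = notice_count / indexed_urls
    # Large sites with a handful of notices are left untouched;
    # heavily-noticed domains are progressively pushed down the rankings.
    if notice_ratio < 0.01:
        return 1.0
    return max(0.1, 1.0 - notice_ratio)

# A large site with few notices keeps its full ranking score,
# while a domain where 40% of URLs have drawn notices is demoted.
print(pirate_demotion(50, 100_000))
print(pirate_demotion(4_000, 10_000))
```

The key property the sketch captures is that the penalty is a continuous, algorithmic signal applied site-wide — not a manual, binary removal — which matches Singhal’s point that Google demotes rather than delists, since only courts can rule on infringement.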
The YouTube loophole
Google has often been criticised by the film and music industries for its lax stance on copyright infringement. In January 2012 the company was called a ‘piracy leader’ by Rupert Murdoch because it profits from adverts shown around pirated content on YouTube. However, YouTube itself will most likely not be affected by the pirate penalty – the site’s internal reporting and removal process will still take place, and its rankings will remain untarnished. Google confirmed that this is the case for any “popular user-generated content sites”.
Richard Frost, managing editor at theEword, commented: “It looks like Google is letting its algorithm do the hard work, which could be a wise move; this way, the company can’t be accused of censorship, and is keeping the music and film industries happy without taking the law into its own hands. However, the recent Google Transparency Report showed the company receives an astonishing number of requests, so we may see a large number of sites suffering.”