Mozilla hit with Google manual spam penalty
Non-profit software organisation Mozilla has been hit with a manual web spam penalty from Google for user-generated spam on its site.
Web production manager Christopher More took to Google’s Webmaster forums to share the message received:
“Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles. As a result, Google has applied a manual spam action to your site.”
More said he was now unable to find any such content. However, Google spokesperson John Mueller responded by saying the pages in question may have been removed from SERPs. Furthermore, he explained the penalty in question is “granular” – it may be restricted to the spammy pages or subdomains, leaving the main Mozilla site unaffected.
Mueller explained that it is Mozilla’s spammy, unmoderated blog comments that “look particularly bad” – a warning sign for any webmaster hosting user-generated content. He also provided some tips for those facing a similar penalty:
- Allow community members to flag spam
- Use rel=nofollow across all blog comments
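The second tip can be automated in a comment-rendering pipeline. Below is a minimal sketch (not Mozilla's actual setup) in Python, assuming comments are stored as HTML strings; a production site should run comments through a proper HTML sanitiser, but this shows the transformation Google recommends:

```python
import re

def nofollow_links(comment_html: str) -> str:
    """Add rel="nofollow" to every <a> tag in a comment that lacks it."""
    def add_rel(match: re.Match) -> str:
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # already carries a rel attribute; leave it alone
        # Drop the closing ">" and append the rel attribute before it
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", add_rel, comment_html)

print(nofollow_links('Check <a href="http://spam.example">this</a> out'))
# -> Check <a href="http://spam.example" rel="nofollow">this</a> out
```

With rel="nofollow" in place, links dropped into comments pass no PageRank, which removes the incentive for comment spam in the first place.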
Matt Cutts later joined the discussion to explain that the penalised URL contained an astonishing 12 megabytes of spam from 21,169 different comments.
The webspam crackdown
This is not the first time a high profile organisation has been subject to a Google spam penalty. The BBC asked Google for help in March after receiving an unnatural link warning with no clue as to where the link might be. Again, Google responded saying it was a granular penalty against a single offending article.
Other sites have not been so lucky. A crackdown on paid links in February saw Interflora disappear from rankings for 11 days, losing countless customers in the process.
Adrian Mursec, senior developer at theEword, commented: “As Google becomes more transparent about its spam penalties and discusses them in the public arena, webmasters have the chance to learn from the mistakes of others. It’s clear that blog comment spam won’t be tolerated, so it’s vital that webmasters take action against it; of course, this will benefit the user experience as well.”