
Matt Cutts drawing
Google rules the news. Eric Schmidt stepped down and Larry Page took the future into his hands, then the PageRank update was rolled out, and now this: Google has announced a new round in the fight against webspam. As Matt Cutts says on the official Google blog, they have relaunched the "classifier". With Caffeine behind the scenes, Google can offer more and fresher pages than ever. But of course there will be plenty of spammers exploiting the new opportunities to infiltrate the SERPs. Therefore Google is launching a new "anti-spam algorithm".
Matt says:
As we’ve increased both our size and freshness in recent months, we’ve naturally indexed a lot of good content and some spam as well. To respond to that challenge, we recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments. We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites.
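Google doesn't publish how the new document-level classifier works, but the basic idea it hints at (scoring pages for repeated spammy phrases) can be sketched in a few lines. Everything below is a toy illustration: the phrase list, the threshold, and the function names are invented for this example, not Google's actual algorithm.

```python
# Toy sketch only: flag pages where a few "spammy" phrases are repeated
# far more often than natural writing would allow. The phrase list and
# threshold are invented for illustration; Google's classifier is not public.

from collections import Counter
import re

SPAMMY_PHRASES = {"buy cheap", "click here", "free download"}  # hypothetical list
REPEAT_THRESHOLD = 0.02  # arbitrary: flag if >2% of bigrams are spammy phrases

def spam_score(text: str) -> float:
    """Fraction of the page's word bigrams that match a known spammy phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    bigrams = [" ".join(pair) for pair in zip(words, words[1:])]
    if not bigrams:
        return 0.0
    hits = sum(count for phrase, count in Counter(bigrams).items()
               if phrase in SPAMMY_PHRASES)
    return hits / len(bigrams)

def looks_spammy(text: str) -> bool:
    return spam_score(text) > REPEAT_THRESHOLD

# Example: a junky, self-promoting blog comment trips the heuristic.
comment = "click here to buy cheap pills, click here, click here now"
print(looks_spammy(comment))  # True for this repetitive sample
```

A real ranking system would of course combine many such signals per document rather than relying on one crude phrase counter, but the example shows why repetitive, automated comment spam is easy to catch at the document level.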

Google Pacman
In recent weeks Google has taken a lot of criticism regarding the quality of its results. Clearly Matt Cutts, as the leader of the webspam taskforce, is angry. He and his team will now fight back. It is interesting to hear that one (or the main?) target is "content farms". I assume he means web directories and news aggregator sites, perhaps "price comparison sites", and … we will see. In summary: "low-quality sites":
As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception. However, we can and should do better.
But who decides what counts as a quality site? I see a lot of trouble ahead in the SERPs, and a heated discussion about what Google considers "quality". Are "affiliate sites" content farms? I think these affiliate sites in particular are the main target. That's why Google emphasizes that it does not exempt sites running AdWords:
One misconception that we’ve seen in the last few weeks is the idea that Google doesn’t take as strong action on spammy content in our index if those sites are serving Google ads. To be crystal clear:
- Google absolutely takes action on sites that violate our quality guidelines regardless of whether they have ads powered by Google;
- Displaying Google ads does not help a site’s rankings in Google; and
- Buying Google ads does not increase a site’s rankings in Google’s search results.
These principles have always applied, but it’s important to affirm they still hold true.
Clearly the discussion will start when Google penalizes online marketers and affiliates.
OK, let's get ready to rumble ;-) (No, I have no spammy sites, but I know some people who are sweating a lot …)
Full article by Matt Cutts here: "Official Google Blog: Google search and search engine spam"
Category: SEO-stuff | Author: Martin Missfeldt