Who hates content farms and rubbish blogs filled to the brim with advertisements? Yes, most of us do! Back in February, Google released the Panda update, a major change to its search algorithm in the U.S. designed to deal with crappy, spammy sites. The update seems to be a success and has now been rolled out for all users searching in English.
Google Fellow Amit Singhal had this to say:
“Today we’ve rolled out this improvement globally to all English-language Google users, and we’ve also incorporated new user feedback signals to help people find better search results. In some high-confidence situations, we are beginning to incorporate data about the sites that users block into our algorithms. In addition, this change also goes deeper into the “long tail” of low-quality websites to return higher-quality results where the algorithm might not have been able to make an assessment before. The impact of these new signals is smaller in scope than the original change: about 2% of U.S. queries are affected by a reasonable amount, compared with almost 12% of U.S. queries for the original change.”
Panda Starts Chewing Bamboo Worldwide
The incorporation of searcher blocking data is an important but secondary factor in helping Google decide which websites are good quality and which are bad. Google has already made a decision about a site, and it may use the fact that people are blocking that site as confirmation that it was right.
The Panda update aims to go deeper and root out small, low-quality websites, like those that sell steroids to bodybuilders. However, some small websites that are high in quality have also been affected, which is rather unfair. This is where the benign-dictator argument comes in: we can only hope and pray that Google really is acting in the best interests of us moral Web users. Singhal reportedly states:
“We’re focused on showing users the highest quality, most relevant pages on the web. We’re cautious not to roll out changes until we’re confident that they improve the user experience, while at the same time helping the broader web ecosystem. We incorporate new signals into our algorithm only after extensive testing, once we’ve concluded that they improve quality for our users.”
I like the expression “helping the broader web ecosystem.” I think Google is working for the benefit of responsible Web users and doing a good job of keeping the Web clean. For those who feel hard done by, there is little they can do to get their website off Google’s bad list.
Web and rank has compiled a short list of things a website must do to be considered valid by Google and, hopefully, climb the rankings. The list is drawn from advice on the Google Webmaster Help pages and shows that Google does, in fact, respect quality.
- The most important factor is originality and authority of the content. Is it genuine original content or is it just a mess of information gathered from other websites?
- Is the website easy to navigate?
- Make sure that each page has a clear topic.
- If you place ads on your website, it is vital that they don’t obstruct visitors from getting at the actual content.
- The website should be geared towards giving visitors what they want, not towards making the owner money. This applies less to online retailing than to websites that make money from ad revenue.
- Google made it clear that a website with only a few quality pages and a bunch of low-quality pages can still suffer: enough crap content can drag down the whole site.
It is kind of unfair that Google decides how the Web should be run, but they are doing a pretty good job. The rapid rise of low-quality blogs that over time effectively become link farms is super annoying, and it makes me happy to think that the Panda will work to block these guys.