Thursday, July 25, 2013

Panda is the part of the Google algorithm that targets low quality websites. However, there has been plenty of argument over what Google actually treats as low quality, which has left many unhappy webmasters believing their websites were incorrectly targeted. Google, which keeps its algorithm secret, does not reveal much about what it considers good content.

Due to this, most webmasters are unsure how to rank well in the post-Panda world. However, by looking at the features of websites that benefited from the update, one can derive some general tips for staying in Google's good books.


[Image: Matt Cutts with a fluffy panda]

1. Make sure your website template is spacious.

This is part of both the benchmark criteria and the user feedback criteria. Judging by the sites now ranking well in the search results, users like spacious layouts with plenty of white space.


This is intuitive: busy, crowded websites are hard on the eye, while spacious websites give the eye room to rest. (The same applies offline. Have you noticed that expensive supermarkets like Waitrose have wide aisles, gleaming white floors and a feeling of space, while cheap supermarkets like Aldi have very narrow aisles? The thinner margins on cheap products force them to cram more stock into each square foot to turn a profit. Shoppers have been trained to associate classy shops with space and cheap shops with clutter, and they carry that expectation online when they browse websites.)

2. Don't crowd too many adverts onto your page

It's long been known that users are put off by too many ads (one of the reasons Google became so popular in the first place was its ad-free homepage, in contrast to AOL and Yahoo). However, many webmasters, seeking to maximize revenue, will cram as many ads above the fold as they can in order to get clicks.

[Image: a page crammed with ads. Doesn't seem too impressive, does it?]
Google has clearly decided to side with its searchers, and content farms, which typically ran eight or nine ad units per page, were hurt badly in the Panda update.

3. Improve your site's loading speed

Users hate websites that take forever to load, and Google has been warning for some time that it is beginning to take page loading speed into account when ranking websites. To compound things, if users habitually hit the back button because your site loads slowly, you are going to get a bad score from the machine-learning part of Panda.

[Image: fast websites rank higher. In the online world, it isn't good to be the tortoise.]

As it happens, excessive ads also make your page load slower, because the browser has to call the advertiser's server for every single ad unit. So improving speed can be as simple as removing an ad block.
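A quick way to audit this is to count how many external resources a page pulls in, since each one is an extra request. Here is a minimal sketch using Python's standard library (the sample page and its markup are made up for illustration):

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Counts tags that trigger extra HTTP requests on page load."""
    def __init__(self):
        super().__init__()
        self.counts = {"script": 0, "img": 0, "iframe": 0, "link": 0}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and "src" in attrs:
            self.counts["script"] += 1
        elif tag in ("img", "iframe") and "src" in attrs:
            self.counts[tag] += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.counts["link"] += 1

# Hypothetical page with a stylesheet, two ad scripts, an image and an ad iframe.
page = """
<html><head>
  <link rel="stylesheet" href="style.css">
  <script src="ads1.js"></script><script src="ads2.js"></script>
</head><body>
  <img src="photo.jpg">
  <iframe src="https://ads.example.com/unit1"></iframe>
</body></html>
"""
counter = ResourceCounter()
counter.feed(page)
# Each counted resource is one more round trip before the page finishes loading.
print(counter.counts, "total:", sum(counter.counts.values()))
```

Running something like this over your templates makes it obvious when a single ad block is responsible for several extra requests.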

When placing images on your page, make sure they are in the right format: JPEG usually produces much smaller files than PNG for photographs, while PNG is better suited to logos and graphics with few colors. While videos enhance the user experience, they also slow the loading of your page, so make sure that you have no more than one video per page.

You can also run your pages through Google's PageSpeed Insights for tips specific to your website.

Finally, ask yourself whether you really need to have pop-ups or funky bars on your site - they slow the loading of the page and irritate visitors.

4. Be Social

Google has clearly stated that it uses a webpage's social presence to determine its importance, and that +1's might be used as a signal for search rankings in the future. So having an active community on Facebook, Twitter and Google+ becomes an absolute necessity.


5. Avoid thin content

There is some evidence that the Panda update penalised websites that had a lot of "thin" content - pages that did not really provide any useful information to visitors from the search engines.

Go through your website page by page and improve your articles. It's not just a question of adding more words to fluff a page out - you need to add useful information too.


6. Avoid duplicate content

The problem of duplicate content is a tough one, as content theft is exploding across the web. A lot of scrapers will use your RSS feed to steal your articles in full, automatically. You can stop this by simply setting your feed to "short" or "summary", so that even if they draw on your feed, all they get is the first paragraph.
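On a self-hosted platform you can achieve the same effect by emitting only the first paragraph of each post into the feed. A rough sketch of that idea (the regex approach assumes simple, well-formed post HTML; real feed generators are more robust):

```python
import re

def feed_summary(article_html, fallback_chars=200):
    """Return only the first paragraph of an article for use in an RSS feed."""
    match = re.search(r"<p>(.*?)</p>", article_html, re.S)
    if match:
        return match.group(1).strip()
    # No <p> tags found: fall back to a fixed-length excerpt.
    return article_html[:fallback_chars]

# Hypothetical post body: scrapers pulling the feed only ever see paragraph one.
article = "<p>Panda targets low quality sites.</p><p>Here is the rest of the post...</p>"
print(feed_summary(article))
```

The scraper still gets something, but never the full article, so your original page remains the canonical source of the complete text.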

7. Avoid duplicate meta descriptions
[Image: duplicate content is bad for SEO]

Often a noob will hard-code a meta description into their main template, which means the same description is served no matter which post on the site gets loaded. Prior to Panda, Google was forgiving and usually came up with its own descriptions instead. Post-Panda, however, Google has concluded that duplicate meta descriptions mean that someone is trying to game its search engine.

Therefore, remove hard-coded descriptions from your template, and then install one of the free plugins that let you add them to individual pages. This will help you create a unique meta description for every single post and page on your site.
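To check whether your site has this problem, you can extract each page's meta description and group pages that share one. A small sketch using only Python's standard library (the page HTML below is invented for illustration; in practice you would feed in your real pages):

```python
from collections import defaultdict
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Captures the content of <meta name="description" content="...">."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content")

def find_duplicate_descriptions(pages):
    """pages: dict of url -> html. Returns description -> urls sharing it."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = MetaDescriptionParser()
        parser.feed(html)
        if parser.description:
            seen[parser.description].append(url)
    return {desc: urls for desc, urls in seen.items() if len(urls) > 1}

# Hypothetical site: two posts still carry the hard-coded template description.
pages = {
    "/post-1": '<head><meta name="description" content="My great blog"></head>',
    "/post-2": '<head><meta name="description" content="My great blog"></head>',
    "/post-3": '<head><meta name="description" content="A post about pandas"></head>',
}
print(find_duplicate_descriptions(pages))
```

Any description that comes back with more than one URL is a page you still need to rewrite by hand.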

8. Take care not to have duplicate tag pages

Many webmasters use tag pages as a means to add some internal links to their blog posts. However, if you have too many orphan tag pages (tag pages featuring just one post), you can end up with a lot of tag pages that are duplicates, as they essentially feature the same post.

Take some time to go through your site and make sure that each tag page is unique and features a different collection of posts, so that there is no duplication on your site.
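The audit above is easy to automate if you can dump your tag index. A sketch of the idea (the tag names and post slugs are hypothetical):

```python
# Hypothetical tag index: tag -> posts carrying that tag.
tag_index = {
    "seo": ["panda-recovery", "link-building", "site-speed"],
    "pandas": ["panda-recovery"],          # orphan: features just one post
    "google-panda": ["panda-recovery"],    # orphan featuring the SAME post
}

# Orphan tag pages feature a single post each.
orphans = [tag for tag, posts in tag_index.items() if len(posts) == 1]

# Orphans listing the same lone post are duplicates of each other.
duplicate_groups = {}
for tag in orphans:
    duplicate_groups.setdefault(tag_index[tag][0], []).append(tag)

print({post: tags for post, tags in duplicate_groups.items() if len(tags) > 1})
```

Each group it prints is a set of tag pages with essentially identical content; merge the tags or noindex the extras.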

9. Link out to other trusted websites.

Old-school quality articles from circa 2000 always linked out to their references. It was not quite full-blown academic writing with a reference list at the bottom, but hyperlinks dotted through the text so that users could click through to find out more about any topic being referred to.

10. Avoid having too many affiliate links on a page.

Remember that with Panda, Google had human raters look at sites; once the sites were classed as good, bad or indifferent, Google analyzed their metrics to profile each class and fed those metrics into the algorithm. Pages stuffed with affiliate links tended to land in the "bad" pile, so a high density of affiliate links has likely become one of the negative signals.

11. Poison Words

Have you ever wondered how your email provider manages to trap spam emails and put them in the spam folder? Some of this trapping is down to blacklisted sender addresses, but the other part involves scanning the text of the email and using Bayesian spam filtering to look for "poison words". Typically these are bigrams, or pairs of words, that appear frequently on spam pages (usually because the spammer has hired a lot of cheap labor to produce the pages, and their writers regurgitate the writing clichés of the niche they are in).

Search engines also use these bigrams to detect spam. From the Panda point of view, all Google needed to do was scan the sites its quality raters had identified as "bad", look for common bigrams within the text, and then program the algorithm to hunt for them. There is probably a threshold for the occurrence of these words: if they appear too frequently, Google knows it is looking at a spam or doorway affiliate site.
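The mechanism described above can be sketched in a few lines. This is a toy illustration of bigram-based scoring, not Google's actual list or threshold; the poison bigrams and sample texts are invented:

```python
import re

def bigrams(text):
    """Lowercase a text and return its consecutive word pairs."""
    words = re.findall(r"[a-z']+", text.lower())
    return list(zip(words, words[1:]))

# Hypothetical blacklist, as might be learned from rater-flagged pages.
POISON_BIGRAMS = {("buy", "now"), ("click", "here"), ("limited", "offer")}

def poison_score(text):
    """Fraction of a page's bigrams that appear on the poison list."""
    grams = bigrams(text)
    if not grams:
        return 0.0
    hits = sum(1 for gram in grams if gram in POISON_BIGRAMS)
    return hits / len(grams)

spammy = "Buy now! Click here for a limited offer. Buy now before it ends."
clean = "Pandas spend most of the day eating bamboo in the mountain forests."
print(round(poison_score(spammy), 2), round(poison_score(clean), 2))
```

A page whose score crosses some threshold would be flagged; a real filter would weight each bigram by how much more often it appears in spam than in good pages, in proper Bayesian fashion.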

Conclusion

The recommendations listed above entail a lot of hard work. However, the Panda update is a signal from Google to improve and upgrade your website, and those who do this most thoroughly are likely to rank better in the search results.

