There are quite a few myths plaguing the SEO community right now. But no topics have longer trails of myths and rumors following them than Google’s Panda and Penguin updates.
Are you sure you’re not a firm believer in one of these myths? Read on to find out. Let’s start with the most widespread one.
1. Panda is about duplicate content
Flickr image by Suzies Farm
Google’s Panda update was originally called the Farmer update, and many believed it targeted so-called content farms. Even though it did target content farms among other things, Panda has never been aimed at duplicate content only.
Perhaps you’ve read accounts of SEOs and webmasters who say they have only “original and quality content” on their site, and yet their website sank in the SERPs. So, why did this happen?
Little by little, the SEO community began to realize that content had to be not only original and grammatically correct, but also engaging. And what is “engaging content”? It’s the kind of content that makes visitors spend more time on a site, click through to additional pages, and so on. That’s how yet another popular Panda myth was born.
2. Panda penalizes sites with high bounce rates
Flickr image by Keith Allison
At some point, SEOs began to suspect that Google might be using Google Analytics data to determine whether the content on a particular site was receiving enough engagement. That is, some people began to claim that if a site has little traffic, high bounce rates and short visit times, this triggers the Panda filter.
Some even recommended disabling Google Analytics altogether in order to protect a site from Google Panda. However, Matt Cutts did clarify the matter in a short video he made for Google Webmaster Central:
As you see, Google does not use Google Analytics data in its ranking algorithm in any way.
Hence, Google’s Panda is not just about “penalizing” sites with duplicate content or punishing sites with poor traffic statistics. It’s more about making sure that websites that are interesting, useful and engaging rank at the top of the search results.
The Panda update was named after a Google engineer who developed a machine-learning algorithm that allows Google to imitate the behavior of human search quality raters in order to assess a particular site’s quality.
3. Penguin is about keyword stuffing
Flickr image by shehani
Even though Google cited keyword stuffing as one of the unethical SEO practices targeted by the Penguin update, I think a lot of people misunderstand the term. If you look at the keyword stuffing example provided in Google’s original post, you’ll see that real keyword stuffing is hard to confuse with anything else.
At the same time, I hear a lot of people asking, “If I use my keyword 5 times on the page, would that be keyword stuffing?” Some people are really getting paranoid, while I’d argue it’s next to impossible to stuff your page with keywords when you don’t actually intend to.
So, I’d recommend that anyone optimizing content for keywords write their site’s copy first and then make sure it contains the keywords. Or you could use an on-page SEO tool to get recommendations on where your keywords should go.
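The “write first, check for keywords afterwards” step above is easy to automate. Here’s a minimal sketch in Python that counts how often each target keyword appears in a finished piece of copy; the sample copy and keywords are hypothetical examples, not values recommended by Google:

```python
import re

def keyword_counts(copy, keywords):
    """Count case-insensitive occurrences of each keyword in the copy."""
    text = copy.lower()
    return {kw: len(re.findall(re.escape(kw.lower()), text)) for kw in keywords}

# Hypothetical copy and target keywords for illustration only.
copy = ("Our bakery in Austin makes sourdough bread daily. "
        "Stop by for fresh sourdough, pastries and coffee.")
counts = keyword_counts(copy, ["sourdough", "bakery"])
print(counts)  # → {'sourdough': 2, 'bakery': 1}
```

A zero count tells you a keyword never made it into the copy; an implausibly high count is a hint to reread the page as a human would.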
4. Penguin penalizes sites with directory links
Flickr image by dcysurfer/Dave Young
Some folks say that, post-Penguin, any site that has directory links is doomed. This is yet another exaggeration. Online directories are essentially catalogues of website URLs, which doesn’t make them malicious by default.
There are really useful, reputable niche and local directories on the Web that can help you drive direct traffic to your site and/or generate leads. So, being listed in those directories is not a bad thing. Even though this may not have much effect on rankings, it would not lead to a penalty either.
Besides, I believe it’s better to list your site in a few paid, highly relevant directories than to submit it to a larger number of less relevant ones.
5. If you do nothing to manipulate the SERPs, you are safe from Panda/Penguin
Flickr image by San Diego Shooter
Sometimes, I hear people complain on SEO forums that they are being outranked in the SERPs by some scraper site, or that they received an unnatural-links warning even though they’d never built a single link. Unfortunately, things like this do happen. There are now ways to prevent such situations, and it’s important to know how to handle them when they occur.
For example, in order to show Google that you’re the original creator of a content piece, claim your authorship over it.
Or, if you see a number of spammy-looking backlinks pointing to your site (especially if the anchor texts are too good to be true), this could be a competitor trying to get you slammed by Penguin. In this case, you can use Google’s new link disavow tool to tell Google that you don’t vouch for those suspicious links.
Just remember that, before you use that tool, you need to try to get those links removed and include evidence of your efforts in your reconsideration request. In any case, it’s better to check your site’s backlinks on a regular basis, so that you can detect a negative SEO attack early on.
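For reference, the disavow tool expects a plain-text file with one URL or domain rule per line; lines starting with “#” are comments. A minimal example (the domains below are made up for illustration):

```
# Links we asked site owners to remove on 2013-01-15; no response.
# Disavow a single page:
http://spammy-example.com/bad-links.html
# Disavow every link from an entire domain:
domain:link-farm-example.com
```

The `domain:` form is the safer choice when an entire site is spammy, since it covers links from every page on that domain.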
They say where there’s smoke, there’s fire. So the Panda and Penguin myths above must have had some grounding as well. But it’s good to remember that one should not rush to generalize from a single incident, and that consistent testing is required before drawing conclusions.