Quality content has long been a major focus of Google and of good SEO practice. Google's Panda algorithm (named after Google engineer Navneet Panda) is a search filter originally introduced in 2011, designed specifically to combat spam in the form of poor-quality content. Google stated the change would affect 11.8% of its search results in the U.S., a far larger impact than most of its other algorithm changes to that date.
But what exactly counts as poor-quality content here? In an announcement around this time, Google explained:
we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites.
As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception. However, we can and should do better.
The key phrases here are "sites that copy others' content and sites with low levels of original content" and "content farms, which are sites with shallow or low-quality content". This set the SEO community in a bit of a spin: the scale and scope of the update seemed huge. At one end of the spectrum, Google is understandably targeting spammy content farms, but all websites lie somewhere along this continuum. Whether you realise it or not, most websites have some kind of duplicate content, whether that is a news site quoting politicians and officials, or a descriptive block of text in a technical specification that is shared between different models of a product. Duplicated content can often be difficult to avoid while keeping your information accurate.
In this guide, we will detail the top things you need to check and how you can ensure your content is Panda-proof for the future.
Identify and fix ‘thin’ content pages
By 'thin' content, we really mean pages that don't have enough information to stand in their own right, while also including something that will negatively impact SEO. There are many ways such pages get created (often by accident), so let's first look at their common characteristics. These could be:
- Pages with little or no text, but lots of links. For example, if you use WordPress with categories and tags, each one you create also creates a new page, usually at mywebsite.com/tag/tagname or mywebsite.com/category/category-name
- Pages with no unique text (i.e. all the text on the page can be found either elsewhere on the same website, or elsewhere on the web). This can happen on listing pages, where the same titles and descriptions are also used elsewhere, such as blog or news lists.
- A detail page that doesn't have enough information to justify a whole page. For example, if you run an ecommerce shop and use a single-sentence description when listing a product, a user who clicks through to the product itself will expect a more detailed description, not the same sentence repeated on its own.
By the same token, it’s possible to create (for example) a blog post of only 100 words which would be considered too thin in most cases.
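To make the checks above concrete, here is a minimal sketch of how you might flag thin pages programmatically. It uses only Python's standard library; the word-count and link-ratio thresholds are illustrative assumptions, not figures Google has published, so tune them to your own site.

```python
from html.parser import HTMLParser

class PageStats(HTMLParser):
    """Collects a rough visible word count and link count from raw HTML."""
    def __init__(self):
        super().__init__()
        self.words = 0
        self.links = 0
        self._skip = 0  # depth inside <script>/<style>, whose text isn't visible copy

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += 1
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words += len(data.split())

def is_thin(html, min_words=150, max_links_per_word=0.1):
    """Flag pages with little text, or lots of links relative to text.

    Thresholds are illustrative guesses, not published Google limits.
    """
    stats = PageStats()
    stats.feed(html)
    if stats.words < min_words:
        return True
    return stats.links / max(stats.words, 1) > max_links_per_word
```

Run this over a crawl of your own pages (e.g. from your sitemap) and manually review anything it flags, such as auto-generated tag and category pages.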
Check for duplicated content
Ideally, all your pages should have entirely unique content, and in most cases there is really no excuse for plagiarising others' copy (even if it is one of your suppliers offering a description of one of their products or services). There are some good tools out there, such as Copyscape, to help ensure the copy on your website is unique.
However, there may be some scenarios where you have to use content that can also be found elsewhere on the web. In these cases, be sure to include enough of your own content to make the page worthwhile. If you are quoting someone (and would like to use a specific paragraph verbatim), add your own analysis and breakdown of what it means, so there is plenty of unique content surrounding it. If your product descriptions are very similar to others', can you add anything further, such as your own recommendations or experiences (for instance, the scenarios in which one variant of the product is better than another)?
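Copyscape checks your copy against the wider web; for a rough local check between two blocks of your own copy (say, a supplier description versus your page), you can sketch something with Python's standard `difflib`. The example text below is made up for illustration.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough 0-1 similarity between two blocks of copy (whitespace-normalised)."""
    def norm(s: str) -> str:
        return " ".join(s.lower().split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

supplier = "This kettle boils a full litre in under three minutes."
ours = (
    "This kettle boils a full litre in under three minutes. "
    "We tested it daily for a month and the rapid-boil claim held up."
)
# A high ratio means most of your page is the shared description;
# adding your own analysis around a quote pushes the ratio down.
```

This is only a heuristic: `SequenceMatcher` measures character-level overlap, not meaning, so treat a high score as a prompt to add more of your own copy rather than a verdict.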
Ensure spelling and grammar are optimal
Grammar, spelling and punctuation are among the clearer indicators of content quality. Copy that is well-written, makes sense and flows with appropriate punctuation signals quality to both readers and Google.
Be sure to avoid long keyword lists of all kinds where possible. Google will penalise keyword stuffing, whether as long comma-separated or bullet-pointed lists, and will always favour copy that flows well while still including the keywords you want to rank for.
Check content keyword density and optimisation
Following on from the above point, you should also check that you have mentioned all the important keywords you want to be visible for. It is worth spending some time reviewing your copy with a view to working in more keywords, where doing so doesn't significantly harm the readability and grammar discussed above.
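A simple way to sanity-check this balance is to measure keyword density: occurrences of a keyword per 100 words of copy. There is no single 'correct' density, and this function is an illustrative sketch rather than how any SEO tool officially calculates it.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a (possibly multi-word) keyword per 100 words of copy."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    if not total:
        return 0.0
    kw = keyword.lower().split()
    hits = sum(
        1
        for i in range(total - len(kw) + 1)
        if words[i:i + len(kw)] == kw
    )
    return 100 * hits / total
```

A very low figure suggests the keyword barely appears; a very high one suggests the kind of stuffing Google penalises. Use it as a prompt to re-read the copy, not as a target to hit.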
Check for user spam and poor quality content
This can be trickier still to manage! If you have a blog, forum or any kind of system with open comments, be sure to check they don't get abused. Spambots will often look for comment sections where they can post their own spam, with links back to whichever paymaster sent them. Generally, if you aren't interested in open discussion, we recommend turning comments off, simply because of the volume of user-generated spam this can attract and the time and effort needed to manage it properly. There are some great services, such as Akismet, that specialise in providing spam guards to block this.
As with all things SEO, moderation is key! If in doubt, simply ask yourself whether the page is "helpful to at least one type of person". If so, it's likely fine. If, however, you find a page on your site with a long list of keywords but no real information, it's time to take action!
In reality, most pages will likely lie somewhere within this spectrum and in all cases, the quality of the content can be improved.