Technical SEO Audit Services in Manchester

Carrying out an in-depth technical SEO audit can be quite time-consuming, and yet you want to be certain there are no snags on your website that might hamper your climb to the top of Google’s search rankings.

In our experience, websites typically have a number of serious SEO issues that are not visible to a regular visitor with a browser. Audit My Website is an SEO company in Manchester backed by more than a decade of SEO experience from each member of our staff, and our technical audit is designed to pick up the details most other SEO reports miss.

The technical SEO audit includes all the reporting you get with our top-level SEO audit, plus a great deal of extra detail and analysis to highlight the much smaller issues on individual pages. So why is a technical SEO audit better?

  • Every page is tested to ensure it can be indexed without issues by search engines (a minimal version of this check is sketched after this list).

  • Checks for penalties caused by spam or duplicated content.

  • Validity check of the HTML and CSS (to W3C standards) to test whether every page on your site is fully W3C compliant. This report includes details of any issues found, on a page-by-page basis.

  • Backlink analysis – we will analyse and report on your top-performing backlinks, and identify which in particular are passing on the most SEO benefit. This can be vital for working out the best link-building strategies for your website going forwards.

  • Recommendations on the best ways to improve the speed and performance of your website.

  • Analysis of page titles and descriptions to ensure the optimal click-through rate (CTR): this report will highlight any meta descriptions and titles that can be improved to assist your SEO effort.

  • Analysis of website structure to ensure your pages and URLs are collectively offering the best benefit to your targeted keywords.

  • Test that your website has a valid SSL certificate, and that all resources are loaded over SSL, ensuring communication between your visitors and your website is secured against interception.
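To give a flavour of what the per-page checks involve, here is a minimal sketch in Python of the indexability, title/description-length and mixed-content tests mentioned in the list above. The PAGES list, the length thresholds and the choice of the requests and beautifulsoup4 packages are our own illustrative assumptions; a real audit covers far more signals than this:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder page list; a real audit would crawl the whole site.
PAGES = ["https://www.example.com/", "https://www.example.com/about/"]

for url in PAGES:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    # Indexability: HTTP status, meta robots tag and X-Robots-Tag header.
    meta_robots = soup.find("meta", attrs={"name": "robots"})
    robots = meta_robots.get("content", "") if meta_robots else ""
    robots += " " + resp.headers.get("X-Robots-Tag", "")
    if resp.status_code != 200 or "noindex" in robots.lower():
        print(f"{url}: may not be indexable (status {resp.status_code}, robots '{robots.strip()}')")

    # Title and meta description lengths as rough CTR heuristics.
    title = (soup.title.string or "").strip() if soup.title else ""
    meta_desc = soup.find("meta", attrs={"name": "description"})
    desc = meta_desc.get("content", "") if meta_desc else ""
    if not 30 <= len(title) <= 60:
        print(f"{url}: title is {len(title)} characters (roughly 30-60 reads best)")
    if not 70 <= len(desc) <= 160:
        print(f"{url}: description is {len(desc)} characters (roughly 70-160 reads best)")

    # Mixed content: resources still loaded over plain http.
    for tag, attr in (("img", "src"), ("script", "src"), ("link", "href")):
        for node in soup.find_all(tag):
            if node.get(attr, "").startswith("http://"):
                print(f"{url}: insecure resource {node.get(attr)}")
```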

If you have already had a top-level SEO report from us, then you already have the building blocks for good SEO; this detailed technical SEO audit will make sure that the smaller issues across the pages of your site are also identified.

Panda and Google – Panda-proof quality content

Quality content has long been a major focus of Google and of good SEO practice. Google’s Panda algorithm (named after Google engineer Navneet Panda) is a search filter originally introduced in 2011, designed specifically to combat spam in the form of poor-quality content. Google stated the change would affect 11.8% of its search results in the U.S., which at the time was a far higher impact than most of its other algorithm changes to date.

However, exactly what poor-quality content was being targeted here? In a press release around this time, Google explained:

we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites.

As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception. However, we can and should do better.

The key phrases here are “sites that copy others’ content and sites with low levels of original content” and “content farms, which are sites with shallow or low-quality content”. This set the SEO community in a bit of a spin: the scale and scope of the update seemed huge. At one end of the spectrum, Google was understandably targeting spammy content farms, but all websites lie somewhere along this continuum. Whether you realise it or not, most websites have some kind of duplicate content, whether that is a news site quoting politicians and officials, or a descriptive block of text within a technical specification that is common between different models of a product. Duplicated content can often be quite difficult to avoid while keeping your information accurate.

In this guide, we will detail the top things you need to check, and how you can ensure your content is Panda-proof for the future.

Identify and fix ‘thin’ content pages

What we really mean here is: pages that don’t have enough information to stand in their own right, while also including something that will negatively impact SEO. There are many ways users create pages like these (often by accident), so let’s first look at their typical characteristics:

  • Pages with little or no text, but lots of links. For example, if you use WordPress with categories and tags, each one you create also creates a new page, usually at mywebsite.com/tag/tagname or mywebsite.com/category/category-name.
  • Pages with no unique text (i.e. all the text on the page can be found either elsewhere on the same website, or elsewhere on the web). This can happen on listing pages, where the same titles and descriptions are used elsewhere, such as blog or news lists.
  • A detail page that doesn’t have enough information to justify a whole page. For example, if you run an ecommerce shop and have a single-sentence description for a product used when listing it, a user who clicks through to the product itself would hope for a more detailed description, not just the same sentence repeated on this page.

By the same token, it’s possible to create (for example) a blog post of only 100 words, which would be considered too thin in most cases.
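A rough way to flag thin pages programmatically is to combine a word count with the share of text that sits inside links. This is a minimal sketch, again assuming the requests and beautifulsoup4 packages; the 100-word and 50% thresholds are illustrative, not official limits:

```python
import requests
from bs4 import BeautifulSoup

def check_thin(url, min_words=100, max_link_ratio=0.5):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Total visible words vs words that live inside <a> tags.
    words = len(soup.get_text(" ", strip=True).split())
    link_words = sum(len(a.get_text(" ", strip=True).split()) for a in soup.find_all("a"))
    ratio = link_words / words if words else 1.0
    if words < min_words or ratio > max_link_ratio:
        print(f"{url}: possibly thin ({words} words, {ratio:.0%} of them in links)")

# e.g. an auto-generated WordPress tag page of the kind described above
check_thin("https://www.example.com/tag/tagname")
```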

Check for duplicated content

Ideally, all your pages should have entirely unique content, and in most cases there is really no excuse for plagiarising others’ copy (even if it is one of your suppliers offering a description of one of their products or services). There are some good tools out there, such as Copyscape, to ensure the copy on your website is unique.

However, there may be some scenarios where you have to use content that can also be found elsewhere on the web. In these cases, be sure to include enough of your own content to make it work. If you are quoting someone (and would like to use a specific paragraph verbatim), add your own analysis and breakdown of what it all means – plenty of unique content surrounding the quote. If the descriptions of products are very similar, can you add anything further, such as your own personal recommendations or experiences (in which scenario is one variant of the product better than another, for instance)?
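Tools such as Copyscape do this comparison across the whole web, but the underlying idea – measuring how many overlapping word “shingles” two texts share – is easy to sketch locally. A minimal illustration (the sample strings are placeholders):

```python
def shingles(text, n=5):
    """All runs of n consecutive words, lower-cased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b):
    """Share of word shingles the shorter text has in common with the other."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / min(len(sa), len(sb))

page_copy = "Our widget is a compact, rechargeable device ideal for travel."
supplier_copy = "The widget is a compact, rechargeable device ideal for travel use."
print(f"{overlap(page_copy, supplier_copy):.0%} of phrases shared")
```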

Ensure spelling and grammar are optimal

Google uses grammar, spelling and punctuation as some of the more important indicators of content quality. Ensuring that copy is well written, makes sense and flows with appropriate punctuation will help Google see it as high quality.

Be sure to avoid long lists of all kinds where possible. Google will penalise keyword lists, or long comma- (or bullet-point-) separated lists, and will always favour copy that flows well while still including the keywords you want to be listed for!
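Google doesn’t expose its quality checks, but you can screen your own copy before publishing. As one illustrative option, the open-source language_tool_python package (a wrapper around LanguageTool) can flag grammar and spelling issues; treat this as a sketch of one possible tool rather than a Google-endorsed method:

```python
import language_tool_python

# British English checker; the sample sentence is deliberately flawed.
tool = language_tool_python.LanguageTool("en-GB")
copy = "This are a example of copy with some issue's in it."
for match in tool.check(copy):
    print(f"{match.ruleId}: {match.message}")
```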

Check content keyword density and optimisation

Following on from the above point, you should also check that you have mentioned all the important keywords you want to be visible for. It is worth spending some time reviewing copy with a view to tweaking it to include more keywords, where doing so doesn’t significantly harm the readability and grammar (above).
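A quick way to review this is a simple keyword-frequency count over your copy. A minimal sketch; the keyword list, the sample copy and the idea of keeping each keyword to a small share of the text are our illustrative assumptions:

```python
import re
from collections import Counter

copy = ("Our Manchester team carries out a full technical SEO audit, "
        "checking every page of your website as part of the audit.")
target_keywords = ["seo", "audit", "manchester"]

# Tokenise the copy and count how often each target keyword appears.
words = re.findall(r"[a-z']+", copy.lower())
counts = Counter(words)
for kw in target_keywords:
    share = counts[kw] / len(words) if words else 0.0
    print(f"{kw}: {counts[kw]} mention(s), {share:.1%} of all words")
```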

Check for user spam and poor quality content

This can be trickier still to manage! If you have a blog, forum or any kind of system with open comments, be sure to check they don’t get abused. Spambots will often look for comment sections where they can post their own spam, with links back to whichever paymaster sent them! Generally, if you aren’t interested in open discussion, we recommend turning comments off, simply because of the problems user-generated spam can create and the time and effort needed to manage it properly. There are some great services, such as Akismet, that specialise in spam guards built specifically to block this.
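For illustration, here is a sketch of screening a submitted comment against Akismet’s REST comment-check endpoint before publishing it; the API key, site details and comment values are placeholders:

```python
import requests

API_KEY = "your-akismet-api-key"  # placeholder

resp = requests.post(
    f"https://{API_KEY}.rest.akismet.com/1.1/comment-check",
    data={
        "blog": "https://www.example.com/",        # your site
        "user_ip": "203.0.113.5",                  # commenter's IP
        "user_agent": "Mozilla/5.0",               # commenter's browser
        "comment_type": "comment",
        "comment_author": "visitor",
        "comment_content": "Great post! Buy cheap watches here...",
    },
    timeout=10,
)
# Akismet answers with a plain "true" (spam) or "false" (not spam).
print("spam" if resp.text.strip() == "true" else "looks clean")
```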

As with all things SEO, moderation is the key! If in doubt, simply ask yourself whether this type of page is “helpful to at least one type of person”. If so, then it’s likely to be fine. If, however, you find a page on your site with a long list of keywords but no information, it’s time to take action!

In reality, most pages will lie somewhere within this spectrum, and in all cases the quality of the content can be improved.