SEO Optimisation

SEO (Search Engine Optimisation) is the backbone of all online advertising; getting this right is vital to maximising your ROI and the effectiveness of your other digital advertising efforts.



Audit My Website are leading providers of SEO audits in the UK. We work with clients nationwide, offering vital website checks that ensure your SEO is optimal and that no barriers prevent your website from enjoying the best rankings and visibility possible. So what is needed for good SEO?

SEO (Search Engine Optimisation) Audit

The Art of Writing Good Web Content

The web is awash with advice on writing good content. So much advice, in fact, that you may get slightly different recommendations depending on who you speak to! Like winning an election, writing good copy isn’t about pleasing everyone all of the time – it’s about giving one specific core demographic what they want while keeping everyone else as happy as possible. Today we will detail some of the more universal guidelines for writing good copy that will help you get better engagement from your web visitors.

First: understand your visitors!

There are a couple of characteristics you can expect web readers to have that set them apart from other readers:

  1. Most of the time, they will scan the text, rather than read properly! Keep things concise and to the point. Make sure key information is clearly visible where needed.
  2. Visitors will typically behave like a hunting animal; they have arrived at your site in search of something specific, i.e. a product, a service, an idea, etc. They will not necessarily read all the text you give them, but will keenly ‘follow the scent’ to track down whatever they came for. This is why it’s always a good idea to reinforce your internal linking between blog posts, especially if they digress into different topics and trails.

Identify your topic

This may sound obvious: be specific about what you are writing about. As mentioned, the visitor is like a stalking animal; if you try to be all things to all people, you will confuse them and throw them off the scent! When reviewing visitor stats, you may find some topics suddenly attract quite a bit of interest – if so, could this become a ‘multi-part blog’ that continues to build on the successful topics you’ve written about?

Do your research

Try searching for the topics you plan on writing about to see what others have written. The first few results in Google are those deemed the most valuable (in terms of social sharing, backlinks and so on). Read these and see if you can identify what made them such a great answer to your search query. Are there any points you could build on further? Are there any additional angles you might write about?

Knowledge is universally appreciated, so be sure to research the topic you are writing about. Bring in references, facts and figures – whether the reader agrees with your points or not, your facts will stand in their own right, and you may even find them attracting links from others who are making points that rely on the same information.

Write original content

Do you offer something unique in information or style? In some cases it may simply be presenting a well-understood topic in a new light or with a unique spin. This has been especially important since Google’s Panda search update, designed to weed out ‘thin’ content and content farms. Be sure all your writing is your own and written in your own style!

Create a strong headline

An award-winning headline will help your content stand out. In almost all cases, the headline of your blog post is the first thing people will see (whether in the search results or via a link from another website). What you write here will affect your click-through rate (CTR) and also your bounce rate, and you can bring both SEO and creative writing skills to bear on it. Many find it best to jot down a handful of ideas, then mark off the strongest ones to refine further. So what makes a good headline? We recommend considering the ‘four U’s’ of headline writing:

  • Make it Unique – your headline should stand out in some way from all the others
  • Make it Ultra-specific – remembering the analogy of the hunting animal, your headline should be specific enough that the visitor can decide in an instant whether this is something they’re interested in or not.
  • Make it Useful – the headline should, in some way, give the reader a clue about what the page includes and let people know the benefits of reading.
  • Make it Urgent – your headline should put pressure on the reader not to risk missing out by skipping the post.

Make content actionable

Do you give visitors useful information they can apply right away? If the reader feels you are giving them valuable information throughout your blog which they can use, they are much more likely to continue reading.

The challenge is usually that, at the time of writing, you never truly know how your visitors will react. Following these guidelines will ensure you always write reasonably engaging copy, and you can never go far wrong. But as with every creative industry, your mileage will vary, so be sure to try new angles and explore the possibilities.

Web Analytics


In a retail environment, it’s easy to see how customers behave: their general flow, the ‘hotspots’ they create and the visibility of particular product placements, all of which allows the company to decide how to present the store. However, with a website you can’t see your visitors at all, so something more is called for to give you the same insights.

In 2005, Google rolled out Google Analytics, built on technology from Urchin Software Corp, which it had acquired in April of that year. It offered a lot more detail than most other tracking software of the time and has continued to become more feature-rich to this day. It is a technology we include on all websites we build today.

In this post, we will cut through the technical terms to explain what each area actually does and how it can be used effectively, giving you ideas to develop and enhance your digital marketing campaign.

Goal Tracking

Goal tracking is a useful technique which helps identify the important factors that determine the likelihood of a visit turning into an enquiry. By setting up a ‘goal’ you are telling the analytics what action you consider to be a ‘success’. If you run an online shop, this may be completing the checkout phase (i.e. entering credit card information and completing the purchase), or it may simply be completing a contact form, downloading a brochure or something similar. You can have as many goals as you wish, and they can be tied to almost anything that happens once a visitor is on your site.
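As a rough illustration, the sketch below shows one way a goal-worthy action (a contact form submission) might be recorded with Google Analytics’ gtag.js event call; the goal itself is then configured inside the Analytics interface. The form id and event parameters here are assumptions for the example, not part of any specific setup.

```typescript
// Hedged sketch: recording a "contact form submitted" action with gtag.js,
// which a goal/conversion can then be based on inside Google Analytics.
// Assumes the standard Analytics snippet is already installed on the page.
declare function gtag(command: "event", eventName: string, params?: Record<string, unknown>): void;

const form = document.querySelector<HTMLFormElement>("#contact-form"); // hypothetical form id

form?.addEventListener("submit", () => {
  // "generate_lead" is one of Google Analytics' recommended event names.
  gtag("event", "generate_lead", {
    form_id: "contact-form", // our own label, used later when filtering reports
  });
});
```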

Doing this opens up a new possibility: you can analyse the data collected, filter by the visits that reached the goal, and work back through how those visitors initially entered the site, what their path through it was, and any other information that helps identify patterns or factors which may have contributed to the goal being reached. This can also be tied into your other business reporting as part of your marketing automation efforts.

For instance, doing this may highlight that visits following links from your Twitter account are twice as likely to result in an enquiry, or that people looking for product A are much more likely to buy on their visit than people looking for product B. Once you have these kinds of insights you can update your website to reflect them: if people looking for product A are much more likely to enquire, is it worth having a banner advert on each page leading people directly to that product?

Split Testing (A/B Testing)

This is a technique that emerged as web analytics matured, and it is most useful on landing pages. By preparing two versions of a landing page, visitors can be delivered randomly to version 1 or version 2. Their behaviour is then closely measured using analytics to identify which version of the page has the highest conversion rate.

By repeating this test to tease out each factor that influences behaviour, you come to understand your customers’ behaviour better and can steadily improve the conversion rate of the landing page itself.
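The random assignment itself can be very simple. Below is a minimal sketch assuming a Node/Express-style handler; the route, cookie name and template filenames are illustrative assumptions, and your analytics would then segment conversions by the stored variant.

```typescript
// Minimal sketch of a 50/50 split on a landing page, assuming a Node/Express setup.
// The variant is stored in a cookie so a returning visitor keeps seeing the same version.
import express from "express";
import path from "path";

const app = express();

app.get("/landing", (req, res) => {
  // Reuse an existing assignment if the visitor already has one.
  let variant = req.headers.cookie?.match(/ab_variant=(A|B)/)?.[1];
  if (!variant) {
    variant = Math.random() < 0.5 ? "A" : "B";
    res.setHeader("Set-Cookie", `ab_variant=${variant}; Max-Age=2592000; Path=/`);
  }
  // Serve whichever version was assigned; analytics later compares conversion rates per variant.
  res.sendFile(path.join(__dirname, variant === "A" ? "landing-a.html" : "landing-b.html"));
});

app.listen(3000);
```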

Funnels

Once you have set up goals, funnels allow you to get even more detail on the path the visitor took before reaching the goal. The steps could be, for instance, arriving on the homepage, clicking ‘about product A’ and then visiting the ‘buy product A’ page. By defining each of these pages as funnel steps, you can analyse the drop-offs and exit pages where people did not follow the path you were expecting. This can highlight optimisation opportunities to make it more obvious how to navigate that path.

Funnels can also be used to produce a clear visualisation illustrating the flows and drop-offs at each step, helping you address the common reasons or places where people do not enquire.

Panda and Google – Panda-proof quality content

Quality content has long been a major focus of Google and of good SEO practice. Google’s Panda algorithm (named after Google engineer Navneet Panda) is a search filter originally introduced in 2011, designed specifically to combat spam in the form of poor-quality content. Google claimed the change would affect 11.8% of its search results in the U.S., which at that time was a far higher impact than most of its other algorithm changes to date.

But exactly what poor-quality content is being targeted here? In an announcement around this time, Google explained:

we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites.

As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception. However, we can and should do better.

The key phrases here are “sites that copy others’ content and sites with low levels of original content” and “content farms,” which are sites with shallow or low-quality content. This set the SEO community in a bit of a spin: the scale and scope of the update seemed huge. At one end of the spectrum, Google are understandably targeting spammy content farms, but all websites lie somewhere along this continuum. Whether you realise it or not, most websites will have some kind of duplicate content, whether this is a news website quoting politicians and officials, or a descriptive block of text within a technical specification that is common to different models of a product. Duplicated content can often be quite difficult to avoid while keeping your information accurate.

In our guide today, we will detail the top things you need to check and how you can ensure your content is Panda-proof for the future.

Identify and fix ‘thin’ content pages

What we really mean here is ‘pages that don’t have enough information to stand in their own right, while also including something that will negatively impact SEO’. There are many ways such pages get created (often by accident), so let’s first look at their typical characteristics. These could be:

  • Pages with little or no text, but lots of links, e.g. if you use WordPress with categories and tags, each one you create also creates a new page, usually at mywebsite.com/tag/tag-name or mywebsite.com/category/category-name
  • Pages with no unique text (i.e. all the text on the page can be found either elsewhere on the same website, or elsewhere on the web). This can happen on listing pages, if the same titles and descriptions are used elsewhere, such as on blog or news lists.
  • A detail page that doesn’t have enough information to justify a whole page, e.g. if you run an ecommerce shop and have a single-sentence description for a product, used when listing that product. When a user clicks through to the product itself, they would hope for a more detailed description, not just the same sentence repeated on its own page.

By the same token, it’s possible to create (for example) a blog post of only 100 words which would be considered too thin in most cases.
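As a very rough first pass, a short script can flag pages whose visible text falls below a word-count threshold. The sketch below is illustrative only: the URL list and the 250-word cut-off are assumptions rather than any published rule, and the pages it flags still need a human judgement call.

```typescript
// Hedged sketch: flag pages with very little visible text (candidate 'thin' pages).
// Runs under modern Node (18+) as an ES module; the URLs and threshold are examples.
const urls = [
  "https://example.com/blog/some-post",
  "https://example.com/tag/some-tag",
];
const MIN_WORDS = 250; // illustrative threshold, not an official figure

for (const url of urls) {
  const html = await (await fetch(url)).text();
  // Crude plain-text extraction: drop scripts/styles, strip tags, collapse whitespace.
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<style[\s\S]*?<\/style>/gi, " ")
    .replace(/<[^>]+>/g, " ");
  const words = text.split(/\s+/).filter(Boolean).length;
  if (words < MIN_WORDS) {
    console.log(`Possibly thin: ${url} (${words} words)`);
  }
}
```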

Check for duplicated content

Ideally, all your pages should have entirely unique content, and in most cases there are really no excuses for plagiarising others’ copy (even if it is one of your suppliers offering a description of one of their products or services). There are some good tools out there, such as Copyscape, to ensure the copy on your website is unique.

However, there may be some scenarios where you have to use content that can also be found elsewhere on the web. In these cases, be sure to include enough of your own content to make it work. If you are quoting someone (and would like to use a specific paragraph verbatim), add your own analysis and breakdown of what it all means, with plenty of ‘unique’ content surrounding it. If the descriptions of products are very similar, can you add anything further, such as your own recommendations or experiences of which scenarios suit one variant of the product better than another?
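For a quick in-house check alongside a dedicated tool like Copyscape, a crude shingle-based comparison can flag passages that are near-identical. The sketch below is illustrative only and far less thorough than a proper plagiarism service.

```typescript
// Hedged sketch: rough duplicate-content check using 5-word shingles and the Jaccard index.
function shingles(text: string, size = 5): Set<string> {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const out = new Set<string>();
  for (let i = 0; i + size <= words.length; i++) {
    out.add(words.slice(i, i + size).join(" "));
  }
  return out;
}

function similarity(a: string, b: string): number {
  const sa = shingles(a);
  const sb = shingles(b);
  let shared = 0;
  for (const s of sa) if (sb.has(s)) shared++;
  return shared / ((sa.size + sb.size - shared) || 1); // Jaccard index, 0..1
}

// Anything approaching 1.0 suggests the two passages are near-duplicates
// and need more unique copy around (or instead of) the shared text.
console.log(similarity("your product page copy...", "the supplier's original description..."));
```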

Ensure spelling and grammar are optimal

Google uses grammar, spelling and punctuation as some of the more important indicators of the quality of content. Ensuring that copy is well-written, makes sense and flows with the appropriate punctuation will help Google see it as high quality.

Be sure to avoid long lists of all kinds where possible. Google will penalise keyword lists, or long comma- (or bullet-point-) separated lists, and will always favour copy that flows well while still including the keywords you want to be listed for!

Check content keyword density and optimisation

Following on from the above point, you should also check that you have mentioned all the important keywords you want to be visible for. It is worth spending some time reviewing copy with a view to tweaking it to include more keywords, where doing so doesn’t significantly harm the readability and grammar discussed above.
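A very simple occurrence count, like the sketch below, can show at a glance which target phrases your copy mentions and which it misses entirely; the copy and keyword list here are placeholders, and the judgement about readability still has to be made by a human.

```typescript
// Hedged sketch: count how often each target phrase appears in a page's copy.
const copy = "Placeholder page copy about SEO audits and website checks...";
const keywords = ["seo audit", "website check", "page load speed"]; // illustrative targets

for (const kw of keywords) {
  // Non-overlapping, case-insensitive count of the phrase.
  const count = copy.toLowerCase().split(kw.toLowerCase()).length - 1;
  console.log(`${kw}: ${count} occurrence(s)`);
}
```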

Check for user spam and poor quality content

This can be trickier still to manage! If you have a blog, forum or any kind of system that includes open comments, be sure to check they don’t get abused. Spambots will often look for comment sections where they can post their own spam, with links back to whichever paymaster sent them! Generally, if you aren’t interested in discussing things openly with people, we recommend turning comments off, simply because of the level of user-generated spam this can create and the time and effort needed to manage it properly. There are some great services, such as Akismet, that specialise in providing spam guards specifically to block this.

As with all things SEO, moderation is the key! If in doubt, simply ask yourself whether this type of page is “helpful to at least one type of person”. If so, then it’s likely to be fine. If, however, you find a page on your site with a long list of keywords but no information, that is the time to take action!

In reality, most pages will likely lie somewhere within this spectrum and in all cases, the quality of the content can be improved.

Page Load Speed

Also referred to as ‘page load time’, page load speed is a measure of the time taken between the moment a user requests a web page and the moment that page has loaded in the user’s browser. The quicker the website is, the better the user experience and the more likely you are to rank higher as a result.

In April 2010, Google announced that page load speed would be used as a ranking signal, putting a little more pressure on both website owners and developers to deliver a faster web experience. So what does this involve? What can you do to improve the speed of your website’s pages if you are concerned they aren’t running as quickly as they should?

Fortunately, there are lots of useful tools to help analyse and test the speed of pages, while also offering some insight into how improvements might be made. Google’s PageSpeed Insights and YSlow are two good (free) services. Most web browsers also come with a developer panel that includes a ‘network’ tab: here the browser details, in sequence, what loaded and how long it took, which is another way to identify which parts of the page load quickly, which load slowly and where improvements can be made.
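Alongside those tools, the browser itself can report load timings via the standard Navigation Timing API. The sketch below, run in the browser console once a page has finished loading, shows the basic idea.

```typescript
// Sketch: reading page load timings from the browser's Navigation Timing API.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

if (nav) {
  console.log(`Time to first byte: ${Math.round(nav.responseStart - nav.startTime)} ms`);
  console.log(`DOM content loaded: ${Math.round(nav.domContentLoadedEventEnd - nav.startTime)} ms`);
  console.log(`Full page load:     ${Math.round(nav.loadEventEnd - nav.startTime)} ms`);
}
```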

There are really two sides to consider when looking to improve page load speed:

Server Side Solutions

To begin delivering the requested web page, the server must first put it together. If you run a content management system (such as WordPress or Umbraco) this involves reading the page’s data (and content) from a database or some kind of cache and placing it into a template, which is then served as the final page. This means that page load speed at this first stage depends on the quality of the website’s code, how streamlined and well optimised it is, and how easily it can obtain the information it needs for the page.

Some pages may require so much data that they will always be slow due to the amount of work needed from the server. Consider a news website on which users post articles and the homepage lists the latest 10 publications. To build this list, the website will need to (1) check every news item ever posted, (2) order these by date published and (3) take the top 10 of this list to display.

It isn’t easy or feasible to limit the scope of this query, since it needs to check everything, and in these scenarios it is best to employ some kind of server-side caching, which basically involves ‘keeping a copy’ of the result from the last time it was run.
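A minimal sketch of that idea is shown below, assuming a hypothetical fetchLatestArticles database query and a one-minute lifetime for the cached copy; real sites often use a dedicated cache such as Redis or the CMS’s built-in caching instead.

```typescript
// Hedged sketch of server-side result caching for the 'latest 10 articles' example.
// `fetchLatestArticles` stands in for the real (expensive) database query.
type Article = { title: string; publishedAt: Date };

declare function fetchLatestArticles(limit: number): Promise<Article[]>;

const TTL_MS = 60_000; // keep the cached copy for up to one minute (illustrative)
let cached: { articles: Article[]; storedAt: number } | null = null;

export async function latestArticles(): Promise<Article[]> {
  if (cached && Date.now() - cached.storedAt < TTL_MS) {
    return cached.articles; // cheap path: no database work at all
  }
  const articles = await fetchLatestArticles(10); // expensive query runs at most once per minute
  cached = { articles, storedAt: Date.now() };
  return articles;
}
```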

The other way server-side speeds can be improved is simply by adding more and better resources. Using a server with a faster processor and more memory will likely result in pages that load a lot faster. Similarly, if you are currently using a shared hosting account, this will typically be a slower experience: shared hosting means a server with many websites on it, so its availability is determined by how heavily other people’s websites are being used too. Shared hosting accounts tend to be slower than your own virtual private server or dedicated server; however, having your own server adds to the costs considerably.

Load balancing involves adding redundant resources so that the workload can be spread across all of them, rather than falling on just one. For instance, it’s common for big websites to have load-balanced databases, which means having at least two databases and setting the website up to decide, based on the level of usage at that moment, which database it will use for the current request.
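As a simplified illustration of that decision, the sketch below picks whichever read replica currently has the fewest requests in flight. The replica names and counters are assumptions, and production setups usually rely on a dedicated load balancer or the database’s own tooling rather than hand-rolled code like this.

```typescript
// Hedged sketch: choose the least-busy database replica for the current request.
const replicas = [
  { name: "db-replica-1", inFlight: 0 },
  { name: "db-replica-2", inFlight: 0 },
];

function pickReplica() {
  // Send the query to whichever replica has the fewest open requests right now.
  return replicas.reduce((least, r) => (r.inFlight < least.inFlight ? r : least));
}

const target = pickReplica();
target.inFlight++; // remember to decrement when the query completes
console.log(`Routing query to ${target.name}`);
```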

Client Side considerations

Once the server has assembled and sent the finished web page, there are several things your browser needs to do as a result. These include:

  • Loading all JavaScript, CSS and other assets referenced by the page
  • Fetching each image on the page
  • Rendering and displaying the page
  • Initialising and running the scripts contained within the page

To make things even more complicated, there will be some interdependency between these tasks, but not necessarily any sequential order developers can rely on. While the loading of images may begin immediately, the page may start displaying before they have finished loading – with images appearing a split second or so later. The browser makes a lot of on-the-fly decisions very quickly when managing this process, but there are several areas where you can help improve page load speed:

    • Encourage heavier browser caching of images – since image URLs are not likely to change, many will recommend using heavy client-side caching. This can be achieved in a number of ways: generally it is done through HTTP headers, though there are other methods too (see the sketch after this list). If using Apache, this can be set via the .htaccess file (or php.ini); if using Microsoft IIS, it can be set through the web.config file. By setting the maximum cache age for images to something high (such as several days) you can seriously reduce the work the browser needs to do between pages.
    • Bundling and minifying JavaScript and CSS – while this is technically a server-side feature, the browser reaps the benefit. Imagine loading a web page with 20 separate JavaScript files: each would need its own web request, on top of the first request for the page itself. This is a serious amount of legwork for the browser. By gathering all the CSS into a single file and all the JS into another, you reduce this to 2 additional requests besides the page itself. Minifying, i.e. removing spaces and any unnecessary text, further reduces the amount of work needed to send these files across the web. However, some code cannot be minified safely, so ensure you back up and test fully before going ahead – there is nothing worse than finding your site not looking great and having no backup to revert to!


    • Optimising images – reducing images to the maximum size actually used on your website, and turning up compression, can significantly reduce file sizes and therefore the amount of work (and time) needed to serve them.
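As one concrete (hedged) example of the caching point above, the sketch below sets a long-lived Cache-Control header on images using an Express static handler. The seven-day lifetime and folder paths are assumptions; on Apache or IIS the equivalent lives in .htaccess or web.config as described above.

```typescript
// Hedged sketch: long-lived browser caching of images via HTTP headers (Node/Express).
import express from "express";

const app = express();

// Serve /images with a Cache-Control header so browsers keep copies for up to 7 days.
app.use("/images", express.static("public/images", {
  maxAge: "7d",     // illustrative lifetime; tune to how often your images change
  immutable: true,  // safe when an image's URL changes whenever the file itself changes
}));

app.listen(3000);
```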


Each website’s requirements will be unique: some websites include many images (making it most important that they are well optimised), while others have fewer images and more scripts. Keep an open mind when profiling a website and checking for bottlenecks in the page load process.