SEO (Search Engine Optimisation) provides the foundation of your digital advertising and is vital to maximising your return on investment and the effectiveness of your marketing strategy.

Audit My Website are leading providers of SEO audits in the UK. We work with clients nationwide, offering vital website checks that will ensure your SEO is optimal and that no barriers prevent your website from enjoying the best rankings and visibility possible. So what is needed for good SEO?

SEO (Search Engine Optimisation) Audit

How To Ensure Content Is Mobile Optimised

If you’ve been following news in the SEO industry lately, hopefully your website is already mobile responsive. But is the content mobile optimised? If the content of your site isn’t optimised for mobile, it could be adversely affecting both user experience (UX) and conversion rates.


In April 2015, Google rolled out a major global algorithm update which the media quickly dubbed ‘Mobilegeddon’. This update meant that mobile optimisation would be used as a ranking factor for mobile searches, so that websites that were mobile responsive would naturally rank above an otherwise equivalent site that was not ready for mobile users.

So the rush to make sites mobile responsive began, immeasurably improving the small-screen experience for mobile internet users everywhere. Yet this development work answers only part of the question: bespoke, mobile-optimised content is also required for a positive user experience and potentially greater conversion rates.

If you have never considered mobile content, this article covers all the advice you need to make a big difference with the smaller displays.


As you’re probably already aware, reading content on a mobile device is completely different to reading it on a desktop. In 2006, a study by Jakob Nielsen found that people read web pages in an F-shaped pattern consisting of two horizontal stripes followed by a vertical stripe. A separate eye-tracking study by Briggsby found that mobile users view web content primarily in the centre of the screen, with 86% of their attention on the upper two-thirds. In addition, a US National Institutes of Health study found that internet users’ eyes are drawn more to images than text, demonstrating that the placement of images and text is crucial if you want your content to stand out!

When optimising your content for mobile, it’s usually best to reduce the number of images that take up screen space and bandwidth so that focus may be primarily on the information within your text. Taking into account those studies mentioned, it would be best to ensure your most important content features in the top two-thirds of the page if you are keen to give this information prominent visibility.


While tone of voice is critical both to engaging readers and to ensuring your content reinforces your brand identity, mobile users also have a slightly different need.

In addition to having a limited-size display, a mobile user is much more likely to be out and about travelling, or perhaps in a much more casual setting. In either case, keep text short and snappy. Get to the point: if your company ‘about us’ page is too long, a mobile user may be happiest with just the basic information, i.e. contact address, phone number (with a tap-to-call link) and email. You can still link to further testimonials or information about the product or service as needed.

This is probably best all round, as a 2015 study by Microsoft suggested that readers’ attention spans are, in general, dropping.

It’s also a good idea to keep paragraphs as short as possible, even to the point where each sentence becomes a new paragraph (although this is very subjective). Three or four sentences may look fine on a desktop, but on a mobile they can quickly become a wall of text!


It’s important to consider what your mobile users need from the site: are their needs the same as those using desktops? How do they differ?

A great example of this is the difference between the Trainline desktop and mobile offerings – you’ll notice the desktop offers a media-rich, aesthetically pleasing page using the full width, whereas the mobile version leads with the all-important search box that starts a user’s journey to finding a suitable train and booking tickets.

While responsive design may be vital to ensuring your site both complies with Google’s mobile algorithm and offers your visitors a good user experience, to increase traffic and conversion rates you need to ensure your content itself is fully optimised.

If you have any concerns or questions about your mobile content and want to ensure your website is fully mobile optimised, why not get in touch with our team today?

Why Is SEO Important for Businesses?

So you’re writing some fantastic, high quality content and you’re regularly posting it to your social media channels. Yet despite everything, you still don’t see your website at the top of Google. What could you possibly be doing wrong?

The most likely reason is that you haven’t fully optimised your content for search engines. Yes, the content itself may be great, but if other bloggers are writing mediocre content that is fully optimised, they will always have a slight edge and a better chance of ranking near the top.

So here are the top reasons why your business needs to get on board with search engine optimisation (SEO) and how your other marketing efforts can benefit.


SEO is a form of inbound marketing. It’s preferable to many types of offline advertising because it continues to offer rewards for the effort you put into SEO content writing long after the blog post is published. It can also work out more cost-effective than other forms of online marketing such as Pay-Per-Click (PPC) advertising, social media marketing and email marketing. But perhaps best of all, you want to be enjoying the best visibility possible for the content you have spent so much time writing.

While PPC and social media may boost your revenue and showcase your brand image, it’s SEO that remains the backbone of your online presence, constantly working hard behind the scenes and marrying seamlessly into your other digital marketing efforts.

SEO is also favourable because it’s largely invisible and sympathetic to searchers’ needs: if you have optimised your site and blog post well, hopefully it will rank for the types of questions a searcher might enter to find an answer. You become a kind of answer bank, and in this regard your aims and Google’s become the same. Separate studies have also shown that people are more willing to trust the quality and accuracy of organic search results over paid (or sponsored) Google AdWords.

SEO does the hard part for you by getting potential customers through the door; it’s then your job to convince them that you’re the best business to fulfil their needs.


In the SEO world, it’s no secret that Google rules. The search engine has earned credibility and trust in the results it delivers. It’s a natural assumption that the businesses at the top of the SERPs are the most relevant and trustworthy for the searcher’s phrase, while those at the bottom are probably less so.

But what helps Google trust your website? As well as considering the keywords you want to rank for, you have to consider things like backlinks, links to other websites, traffic volumes, easy and user-friendly page navigation, ensuring none of your website’s pages contain errors, and more recently, how fast your pages load, SSL certificates and more.

Keep on top of all these factors and you’ll stand in good stead to beat the competition, by showing you’re a credible company with a trustworthy website.


With an unoptimised site, customers will find it more difficult to find you compared with the same site fully optimised. It will prove particularly difficult to attract those who didn’t already know they needed your products and services.

This is why keyword research is a vital step to ensuring your blog post can attract the type of traffic looking for the content you are providing. It gives you a straw poll of the searches people make when trying to answer the sort of questions you may be writing about. Like many forms of advertising, it’s based on both intuition and educated guesswork. While you may feel you know all the phrases people might search for, there will always be that one person out there who would say: “That’s funny, I would have searched for … instead.” And they are not wrong, which is why keyword research is necessary: both to analyse the search volume on the phrases you know about, and to highlight any alternative keyword opportunities you may not have considered when writing the post in the first place!

To put this into another context, if you had a high-street store selling discounted mountaineering equipment, you’re much more likely to be successful opening it in the Lake District, where there is a constant footfall of your target customers, than opening it on London’s Bond Street next to a Prada and simply assuming people will come inside.

ANALYSE, TWEAK AND REPEAT (it never stops!)

SEO is a long term commitment and in order to see true success from it you need to continue monitoring and tweaking your approach as you would with any other marketing strategy.

Google Analytics is an invaluable SEO tool allowing historic and real-time analysis of traffic to your site. It gives you all the detail you need to identify sales funnels and your visitors’ behaviour, including how they found you and, in some cases, what they searched for. It is not limited to organic searches either, but will also give you a detailed analysis of any active PPC campaigns you may have.

So don’t lose out on customers simply because you don’t fully understand search engine optimisation, or assume it takes more time and money than you have to spare to get it off the ground. Some things are intuitive and, as the site owner, you may be best placed to understand your searchers’ habits and needs.

Our team of SEO experts can conduct a complete SEO audit of your website and identify what tweaks can offer a world of difference to your rankings and site traffic. If you are interested in ramping up traffic, increasing your sales leads and enjoying the full benefit of the content you’ve carefully written, you should get in touch with us today!

Web Analytics

In a retail environment, it’s easy to see how customers behave: their general flow, the ‘hotspots’ and the visibility of particular product placements, allowing the company to decide how best to present the store. With a website, however, you can’t see your visitors at all, so something more is called for to give you the same insights.

In November 2005, Google rolled out Google Analytics, built on technology from Urchin Software Corp, which it had acquired in April 2005. It offered far more detail than most other tracking software of the time and has continued to become more feature-rich to this day. It is a technology we include on all websites we build today.

In this post, we will cut through the technical terms to explain what each area actually does and will detail how these can be used effectively to give you ideas you can use to develop and enhance your digital marketing campaign.

Goal Tracking

Goal tracking is a useful technique which helps to identify the important factors that determine the likelihood of a visit turning into an enquiry. By setting up a ‘goal’ you are telling the analytics what action you consider to be a ‘success’. If you run an online shop, this may be completing the checkout phase (i.e. entering credit card information and completing the purchase), or it may simply be completing a contact form or downloading a brochure. You can have as many goals as you wish, and these can be tied to almost anything that happens once a visitor is on your site.

Doing this opens up a new possibility: you can analyse the data collected, filter by those who reached the goal, and work back to how they initially entered the site, what their path through it was, and any other information that helps identify patterns or factors which may have contributed to the goal being reached. This can also be tied into your other business reporting as part of your marketing automation efforts.

For instance, doing this may highlight that visits following links from your Twitter account are twice as likely to result in an enquiry, or that people looking for product A are much more likely to buy on their visit than people looking for product B. Once you have these kinds of insights you can update your website to reflect them: if people looking for product A are much more likely to enquire, is it worth having a banner advert on each page leading people directly to this product?
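
To make the idea concrete, here is a minimal sketch (in Python, using entirely hypothetical session data and made-up conversion numbers) of the kind of ‘conversion rate by traffic source’ breakdown a goal report gives you:

```python
from collections import defaultdict

def conversion_rate_by_source(sessions):
    """sessions: iterable of (traffic_source, reached_goal) pairs."""
    totals = defaultdict(int)
    wins = defaultdict(int)
    for source, reached_goal in sessions:
        totals[source] += 1
        if reached_goal:
            wins[source] += 1
    return {source: wins[source] / totals[source] for source in totals}

# Entirely hypothetical sessions: here Twitter referrals convert twice as often.
rates = conversion_rate_by_source([
    ("twitter", True), ("twitter", True), ("twitter", False), ("twitter", False),
    ("organic", True), ("organic", False), ("organic", False), ("organic", False),
])
```

In practice an analytics package does this aggregation for you; the value is in acting on the differences the numbers reveal.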

Split Testing (A/B Testing)

This is a technique made practical by modern web analytics and is most useful on landing pages. By preparing two versions of a landing page, users can be delivered randomly to version 1 or version 2. Their behaviour is then closely measured using analytics to identify which version of the page has the higher conversion rate.

By repeating this test to tease out each factor that influences behaviour, you understand your customers’ behaviour better and can improve the conversion rate of the landing page itself.
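
A minimal sketch of the mechanics, assuming a hypothetical hash-based bucketing scheme (so a returning visitor always sees the same version) and made-up visit numbers:

```python
import random

def assign_variant(visitor_id, experiment="landing-page-test"):
    # Deterministic bucketing: seeding with experiment + visitor id means
    # the same visitor is always shown the same version.
    return "A" if random.Random(f"{experiment}:{visitor_id}").random() < 0.5 else "B"

def best_variant(results):
    """results maps variant -> (visits, conversions); returns the higher-converting one."""
    rates = {v: conversions / visits for v, (visits, conversions) in results.items()}
    return max(rates, key=rates.get), rates

# Made-up numbers: version B converts 4.5% of visits vs 3.0% for version A.
best, rates = best_variant({"A": (1000, 30), "B": (1000, 45)})
```

Real tests should also check the difference is statistically significant before declaring a winner, rather than comparing raw rates.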


Funnels

Once you have set up goals, funnels allow you to get even more detail on the path the visitor took before reaching the goal. The steps could be (for instance) arriving on the homepage, clicking ‘about product A’, then a page ‘buy product A’. By defining each of these pages as funnel steps, you can then analyse the ‘drop-offs’ and ‘exit pages’ where people did not follow the path you were expecting. This can highlight optimisation opportunities to make it more obvious how to navigate this path.

Funnels can also be used to produce a visualisation that illustrates the flows and drop-offs at each step, helping you address the common reasons, or places, people do not enquire.
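
The arithmetic behind a funnel report is simple. Here is a sketch (in Python, with invented visitor counts for the three steps described above) of how drop-off rates are derived:

```python
def funnel_report(step_counts):
    """step_counts: visitors reaching each ordered funnel step,
    e.g. homepage -> 'about product A' -> 'buy product A'."""
    report = []
    for i in range(1, len(step_counts)):
        prev, cur = step_counts[i - 1], step_counts[i]
        # drop_off is the share of visitors lost between this step and the last.
        report.append({"step": i, "reached": cur, "drop_off": 1 - cur / prev})
    return report

# Made-up numbers: 1000 homepage visits, 400 view the product, 120 reach checkout.
report = funnel_report([1000, 400, 120])
```

A step with an unusually high drop-off rate is the first place to look for confusing navigation or a missing call to action.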

Panda and Google – Panda-proof quality content

Quality content has long been a major focus of Google and good SEO practice. Google’s Panda algorithm (named after Google engineer Navneet Panda) is a search filter originally introduced in 2011, designed specifically to combat spam in the form of poor-quality content. Google said it would impact 11.8% of its search results in the U.S., which at the time was a far bigger impact on results than most of its other algorithm changes to date.

But exactly what poor-quality content is being targeted here? In a press release around this time, Google explained:

we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites.

As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception. However, we can and should do better.

The key terms here are “sites that copy others’ content and sites with low levels of original content” and “content farms, which are sites with shallow or low-quality content”. This set the SEO community in a bit of a spin: the scale and scope of the update seemed huge. At one end of the spectrum, Google are understandably targeting spammy content farms, but all websites lie somewhere along this continuum. Whether you realise it or not, most websites have some kind of duplicate content, whether that’s news websites quoting politicians and officials, or a descriptive block of text within a technical specification that is common between different models of a product. Duplicated content can often be quite difficult to avoid while keeping your information accurate.

In our guide today, we will attempt to detail the top things you need to check and how you can ensure your content is Panda proof for the future.

Identify and fix ‘thin’ content pages

What we really mean here is: pages that don’t have enough information to stand in their own right, while also including something that will negatively impact SEO. There are many ways such pages get created (often by accident), so let’s first look at their characteristics. They could be:

  • Pages with little or no text, but lots of links, e.g. if you use WordPress with categories and tags, each one you create also creates a new page, which will usually contain little more than a list of linked post titles
  • Pages with no unique text (i.e. all the text on the page can be found either elsewhere on the same website or elsewhere on the web). This can happen on listing pages, where the same titles and descriptions are used elsewhere, such as blog or news lists
  • A detail page that doesn’t have enough information to justify a whole page, e.g. if you run an ecommerce shop and have only a single-sentence description for a product, shown when listing it. A user clicking through to the product itself would hope for a more detailed description, not the same sentence repeated on its own page

By the same token, it’s possible to create (for example) a blog post of only 100 words which would be considered too thin in most cases.
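
A rough screening pass over your own pages can flag candidates for review. This sketch uses arbitrary thresholds of our own choosing (they are not official Google numbers) and assumes you have already extracted the visible body text and link texts from each page:

```python
def looks_thin(body_text, link_texts, min_words=250, max_link_share=0.5):
    """Flag a page as 'thin': too few words overall, or text dominated by links.
    Thresholds are arbitrary starting points for a manual review, not rules."""
    words = body_text.split()
    link_words = sum(len(t.split()) for t in link_texts)
    if len(words) < min_words:
        return True
    return link_words / len(words) > max_link_share

# A WordPress tag-archive page: almost no copy, just linked post titles.
is_thin = looks_thin("Posts tagged 'google'", ["Post one", "Post two", "Post three"])
```

Anything flagged should be reviewed by a human: some short pages (a contact page, say) are perfectly legitimate.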

Check for duplicated content

Ideally, all your pages should have entirely unique content, and in most cases there is really no excuse for plagiarising others’ copy (even if it is one of your suppliers offering a description of one of their products or services). There are some good tools out there, such as Copyscape, to ensure the copy on your website is unique.

However, there may be some scenarios where you have to use content that can also be found elsewhere on the web. In these cases, be sure to include enough of your own content around it. If you are quoting someone (and would like to use a specific paragraph verbatim), add your own analysis and breakdown of what it all means – plenty of unique content surrounding it. If the descriptions of products are very similar, can you add anything further, such as your own recommendations or experience of which variant of the product is better in a given scenario?
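
Duplicate-detection tools typically compare overlapping word sequences (‘shingles’) between documents. This is a toy version of the idea, not how Copyscape itself works, but it illustrates why a quoted paragraph surrounded by plenty of your own analysis scores as far less duplicated than a wholesale copy:

```python
def shingles(text, n=5):
    """Break text into overlapping n-word sequences ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard overlap of two pages' shingle sets: 0.0 = no overlap, 1.0 = identical."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Pages scoring close to 1.0 against another page (on your site or elsewhere) are the ones to rewrite or expand first.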

Ensure spelling and grammar are optimal

Grammar, spelling and punctuation are among the indicators Google uses to judge the quality of content. Ensuring that copy is well written, makes sense and flows with appropriate punctuation will help Google see it as high quality.

Where possible, avoid long lists of all kinds. Google will penalise keyword lists, or long comma- (or bullet-point-) separated lists, and will always favour copy that flows well while still including the keywords you want to rank for!

Check content keyword density and optimisation

Following on from the above point, you should also check that you have mentioned all the important keywords you want to be visible for. It is worth spending some time reviewing copy with a view to tweaking it to include more keywords, where doing so doesn’t significantly harm readability or grammar (see above).
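
A simple density check helps keep this honest: if a keyword accounts for a large share of the copy, you have probably over-stuffed it. A minimal sketch (the “healthy” range is a judgement call, not a published figure):

```python
import re

def keyword_density(text, keyword):
    """Share of the copy taken up by a (possibly multi-word) keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return hits / max(len(words), 1)

density = keyword_density("Our SEO audit covers every SEO basic.", "seo")
```

Use it as a sanity check in both directions: a density of zero means you forgot the keyword entirely; a very high one suggests the copy no longer reads naturally.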

Check for user spam and poor quality content

This can be trickier still to manage! If you have a blog, forum or any kind of system with open comments, be sure to check they don’t get abused. Spambots will often look for comment sections where they can post their own spam, with links back to whichever paymaster sent them! Generally, if you aren’t interested in open discussion, we recommend turning comments off, simply because of the problems user-generated spam can create and the time and effort needed to manage it properly. There are some great services, such as Akismet, that specialise in spam guards specifically designed to block this.

As with all things SEO, moderation is the key! If in doubt, simply ask yourself if this type of page is “helpful to at least one type of person”. If so, then it’s likely to be fine. If however, you find a page in your site with a long list of keywords but no information, this is the time to take action!

In reality, most pages will likely lie somewhere within this spectrum and in all cases, the quality of the content can be improved.

Page Load Speed

Also referred to as ‘page load time’, page load speed is a measure of the time between the moment a user requests a web page and the moment that page is loaded in the user’s browser. The quicker the website, the better the user experience and the more likely you are to rank higher as a result.

In April 2010, Google announced that page load speed would be used as a ranking signal, putting a bit more pressure on website owners and developers to deliver a faster web experience. So what does this involve? What can you do to improve the speed of your pages if you are concerned they aren’t running as quickly as they should?

Fortunately, there are lots of useful tools to help analyse and test the speed of pages, while also offering some insight into how improvements might be made. Google’s PageSpeed Insights and YSlow are two good free services. Most web browsers also come with a developer panel including a ‘network’ tab, where the browser details, in sequence, what loaded and how long it took. This is another way to identify which parts of the page load quickly, which load slowly and where improvements can be made!

There are really two sides to consider when looking to improve page load speed:

Server Side Solutions

To begin delivering the requested web page, the server must first put it together! If you run a content management system (such as WordPress or Umbraco), this will involve reading the page’s data (and content) from a database or some kind of cache and merging it into a template, which is then served as the final page. Page load speed in this first instance will therefore depend on the quality of the website’s code, how streamlined and well optimised it is, and how easily it can obtain the information it needs for the page.

Some pages may require so much data that they will always be slow, due to the amount of work required. Consider a news website in which users post articles, and which lists the latest 10 on the homepage. To build this, the website will need to (1) check every news item ever posted, (2) order these by date published, and (3) take the top 10 of this list to display.

It isn’t always feasible to limit the scope of this query, since it needs to consider everything, and in these scenarios it is best to employ some kind of server-side caching, which basically means ‘keeping a copy’ of the result from the last time it was run.
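
A minimal sketch of that ‘keep a copy’ idea, written in Python with a hypothetical `latest_articles` query standing in for the expensive database work. The result is reused until it is older than the time-to-live (TTL), after which the query runs again:

```python
import time

class TTLCache:
    """'Keep a copy' of an expensive result for `ttl` seconds before re-running it."""

    def __init__(self, ttl=60, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock
        self._store = {}  # key -> (value, stored_at)

    def get_or_compute(self, key, compute):
        entry = self._store.get(key)
        if entry is not None and self.clock() - entry[1] < self.ttl:
            return entry[0]  # fresh enough: serve the cached copy
        value = compute()    # stale or missing: run the expensive query again
        self._store[key] = (value, self.clock())
        return value

# Hypothetical expensive query: fetch the 10 newest articles from the database.
def latest_articles():
    return ["article-%d" % i for i in range(10)]

cache = TTLCache(ttl=60)
homepage_items = cache.get_or_compute("latest-10", latest_articles)
```

The trade-off is freshness: with a 60-second TTL, a newly posted article may take up to a minute to appear on the homepage, which is usually an acceptable price for the speed gain.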

The other way server-side speeds can be improved is by simply adding more and better resources. A server with a faster processor and more memory will likely serve pages a lot faster. Similarly, a shared hosting account (a server with many websites hosted on it, whose responsiveness is determined partly by how heavily other people’s websites are used) will typically be a slower experience than your own virtual private server or dedicated server; however, having your own server adds to the costs considerably.

Load balancing involves having extra redundancy of resources so that the workload can be spread across all of them, rather than just one. For instance, it’s common for big websites to have load-balanced databases: at least two databases, with the website set up to decide, based on the level of usage at that moment, which database it will use for the current request.

Client Side considerations

Once the server has assembled and sent the finished web page to you, there will be several new things your browser will now need to do as a result of this. These include:

  • Loading all JavaScript, CSS and other assets referenced by the page
  • Fetching each image on the page
  • Rendering and displaying the page
  • Initialising and running the scripts contained within the page

To make things even more complicated, there will be some interdependency between these tasks, but not necessarily any sequential order developers can rely on. Whilst the loading of images may begin immediately, the page may start displaying before these have finished loading – with images being shown a split second or so later. The browser will make a lot of on-the-fly decisions very quickly when managing this process, but there are areas that can help to improve page load speed:

    • Encourage heavier browser caching of images – since image URLs are not likely to change, many will recommend heavy client-side caching. This is generally done through HTTP headers: if using Apache, it can be configured via the .htaccess file; if using Microsoft IIS, through the web.config file. By setting the maximum cache age for images to something high (such as several days) you can seriously reduce the work the browser needs to do between pages.
    • Bundling and minifying JavaScript – while this is technically a server-side feature, the browser reaps the benefit. Imagine loading a web page with 20 separate JavaScript files: each needs its own web request, on top of the first request for the page itself. That is a serious amount of legwork for the browser. By gathering all the CSS into one file and all the JS into another, you reduce this to 2 additional requests besides the page itself. Minifying (removing spaces and any unnecessary text) further reduces the amount of data sent across the web; however, some code cannot be safely minified, so ensure you back up and test fully before going ahead – nothing worse than finding your site not looking great and no backup to revert to!


  • Optimising images by reducing their dimensions to the maximum size used on your website, and turning up compression, can significantly reduce file sizes and therefore the amount of work (and time) needed to serve them.
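
To show what minification actually does, here is a deliberately simplified sketch in Python. Real build tools (and proper CSS/JS minifiers) handle many edge cases this toy ignores, which is exactly why the advice above says to back up and test:

```python
import re

def minify_css(css):
    """Toy CSS minifier: strip comments and collapse whitespace.
    Illustrative only - real minifiers handle strings, url() values, etc."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten around punctuation
    return css.strip()

minified = minify_css("body {\n  color: red; /* brand colour */\n}")
```

Every byte removed this way is a byte the server doesn’t have to send and the browser doesn’t have to parse, on every single page view.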


Each website’s requirements will be unique. While some websites include many images (making it most important that they are well optimised), others have fewer images or more scripts, so keep an open mind when profiling a website and checking for bottlenecks in the page load process.