Simon Steed


Your Pre-Launch Website Checks

Launching a new website is both exciting and stressful! The project is complete and you have a great new site, but there can be so many things for everyone to check that it can be hard to know where to start. How can you be sure nothing gets missed and everything went as planned?

With this in mind, we have produced a basic website launch checklist. Depending on your requirements, it may be worth building your own website launch plan: a list of steps, and a schedule for checking each, based on what you need. Without further ado, here are our top points to check when launching a new website:

Post Development Checks

Once the core functionality of the site has been tested, it’s time to take that step back and review everything in its entirety. Only here do the really small things start becoming visible.

  • Check for Broken Links: Even the smallest websites will often have many URLs, and checking each can be a challenge in itself. Even a few broken links can hold your website back in the search rankings and have a negative impact on your visitors’ experience. Luckily, there are some great tools, such as Link-Assistant’s WebSite Auditor, that will spider the site and log any crawl errors such as broken links (4xx) or pages that give errors (5xx). There is also a good free tool called Xenu.
  • Custom Not Found and Error Pages: Even when things don’t go according to plan once the site is launched (such as visitors following dead links, or mistyping a web address), you still want to give visitors the best experience possible. Custom not found / error pages not only tidy up the look and feel of your site, they also reinforce your brand and help create a professional experience.
  • Check URLs are search engine friendly and fully indexable: Do you link to all parts of the website? Are all pages either linked or included in the sitemap? Are there any pages with query strings that may be harder for search engines to index, and could those links be rewritten to look like a full URL? It’s worth checking that all your content is accessible.
  • Check your site displays properly: It’s fine having a site that indexes well and handles error pages and broken links, but if you have forgotten to check all the pages across the main browsers and mobile devices, you may be in trouble. Open all the pages, navigate through the site and ensure everything displays as it should. Be particularly careful when testing on mobiles and tablets; we have seen content disappear completely on mobile more times than we can count. Check, check and check again.
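For the curious, the core of such a link check can be sketched in a few lines of Python using only the standard library. A dedicated tool does far more (crawl queues, politeness, reporting), but the idea is simply “collect every link, then request it and record the status”:

```python
# Minimal broken-link spot check: extract hrefs from HTML, then fetch each
# one and record the HTTP status. 4xx/5xx codes indicate problem pages.
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

class LinkCollector(HTMLParser):
    """Collects every href found in anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_link(url, timeout=10):
    """Return the HTTP status for a URL, or None if it could not be reached."""
    try:
        with urlopen(url, timeout=timeout) as response:
            return response.status
    except HTTPError as err:      # 4xx / 5xx responses
        return err.code
    except URLError:              # DNS failure, refused connection, etc.
        return None

collector = LinkCollector()
collector.feed('<p><a href="/about">About</a> <a href="https://example.com">Ext</a></p>')
print(collector.links)  # ['/about', 'https://example.com']
```

In practice you would feed each crawled page into the collector, resolve relative links against the page URL, and log any `check_link` result of 400 or above.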

SEO Checklist (Site Content & Structure)

Content and structure are hard to separate; they are two sides of the same coin. You want to ensure your site’s URL structure is optimal for SEO, and in turn, this will also depend on what content you have on those pages.

  • Keyword research and competitor research: This vital first step helps you identify your website’s target keywords and priorities. Whether you use the Google Keyword Planner or another tool, grouping the different keywords by purpose and testing the level of search volume and competition for each provides the foundation for all other SEO activities.
  • Structure: Guided by the results of the previous step, you can now take a look at your website structure and see whether the pages allow a visitor flow that mirrors the purpose of those keywords. It is probably best to think of these as grouped keywords, with each group bound to a particular section of the website.
  • Content optimised: In addition to structure, it’s important to check whether the content itself is well written and optimised. Does it read properly and convey your message to your target audience in the way you need it to? Does it contain all the important information they will be interested in? Is it logically structured? Some people may visit your site with a very specific requirement; for this reason it’s always a good idea to break up the pages based on visitors’ needs.
  • Check for duplicate content: Duplicate content can hurt your SEO efforts and, annoyingly, detecting it can be tricky. There are some great duplicate content detection services such as Copyscape which allow you to paste in some copy, and it will identify whether the same copy is found elsewhere on the web. Another way to test this is to copy a whole paragraph into the Google search bar and hit enter. Hopefully, your web page will be the only one that ranks for this full-paragraph search (assuming it has been indexed); if not, then either you or someone else has copied the content.
  • Check your internal link strategy: Once you have your overall content and structure, you can review your internal links, which really define the relationship between your content and structure. This should be aimed not only at giving you more links between pages on your site, but also at including more keywords within these links to help quietly promote various sections of the site for particular topics. It should also take visitors directly to the information they came for. If you feel that one type of visitor may also be interested in another product or page, include a link in the content!
  • Check robots.txt and sitemap.xml: Make sure these are present and correct and that they allow search engines to fully index all the pages you need. If you have a content management system (CMS), don’t feel the need to list each individual admin link in the robots.txt file, as this can volunteer sensitive information to attackers by revealing your admin URL structure and files.
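For illustration, a minimal robots.txt along these lines might look as follows (the domain is a placeholder). Note that a single Disallow rule covers the whole admin area without listing individual files:

```
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```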

Website Optimisation Checklist

  • Page load speed: This is a growing area of importance since Google announced they would take page load speed into account as a ranking signal for websites. Since then, there has been a bit of a mad scramble to see how many different ways websites can be made faster. Luckily, there are also some great tools to help test the speed of your pages. Pingdom has a page load speed test, and Google (via the Webmaster Dashboard) also includes its own testing tool, both of which can be used to work out where your site is falling short.
  • Are JavaScript and CSS minified and bundled? Most modern websites have a lot of JavaScript and CSS resources. Loading all of these, particularly if they are spread across multiple domains, can take a long time. Minifying and bundling them together can significantly reduce the time a user has to wait for the page to be fully functional.
  • Image optimisation: This often-overlooked aspect of website optimisation can offer seriously big savings in page load times. There are a few things we suggest you check here:
    • Is the format of my image appropriate? For instance, JPEGs are best for ‘real’ images containing a wide variety of similar colours, i.e. photographs. PNGs use lossless compression, support transparency, and are typically smaller for graphics with large areas of flat colour, such as logos and icons.
    • Is the (file) size of the image appropriate? Most image formats, including JPEG, have built-in compression that can be finely tuned and can reduce most images by about 60 to 70% without a noticeable loss in quality.
    • Are the dimensions of the image suitable? This is a common one we see many websites get wrong, with around 60% of the websites we encounter rendering ridiculously oversized images. Ideally, images should be as large as needed, but no larger than the size at which they will actually be displayed. If you upload an image several times bigger than the slideshow it will be used in, users will never see the full detail the image contains, yet your server still has to serve the entire (large) file. In practice this often applies to many images, leading to some serious savings in page load times if images are optimised site-wide.
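As a rough sketch of the resizing arithmetic: compute the dimensions an image should be scaled down to before upload. The Pillow usage in the comment at the end is an assumption about your toolchain; any image editor or online optimiser achieves the same result.

```python
# Work out the dimensions to resize an image to so it is no larger than the
# space it will actually occupy on the page.

def target_dimensions(width, height, max_width):
    """Scale (width, height) down to fit max_width, keeping the aspect ratio."""
    if width <= max_width:
        return width, height          # already small enough: leave untouched
    scale = max_width / width
    return max_width, round(height * scale)

print(target_dimensions(4000, 3000, 800))  # (800, 600)

# With Pillow installed (pip install Pillow), the resize itself would be:
#   from PIL import Image
#   img = Image.open("hero.jpg")
#   img.thumbnail(target_dimensions(*img.size, 800))
#   img.save("hero-optimised.jpg", quality=70, optimize=True)
```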

Site Monitoring

Just like launching a rocket: once launched, the biggest immediate risks are over, and now it’s a case of staying the course and ensuring nothing unexpected happens further down the line.

Site Visitor Monitoring: It is important to have some kind of visitor tracking in place once your new site is launched. We recommend Google Analytics, but there are other good free alternatives. This kind of tracking is essential for monitoring your SEO and marketing efforts, and it will also give you early warnings about some kinds of problems and usability issues visitors may encounter.

Site up-time monitoring: We would suggest that all websites have some kind of uptime and availability monitoring. Pingdom and Uptime Robot are two good services that will send alerts if your website appears to stop responding. When a site goes offline, Google and other search engines will gradually demote it in the rankings until it comes back online. Outages can also be tricky to spot yourself, so an early-warning alert system is a great way to stay informed of serious issues.
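The decision logic behind such alerts can be sketched as below. The URL handling uses only the Python standard library, and the three-failure threshold is an illustrative assumption rather than how Pingdom or Uptime Robot actually behave:

```python
# Uptime-check sketch: probe a URL, and only alert after several consecutive
# failures so a single network blip doesn't page anyone.
from urllib.request import urlopen
from urllib.error import URLError

def site_is_up(url, timeout=10):
    """True if the site answers with a non-error status."""
    try:
        with urlopen(url, timeout=timeout) as response:
            return response.status < 400
    except (URLError, OSError):       # HTTP errors, DNS failures, timeouts
        return False

def should_alert(recent_results, threshold=3):
    """Alert once the last `threshold` checks have all failed."""
    tail = recent_results[-threshold:]
    return len(tail) == threshold and not any(tail)

print(should_alert([True, False, False, False]))  # True
print(should_alert([False, True, False]))         # False
```

Run `site_is_up` on a schedule (cron, a small loop), append each result to a list, and send your email or SMS whenever `should_alert` flips to True.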

Split testing / conversion tracking: To help monitor your SEO and marketing efforts, it’s important to have some kind of performance reporting and monitoring in place. Along with Web Analytics such as Google Analytics (mentioned earlier), it’s possible to set up ‘goal tracking’ as part of conversion optimisation. By analysing which visitors completed an action (such as completing a purchase), you can obtain vital feedback about which areas of your SEO are working particularly well, which can then be useful when reviewing your SEO strategy.

Performance reporting: In order to measure the much longer term marketing strategies for your website, it’s important to include some kind of performance reporting. This can help guide some of your decisions on where to focus your online marketing efforts and tweaks next.

Backup and Recovery

Now the site is live, with all the tracking and monitoring it could need. What could possibly go wrong? Regardless, we recommend having some kind of backup strategy in place.

Plan your backups: How often will the website be backed up? If your website includes a content management system (CMS), it will likely need both files and SQL data backing up. How do you get access to these files? Do you have an offsite copy of your website?
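As a starting point, a file backup can be as simple as the following Python sketch. The mysqldump command in the comment is a placeholder you would adapt to your own CMS and database:

```python
# Minimal file-backup sketch using only the standard library: create a
# timestamped .tar.gz archive of the site's files.
import tarfile
import time
from pathlib import Path

def back_up(source_dir, backup_dir):
    """Create a timestamped .tar.gz of source_dir inside backup_dir."""
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = backup_dir / f"site-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source_dir, arcname="site")   # store everything under site/
    return archive

# For a CMS, dump the database first so it lands in the same archive, e.g.:
#   subprocess.run(["mysqldump", "mydb", "-r", "site/db.sql"], check=True)
# then copy the finished archive offsite (rsync, S3, etc.).
```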

We hope this guide provides a basic website launch plan template and offers a few helpful insights into the common tasks that need carrying out when launching a website. If you follow the points in this guide, you won’t go far wrong.

 

Why you should not choose cheap website hosting

It’s very easy to look at your website hosting cost as the single most important factor when going through a hosting comparison and deciding who to host your website with. After all, it’s only hosting, isn’t it?

From a business perspective, your hosting solution is an ongoing cost, so why would you not minimise this as much as possible? Quite a few reasons, actually!  Today we will be detailing why you should avoid cheap hosts and what you should look for when choosing a good host for your website.

Limited Resources

Most cheap hosts are cheap because they load their servers with many websites – in some cases hundreds, or in the worst cases thousands – in order to make their business model work. This will have several unintended consequences for your website:

  • Slower, less reliable service: A busy server will take longer to respond, meaning your pages and site will take longer to load. This in turn will hurt both rankings and visitor experience and as a knock on effect, it’s likely to cost sales in the long run.
  • Artificial limits: Cheaper hosts will often place an artificial ‘limit’ (usually on the amount of data, or data over time), and once this limit is exhausted they will either insist you pay more for the service, or simply suspend it so your website goes offline. This is the opposite of what you need: if your website suddenly becomes popular, you want to be able to ride that crest and enjoy the benefits that more traffic and increased enquiries bring, not suddenly shut up shop the moment it starts to become a success! Similarly, many cheap hosts impose limits such as capping the number of databases (and therefore the number of websites) you can host.
  • Hidden costs will actually make ‘cheap’ hosts more expensive in the long run, by charging you large amounts for services that typically cost the hosting company nothing significant. These include automated website backups, transferring your domain or website, adding a second database, escalating a support ticket, or additional bandwidth, to name just a selection. The trouble is that in each of these cases a quick resolution is needed to keep your business operating profitably. If your website is offline due to technical difficulties, every moment it is down it could be losing sales and eroding customer confidence that you are still trading.

Staff skills and experience

Cheaper hosts will typically have lower-skilled staff and longer lead times for resolving any technical issues that arise. It also means you are not given the long-term advice and support you need to head off issues before they develop. For instance, as security vulnerabilities are found over time in the software your website uses, you ideally want a host who can make you aware and, where needed, ensure you are protected going forward, e.g. by offering to upgrade your content management system (such as WordPress or Umbraco) to the latest version. Similarly, you want a host who can perform a quick security audit on the plugins, extensions or packages your website uses, to ensure that any that need it are patched as soon as possible.

Security

In addition to the lack of skills, most cheap hosts will not have set up their servers securely. For instance, they won’t necessarily ring-fence websites and technology with firewalls or have suitable DDoS (Distributed Denial of Service) protection, nor will they audit the plugins, extensions or packages your CMS website uses to ensure they do not contain any vulnerabilities.

In fact, most won’t want to get involved in any of this and will provide your hosting account on a “use at your own risk” basis. As you can imagine, the cost and disruption of even a single big incident will usually cost your business more than an entire year of hosting. If your website is hacked, you will need a developer to carefully review the code and clean up any data, and the cost of their time will be a lot more than the support the host would offer (and the work will likely need repeating the next time the website is attacked!)

Monitoring

Typically, cheap hosts don’t offer any kind of website monitoring. The only time you will even know there is an issue with your website will be when one of your customers lets you know, and this never reflects well on your business. Not only will website outages cost you rankings in Google (and therefore visitors and enquiries), they may also cost your business credibility, which is much more costly and takes far longer to put right.

Backups

Something we recommend all hosting has in place is some kind of automatic backup and recovery process. If the worst should happen to your website, you want to be back up and running as soon as possible! The cost of every extra day your website is offline will quickly dwarf any difference in cost for hosting that year.

The cost of putting it right!

Probably the biggest reason for not choosing a cheap host is that when any of these issues arise, they will be little help to you and it will fall to your web developer to put things right. Like ourselves, most web developers will be keen to help, but their time will be a lot more expensive, and they will likely need to revisit the work again because they are applying a ‘sticking plaster’ to a fundamental problem. The problem is never truly fixed and will likely need further repair later, leading to bigger ongoing costs in the long run!

It’s too expensive

I recently had a client on the phone who wanted to move their website hosting over to ourselves (a business website, not a personal site). When we got down to costs, I asked the simple question: ‘How much are you paying now?’

The following is a snapshot of the rest of the conversation.

Client: We are currently paying £70, that is for a year.

Me: Ahh ok, and how has the service level been with your current host?

Client: Well we’ve had email go down on us three times in the last six months and the site is slow compared to others.

Me: Well, our hosting is £45 per month for a WordPress site. It’s more expensive because we have very good reliability and support structures in place.

Client: My current site is being hosted by a friend of the old design company. He’s doing it as a favour. I can’t afford £45 per month.

Me: Let me ask you a quick question, how much do you charge an hour for your services?

Client: £40-55 per hour depending upon the service

Me: Do you realise that just one paying client for an hour’s service pays for your website hosting for the entire month?

Client: Oh yes, you are right!

Summary

In summary, can you really afford not to have a decent hosting provision in place? The example above is typical of what we experience daily with clients. Yes, I can put you in touch with £5-per-year hosting, but I would expect it to go down regularly, be slow, and come without any support.

Personally, if it’s a business site, I’d rather pay more and get a high-quality service, knowing my data is backed up, my site is protected by firewalls, and more.

 

Buy safe, buy once!

The Art of Writing Good Web Content

The web is awash with advice on writing good content. So much advice, in fact, that you may actually get slightly different advice depending on who you speak to! Like winning an election, writing good copy isn’t about pleasing everyone all of the time – it’s about giving one specific core demographic what they want whilst at the same time trying to keep the others as happy as possible. Today we will detail some of the more universal guidelines to writing good copy that will help get better engagement from your web visitors.

First: understand your visitors!

There are a couple of characteristics you can expect web readers to have which are very different to other readers:

  1. Most of the time, they will scan the text rather than read it properly! Keep things concise and to the point. Make sure key information is clearly visible where needed.
  2. Visitors will typically behave like a hunting animal; they have arrived at your site in search of something specific, i.e. a product, a service, an idea, etc. They will not necessarily read all the text you give them, but will keenly ‘follow the scent’ to track down whatever they came for. This is why it’s always a good idea to reinforce your internal linking between blog posts, especially if they digress into different topics and trails.

Identify your topic

This may sound obvious: be specific about what you are writing about. As mentioned, the visitor is like a stalking animal; if you try to be all things to all people, you will confuse them and throw them off the scent! When reviewing visitor stats, you may find some topics suddenly attract quite a bit of interest. If so, could this become a ‘multi-part blog’ that continues to build on the successful topics you’ve written about?

Do your research

Try searching for the topics you plan on writing about to see what others have written. The first few results in Google are those rated the highest (in terms of social sharing, backlinks and so on). Read these and see if you can identify what made them such a great answer to your search query. Are there any points you could build on further? Are there any additional angles you might write about?

Knowledge is universally appreciated, so be sure to research the topic you are writing about. By bringing in references, facts and figures, whether the reader agrees with your points or not, your facts will stand in their own right, and you may even find them attracting links from others making points that rely on the same information.

Write original content

Do you offer something unique in information or style? In some cases it may simply be presenting a well understood topic in a new light or with a unique spin. This is especially important since the Panda Google search update, designed to weed out ‘thin’ content and content farms. Be sure all your writing is your own and written in your own style!

Create a strong headline

A strong headline will help your content stand out. In almost all cases, the headline of your blog is the first thing people will see (whether in the search results or via a link from another website). What you write here will affect both your click-through rate (CTR) and your bounce rate. You can bring both SEO and creative writing skills to bear here. Many find it best to jot down a handful of ideas and mark off those that are particularly good in order to refine them further. So what makes a good headline? We recommend considering the ‘four U’s’ of headline writing:

  • Make it Unique – your headline should stand out in some way from all the others
  • Make it Ultra-specific – remembering the analogy of the hunting animal, your headline should be specific enough that the visitor can decide in an instant whether this is something they’re interested in.
  • Make it Useful – the headline should give the reader a clue about what the page includes and let them know the benefits of reading.
  • Make it Urgent  – your headline should put pressure on the reader to not risk missing out by not reading the post.

Make content actionable

Do you give visitors useful information they can apply right away? If the reader feels you are giving them valuable information throughout your blog which they can use, they are much more likely to continue reading.

The challenge is usually that, at the time of writing, you never truly know how your visitors will react. Following these guidelines will help ensure you always write reasonably engaging copy and never go far wrong. But as with every creative industry, your mileage will vary: be sure to try new angles and explore the possibilities.

Panda and Google – panda proof quality content

Quality content has long been a major focus of Google and of good SEO practice. Google’s Panda algorithm (named after Google engineer Navneet Panda) is a search filter, originally introduced in 2011, designed specifically to combat spam in the form of poor-quality content. Google stated the change would impact 11.8% of its search results in the U.S., which at the time was a far bigger impact than most of its other algorithm changes to date.

So exactly what poor-quality content was being targeted? In a press release around this time, Google explained:

we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites.

As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception. However, we can and should do better.

The key phrases here are “sites that copy others’ content and sites with low levels of original content” and “content farms, which are sites with shallow or low-quality content”. This set the SEO community in a bit of a spin: the scale and scope of the update seemed huge. At one end of the spectrum, Google is understandably targeting spammy content farms, but all websites lie somewhere along this continuum. Whether you realise it or not, most websites will have some kind of duplicate content, whether that is a news website quoting politicians and officials, or a descriptive block of text within a technical specification that is common between different models of a product. Duplicated content can often be quite difficult to avoid while keeping your information accurate.

In this guide, we will detail the top things you need to check and how you can ensure your content is Panda-proof for the future.

Identify and fix ‘thin’ content pages

What we really mean here is: pages that don’t have enough information to stand in their own right, while also including something that will negatively impact SEO. There are many ways users create pages like these (often by accident), so let’s first look at their characteristics. These could be:

  • Pages with little or no text but lots of links, e.g. if you use WordPress with categories and tags, each one you create also creates a new page, usually at mywebsite.com/tag/tagname or mywebsite.com/category/category-name
  • Pages with no unique text (i.e. all the text on the page can be found either elsewhere on the same website, or elsewhere on the web). This can happen on listing pages, where the same titles and descriptions are also used elsewhere, such as on blog or news lists.
  • A detail page that doesn’t have enough information to justify a whole page, e.g. if you run an ecommerce shop and have only a single-sentence description for a product, used when listing that product. When a user clicks through to the product itself, they would hope for a more detailed description, but instead find the same sentence repeated on this page.

By the same token, it’s possible to create (for example) a blog post of only 100 words which would be considered too thin in most cases.
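To make the idea concrete, here is a rough ‘thin page’ heuristic in Python. The thresholds are illustrative assumptions, not figures Google publishes:

```python
# Flag pages that combine very little text with a high proportion of links,
# the two thin-content symptoms described above.

def looks_thin(word_count, link_count, min_words=300, max_links_per_100_words=10):
    """True if a page is suspiciously short and/or link-heavy."""
    if word_count < min_words:
        return True                                   # too little text
    links_per_100 = link_count / (word_count / 100)
    return links_per_100 > max_links_per_100_words    # mostly links

print(looks_thin(word_count=100, link_count=5))    # True  (too short)
print(looks_thin(word_count=800, link_count=12))   # False
print(looks_thin(word_count=400, link_count=90))   # True  (link-heavy)
```

Running something like this over every URL in your sitemap gives you a shortlist of pages worth reviewing by hand.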

Check for duplicated content

Ideally, all your pages should have entirely unique content, and in most cases there is really no excuse for plagiarising others’ copy (even if it is one of your suppliers’ descriptions of their own products or services). There are some good tools out there, such as Copyscape, to ensure the copy on your website is unique.

However, there may be some scenarios where you have to use content that can also be found elsewhere on the web. In these cases, be sure to include enough of your own content to make it work. If you are quoting someone (and would like to use a specific paragraph verbatim), add your own analysis and breakdown of what it means, with plenty of unique content surrounding it. If the descriptions of products are very similar, can you add anything further, such as your own recommendations or experiences (for instance, in which scenario one variant of the product is better than another)?
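For the technically minded, duplicate detection at its simplest compares overlapping word ‘shingles’ between two texts; services such as Copyscape do something like this at web scale. A small Python sketch, where the 0.5-style threshold you might apply is an illustrative assumption:

```python
# Jaccard similarity over word shingles: a basic duplicate-content measure.

def shingles(text, size=3):
    """Break text into overlapping runs of `size` words."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(text_a, text_b, size=3):
    """Fraction of shingles the two texts share (0.0 to 1.0)."""
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "our widgets are hand built in sheffield from recycled steel"
copied   = "our widgets are hand built in sheffield from finest steel"
print(round(similarity(original, copied), 2))  # 0.6
```

A score near 1.0 means the copy is essentially duplicated; near 0.0 means it is unique.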

Ensure spelling and grammar are optimal

Grammar, spelling and punctuation are among the clearest signals of content quality. Ensuring that copy is well written, makes sense and flows with appropriate punctuation will help Google see it as high quality.

Where possible, avoid long lists of all kinds. Google will penalise keyword lists or long comma-separated (or bullet-point) lists, and will always favour copy that flows well while still including the keywords you want to be listed for!

Check content keyword density and optimisation

Following on from the above point, you should also be checking that you have mentioned all the important keywords you want to be visible for. It is worth spending some time reviewing copy with a view to tweaking it and allowing the inclusion of more keywords, where doing this doesn’t significantly negatively impact the readability and grammar (above).
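A quick way to review this is to measure how much of your copy a given keyword accounts for. The sketch below is a review aid only; there is no official ‘correct’ density figure, so treat the percentage as a guide rather than a target:

```python
# Keyword density: percentage of the words in a piece of copy that are
# accounted for by occurrences of a (possibly multi-word) keyword.
import re

def keyword_density(copy, keyword):
    words = re.findall(r"[a-z0-9']+", copy.lower())
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # Count every position where the keyword phrase appears in full.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    return 100 * hits * n / len(words) if words else 0.0

copy = "We build oak garden furniture. Our garden furniture is handmade."
print(round(keyword_density(copy, "garden furniture"), 1))  # 40.0
```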

Check for user spam and poor quality content

This can be trickier still to manage! If you have a blog, forum or any kind of system that includes open comments, be sure to check they don’t get abused! Spambots will often look for comment sections where they can post their own spam, with links back to whichever paymaster sent them! Generally, if you aren’t interested in open discussion with readers, we recommend turning comments off, simply due to the level of user-generated spam and the time and effort needed to manage it properly. There are some great services, such as Akismet, that specialise in blocking exactly this kind of spam.

As with all things SEO, moderation is the key! If in doubt, simply ask yourself if this type of page is “helpful to at least one type of person”. If so, then it’s likely to be fine. If however, you find a page in your site with a long list of keywords but no information, this is the time to take action!

In reality, most pages will likely lie somewhere within this spectrum and in all cases, the quality of the content can be improved.

Page Load Speed

Also referred to as ‘page load time’, page load speed measures the time between the moment a user requests a web page and the moment the page has loaded in the user’s browser. The quicker the website, the better the user experience, and the more likely you are to rank higher as a result.

In April 2010, Google announced that page load speed would be used as a ranking signal, putting more pressure on website owners and developers to deliver a faster web experience. So what does this involve? What can you do to improve the speed of your website pages if you are concerned they aren’t running as quickly as they should?

Fortunately, there are lots of useful tools to help analyse and test the speed of pages, while also offering insight into how improvements might be made. Google’s PageSpeed tool and YSlow are two good free services. Most web browsers come with a developer panel that includes a ‘network’ tab; here the browser details, in sequence, what loaded and how long it took. This is another way to identify which parts of the page load quickly, which load slowly, and where improvements can be made.

There are really two sides to consider when looking to improve page load speed:

Server Side Solutions

To deliver the requested web page, the server must first put it together! If you run a content management system (such as WordPress or Umbraco), this involves reading the page’s data (and content) from a database or some kind of cache and inserting it into a template, which is then served as the final page. This means that page load speed in this first instance will depend on the quality of the website’s code, how streamlined and well optimised it is, and how easily it can obtain the information it needs for the page.

Some pages may require so much data that they will always be slow due to the amount of work involved. Consider a news website where users post news articles and the homepage lists the latest 10 publications. To update this list, the website needs to (1) check every news item ever posted, (2) order these by date published, and (3) take the top 10 of this list to display.

It isn’t always feasible to limit the scope of this query, since it needs to consider everything; in these scenarios it is best to employ some kind of server-side caching, which simply means keeping a copy of the result from the last time the query was run.
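A minimal sketch of that idea (illustrative names and sample data, not any particular framework): keep the last result, and only re-run the expensive query once the kept copy has gone stale.

```python
import time

# Stand-in for the full news-item table (invented sample data).
ARTICLES = [{"id": i, "published": i} for i in range(25)]

def query_latest_ten():
    """The expensive query: scan every item, sort by date, take the top 10."""
    return sorted(ARTICLES, key=lambda a: a["published"], reverse=True)[:10]

_cache = {"result": None, "expires": 0.0}
CACHE_TTL = 60.0  # keep the copy for 60 seconds before re-running the query

def latest_ten_cached():
    """Return the kept copy if it is still fresh; otherwise recompute it."""
    now = time.monotonic()
    if _cache["result"] is None or now >= _cache["expires"]:
        _cache["result"] = query_latest_ten()
        _cache["expires"] = now + CACHE_TTL
    return _cache["result"]
```

Within the 60-second window, repeat calls return the stored list without rescanning every article – the trade-off being that a brand-new article may take up to a minute to appear.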

The other way server-side speeds can be improved is by simply adding more and better resources. A server with a faster processor and more memory will usually deliver pages a lot faster. Similarly, shared hosting accounts (a single server with many websites hosted on it, so its availability is determined by how heavily other people’s websites are used too) tend to be slower than your own virtual private server or dedicated server – although having your own server adds considerably to the cost.

Load balancing involves having redundant resources so that the workload can be spread across all of them, rather than concentrated on just one. For instance, it is common for big websites to have load-balanced databases: at least two databases, with the website deciding, based on the level of usage at that moment, which one to use for the current request.
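The decision logic can be as simple as “send the next request to the least busy copy”. A toy sketch of that, with invented names – real load balancers usually sit in front of the application and track much more than a single counter:

```python
# Two stand-in database connections, each tracking how many
# queries it is currently handling.
databases = [
    {"name": "db-primary", "active_queries": 0},
    {"name": "db-replica", "active_queries": 0},
]

def pick_database():
    """Route the next request to whichever copy is least busy right now."""
    return min(databases, key=lambda db: db["active_queries"])

# With equal load the first copy wins the tie; once db-primary is
# busy, new requests are routed to db-replica instead.
databases[0]["active_queries"] = 3
print(pick_database()["name"])
```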

Client Side Considerations

Once the server has assembled and sent the finished web page, your browser has several further tasks to perform. These include:

  • Loading all JavaScript, CSS and other assets referenced by the page
  • Fetching each image on the page
  • Rendering and displaying the page
  • Initialising and running the scripts contained within the page

To make things even more complicated, there is some interdependency between these tasks, but not necessarily a sequential order developers can rely on. Whilst the loading of images may begin immediately, the page may start displaying before they have finished loading – with images appearing a split second or so later. The browser makes a lot of on-the-fly decisions very quickly when managing this process, but there are several areas where you can help improve page load speed:

    • Encourage heavier browser caching of images – since image URLs are unlikely to change, heavy client-side caching is often recommended. This is generally achieved through HTTP headers: on Apache it can be configured via the .htaccess file, and on Microsoft IIS through the web.config file. By setting the maximum cache age for images to something high (such as several days) you can seriously reduce the work the browser needs to do between pages.
    • Bundling and minifying JavaScript and CSS – while this is technically a server-side feature, the browser reaps the benefit. Imagine loading a web page with 20 separate JavaScript files: each needs its own web request, on top of the first request for the page itself. That is a serious amount of legwork for the browser; by bundling all CSS into one file and all JavaScript into another, you reduce this to just 2 additional requests besides the page itself. Minifying – removing spaces and any unnecessary text – further reduces the amount of data sent across the web. However, some code cannot be minified safely, so ensure you back up and test fully before going ahead – nothing worse than finding your site broken with no backup to revert to!
    • Optimising images – reducing images to the maximum size actually used on your website, and turning up compression, can significantly reduce file sizes and therefore the amount of work (and time) needed to serve them.
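Whatever server you run, the caching advice above comes down to the same HTTP response header. A sketch of the decision, hedged: the extensions and durations below are typical illustrative choices, not requirements of any particular server.

```python
# Map file extensions to how long (in seconds) browsers may cache them.
CACHE_RULES = {
    ".png": 7 * 24 * 3600,   # images: a week
    ".jpg": 7 * 24 * 3600,
    ".css": 24 * 3600,       # stylesheets: a day
    ".js": 24 * 3600,
    ".html": 0,              # pages themselves: always revalidate
}

def cache_headers(path: str) -> dict:
    """Pick a Cache-Control header based on the requested file's extension."""
    ext = path[path.rfind("."):].lower() if "." in path else ""
    max_age = CACHE_RULES.get(ext, 0)
    if max_age > 0:
        return {"Cache-Control": f"public, max-age={max_age}"}
    return {"Cache-Control": "no-cache"}

print(cache_headers("/images/logo.png"))  # images get a week-long cache
```

With a rule like this in place, a returning visitor’s browser can skip re-fetching every image between pages, which is exactly the saving the bullet above describes.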
Each website’s requirements are unique: some websites include many images (making it most important that they are well optimised), while others have fewer images and more scripts, so it is important to keep an open mind when profiling a website and checking for bottlenecks in the page load process.
