My analytical brain is always looking at ways I can improve the conversions on my websites. From page titles and the graphics used to entice people to hand over their email addresses, to the “Buy Now” buttons on sales pages, I know that small changes can have a big impact. There’s one thing, though, that in all honesty I’ve overlooked in my last few years online, and only now am I doing something about it.
It’s something that has dramatically increased revenues for online retailers like Amazon, Walmart and Shopzilla, and even greatly affected how many people use Google Maps. Fortunately it’s not something that only those with multi-million dollar budgets can implement, but a tactic that every single person reading this can benefit from: speeding up your website.
Before you dismiss the idea as boring or potentially irrelevant to your own marketing efforts, here are some quotes that might just convince you how important your website speed can be to your online success:
- Sales at Amazon increased by 1% for every 100 milliseconds it shaved off download times, according to a report in the Wall Street Journal. For a company pulling in hundreds of millions of dollars every quarter, that’s a huge 1%.
- When Google tested showing 30 search results per page instead of 10 – changing their average load time from 0.4 seconds to 0.9 seconds – traffic and revenue decreased by 20%.
- A Compuware study found that if a page load takes more than two seconds, 40% of people are likely to abandon the page. “For a $100,000 per day ecommerce site, a one-second delay means $2.5 million in lost revenues in a year.”
- When eCommerce site Shopzilla improved their average page load by four seconds, their pageviews went up by 25% and their conversion rate went up around 7-12%.
- When Google reduced the average size of their Google Maps page from 100kb to 70-80kb, traffic to the service went up 10% in its first week and 25% in the following three weeks.
And if that’s not enough to convince you, the 2010 announcement by Google that website speed is now a factor in search rankings – even if it’s a small factor – should give you even more reason to prioritise action in this area.
On that note, I found it interesting that Google’s Eric Schmidt once stressed the importance of Google’s speed in regards to real-life issues. He commented on how returning query results just a fraction of a second faster can save lives when people are searching for things like “signs of a stroke” or “signs of a heart attack”.
Why I Knew I Had to Leave Shared Hosting
For the majority of webmasters online, shared hosting is more than enough to handle the traffic levels they receive, and the page load speeds they can achieve are generally good enough. One of the main reasons I’ve kept ViperChill on shared hosting with Hostgator (non-aff) for all of this time is because I recommend them. I’ve happily used their services for years now, and despite the odd glitch in that time, I’ve had nothing but good things to say about them and their exceptionally low price point.
Since ViperChill handles far more traffic these days than it used to, I have wanted to upgrade my server options for quite a long time, but with many things on my to-do list, it’s just something I’ve never really gotten around to. However, this year I’ve increasingly had problems with my site going down after I send an email to my list. For the first 10 minutes after unveiling the Blog Tyrant, for example, the whole of ViperChill was inaccessible, and it’s my experience that if a site is down for someone once, they aren’t going to make much effort to come back to the email (and the link) later in the day.
Add to that the fact that Hostgator disabled my account, and later disabled my entire database for overusing resources, and I knew it was time to make a Twitter plea for dedicated server recommendations. The rest of this post is a guide on exactly what I’ve done to increase the load times of this website, with a step-by-step guide on how you can massively speed up your own website as well.
Choosing a Dedicated Host
Though I’m sure for the traffic I receive that the differences between a VPS (Virtual Private Server) and Dedicated server are fairly minimal, I decided to go with the latter. I’ll be moving OptinSkin and a few other sites over to my new server in the next few days, so I want to have a good solution in place for when things go a little viral.
Thanks to some savvy ViperChill followers on Twitter, I quickly received a number of recommendations. They typically recommended popular services like Rackspace, Softlayer and LiquidWeb. LiquidWeb (non-aff) had been recommended to me the most, and an email from Ramsay recommending them pushed me over the edge and helped me choose them in a sea of options. I had no prior experience with LiquidWeb, but their prices seemed fair and they have live chat which is a requirement of mine, so I took the plunge.
My first question was which dedicated server to pick, since they have two different Linux configurations at the same price point. I quickly hopped on live chat but didn’t really get the answer I was looking for.
I decided to go with the following configuration:
- Intel E3-1240 Quad Core
- RAM: 2GB DDR SDRAM
- HD1: 500 GB SATA Drive (7,200 rpm)
- HD2: 500 GB SATA Backup Drive
- OS: CentOS 6 – 64-bit
- Control Panel: WHM/cPanel (Fully Managed)
I’m really not proficient in server management at all, so I opted to pay a little extra each month to get full WHM/cPanel access, which I’m already familiar with.
It took just 48 hours to get everything going, and I received this email shortly after.
“I had to upgrade your processor, because we are temporarily out of the one you ordered, there will not be any extra charge for this. I will update you periodically throughout this process to keep you informed of your order status. If you have any questions, concerns, or the stated order is incorrect please feel free to respond to this ticket and I will address it for you.”
This came after a personal welcome phone call from them. So far, so good.
The actual data center where ViperChill is located. LiquidWeb has three, all fully owned by the company
The positive first impression did diminish a little after that. I woke up to find my inbox flooded with literally hundreds of emails from my server, telling me about various errors. Support claimed to have fixed the problem, but a few hours later I would once again receive hundreds of emails about different errors. This was obviously frustrating, especially after having paid so much money to the company.
Fortunately the next day my problems were solved, and they even took $50 off my first bill as an apology for the issues. Any issues I’ve had have always been responded to within 15-20 minutes. It doesn’t matter what time of day it is, I’ve never had to wait more than 30 minutes, which is really impressive to me, and makes me think I’ve made the right decision.
After moving my site across and following a number of different strategies to speed up the website, I’ve essentially doubled or tripled (depending on which service you look at) the speed at which ViperChill loads. Every tool I’ve used ranks the site faster than any of the other big marketing blogs I know you all read (in some cases twice or three times as fast) so I’m really happy with the results. There are actually still some more things I can do which I’ll be implementing in the following days, but there’s more than enough information in this guide to help you get similar results.
To give you some actual data, here is how two tools viewed the site on two separate runs, before and after the server move:
| Speed Checker | Before: Test #1 | Before: Test #2 | After: Test #1 | After: Test #2 |
| --- | --- | --- | --- | --- |
You can see from the screenshot below that I managed to get page load speeds to around the 1-second mark.
While I won’t claim to be an expert on this topic, I’ve researched it enough to have a decent understanding of what I’m talking about, and my own speed improvement results really speak for themselves.
Use a CDN
There are a number of CDN (Content Delivery Network) solutions online, and the uptake in bloggers using them appears to be growing. A CDN is basically a collection of servers deployed in different areas around the world, with the idea being that you will download content from a server which is geographically closest to you, decreasing load times.
Though the difference is quite minimal when you look at, say, downloading a single 40KB image, the savings add up quickly the more files you are loading from one of these CDN set-ups.
I know my friend Yoast is a fan of MaxCDN, but my personal CDN of choice is the popular Amazon CloudFront. Amazon has 16 of these ‘edge’ locations set up in America alone, so someone in New York will be downloading ViperChill data from a different server than someone in California or Texas.
Other server locations include:
- Amsterdam (2)
- Frankfurt (2)
- London (2)
- Paris (2)
- Hong Kong
- Singapore (2)
- Sao Paulo
You can see how thorough they are in getting their deployment out around the world. Even more impressive is that 75% of these weren’t even available when I signed up last year, so they’re actively growing the service.
Using a CDN is incredibly scalable since you’re essentially paying simply for the bandwidth you use. If you’re wondering about the cost, then I haven’t had a single month where my bill has been over $2.
I expect this to increase quite a bit now that I’m going to be putting substantially more data on their network, but the average blogger will find their prices more than reasonable.
Of course, if you’re using their services to help you with crawling the entire internet, then your bill is going to be substantially larger. Rand Fishkin had this to say about his costs over at SEOmoz: “We’ll probably stick to a hybrid cloud model for some time. We don’t quite have the need for an entirely in-house solution like Google or Bing yet. In terms of costs, April was, I believe, close to $650K with Amazon.”
For a guide on how to set-up your account, I recommend this in-depth post over at Hongkiat.
You may notice that I’ve re-routed the typical Amazon URLs to ‘load’ on ViperChill, using the CNAME turbo (you can call this anything you like). So instead of a typical file URL being:
It now routes to:
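Behind the scenes, the DNS side of that rerouting is just a CNAME record pointing a subdomain at your CloudFront distribution. Here’s a sketch of the zone file entry – the domain and distribution hostname below are hypothetical examples, not my real values:

```
; Hypothetical zone file entry - replace the distribution hostname
; with the one CloudFront assigns to your account
turbo.example.com.  3600  IN  CNAME  d1234example.cloudfront.net.
```

Once that record is in place, files served from the distribution can be referenced under your own domain instead of the default cloudfront.net address.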
Though there’s a free plugin for Firefox called S3Fox which allows you to manage your Amazon S3 ‘buckets’, I’ve had problems using it when trying to set file permissions, so I instead purchased 3Hub (Mac only) from the App Store for $2.99.
Install a Caching Plugin
One of the things most likely to have the biggest impact on your site speed efforts is to utilise a caching plugin. The two most popular ones for WordPress are W3 Total Cache and WP Super Cache. Caching plugins essentially store your pages and posts as HTML on your server, and load these when someone accesses your site.
This is considerably faster than dynamically loading your content from a database and utilising the various PHP files that are necessary to get your pages to show. With Hostgator I used W3 Total Cache, and found that it had a great effect on the performance of my site.
However, at one time when I was using a large number of resources, a member of their technical team – and I’m not exactly sure why – recommended that I use WP Super Cache instead. I didn’t at the time, but I do now on my dedicated server. Hostgator have a great guide on how users of their service should configure WP Super Cache, found over here.
I don’t see why these settings wouldn’t apply to pretty much any shared host. This tweak alone can, in many cases, double the speed of a website.
My personal settings that are turned on in WP Super Cache are:
- Cache hits to this website for quick access
- Use mod_rewrite to serve cached files
- Compress pages so they’re served more quickly to visitors
- Cache rebuild. Serve a supercache file to anonymous users
- Mobile device support
If you use the mod_rewrite option to serve your cached files, don’t forget to scroll down the page to update your .htaccess mod_rewrite rules. If you’ve found other configurations of WP Super Cache to be more beneficial, I’d love to hear them in the comments.
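For the curious, the rules the plugin writes to .htaccess look roughly like this. This is a simplified sketch of the approach rather than the plugin’s exact output (the real conditions vary by version and settings, so let the plugin generate them for you):

```apache
# Simplified illustration of WP Super Cache's mod_rewrite approach:
# serve the pre-built HTML file directly when one exists for the request
RewriteEngine On
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_COOKIE} !wordpress_logged_in [NC]
RewriteCond %{DOCUMENT_ROOT}/wp-content/cache/supercache/%{HTTP_HOST}%{REQUEST_URI}index.html -f
RewriteRule ^(.*)$ /wp-content/cache/supercache/%{HTTP_HOST}%{REQUEST_URI}index.html [L]
```

The key idea: Apache checks for a static cached file first, and only falls through to PHP when none exists.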
Run Screaming Frog on Your Website
I’ve wanted to try out this tool for a really long time, but always ran into errors when installing it on my Mac (I never heard of problems on Windows). Fortunately, the latest version of Screaming Frog runs with no issues for me, and it’s a great little tool for learning more about your website.
Their “SEO Spider” crawls your website and gives you information on a large number of page criteria. Though it’s aimed primarily at people doing SEO and looking for things like duplicate description tags, no-followed links and so on, it definitely has usability benefits.
One of the main things I looked at when running the tool was broken links. You shouldn’t only care about how quickly a page loads, but also whether you’re actually sending people to pages that exist. Slow loading times and poor navigation can both have the negative effect of people leaving your website.
The free version only allows you to crawl 500 URLs – and the tool gets a little bit trapped on comment links – but if you run it on individual pages independently then you should get a good view of any errors.
Allow Gravatars? You Need This
Another caching plugin that I recommend you install is the FV Gravatar cache plugin, found here. I didn’t just want to increase the speed of my homepage of course, but individual blog posts as well. Since my posts tend to get hundreds of comments, one of the largest bottlenecks I found was that of Gravatar images loading.
Gravatars, for those of you who don’t know, are the little pictures that show next to a user’s comments. For blog posts with just a few comments the additional load isn’t too bad, but when you get a lot like me, showing avatars next to comments dramatically slows the loading time of a page.
This cache plugin essentially ‘saves’ a user’s avatar on your server, in cache, so that you aren’t constantly requesting the images from the Gravatar servers when comment pages load. The plugin has currently cached 4,449 separate images for my own comments.
GZIP Your Files
You are also given the option to download a GZIP’d version of the file, which most modern browsers can handle. You can see this takes my CSS file down to 4KB: an 83% reduction.
I did say in the introduction to the post that there are more things I could be doing with my own optimisation efforts, and this is one of them. I have compressed most of my files, but I have had some issues when using GZIP with files on Amazon S3, and will be looking into this more in the next few days.
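If you want to see the kind of savings GZIP offers before touching your server config, you can test it from the command line. The file here is a generated example, not one of my actual stylesheets:

```shell
# Build a sample stylesheet with lots of repeated rules (CSS compresses well)
printf 'body { margin: 0; padding: 0; color: #333; }\n%.0s' $(seq 1 200) > style.css

# Compress at the maximum level, keeping the original for comparison
gzip -9 -c style.css > style.css.gz

# Compare the two sizes in bytes
wc -c style.css style.css.gz
```

On repetitive text like CSS, the compressed copy typically comes out at a small fraction of the original size, in line with the sort of reduction mentioned above.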
Set Image Dimensions
Though this tactic adds slightly to your file sizes – a few extra bytes of markup – I still recommend it. Setting image dimensions basically means you tell the browser how big an image is, instead of just pasting in code like:
<img src="imagelink.jpg">
Make sure you add in the specific dimensions, like:
<img src="imagelink.jpg" width="400" height="48">
WordPress will do this for you automatically when you use its publishing tools, but if you’ve added graphics to your theme manually – as I had – then some tweaks may be necessary.
The reason this is beneficial is because it allows a browser to ‘map out’ the dimensions of a page and where graphics fit, without having to load the page and reshuffle it later. This gives the impression that a page is being rendered much faster.
Don’t Resize Graphics via CSS
A small point, but a relevant one: make sure your images are the actual size you display them at on your website, especially if you’re making them smaller. I was guilty of this myself – I had a few 64 x 64px images but just reduced their size in HTML rather than editing the actual files.
Resizing the file itself will reduce its size, and thus speed up your website. So, open up Photoshop or whatever you have on your computer, and get things to the size they should be.
Compress the Size of your Images
Speaking of image sizes, large graphics can be one of the biggest detriments to the performance of your website. If you’re a Photoshop user, one of the options I recommend is the ‘Save for Web’ feature when saving a file. This essentially removes the image’s EXIF data (information about the actual photo, like where it was taken, and with what camera) to reduce the overall file size.
If you have a lot of graphics, this can have a dramatic effect. Another option would be to upload your images to a site like IMGUR, which does this automatically, and then download them again before putting them on your server.
Here’s what the Google Page Speed website has to say about images:
“The type of an image can have a drastic impact on the file size. Use these guidelines:
- PNGs are almost always superior to GIFs and are usually the best choice. IE 4.0b1+, Mac IE 5.0+, Opera 3.51+ and Netscape 4.04+ as well as all versions of Safari and Firefox fully support PNG, including transparency.
- Use GIFs for very small or simple graphics (e.g. less than 10×10 pixels, or a color palette of less than 3 colors) and for images which contain animation. If you think an image might compress better as a GIF, try it as a PNG and a GIF and pick the smaller.
- Use JPGs for all photographic-style images.
- Do not use BMPs or TIFFs.”
One final option is to install the WP Smush.it plugin, which is recommended by Yahoo!. It runs all the images you upload to WordPress through various stages of lossless compression (meaning you don’t lose image quality) to reduce the size of your files.
Utilise These Three Tools to Identify Problem Areas
I used a number of tools while optimising my site in order to test my speed increases as I went along. The one I checked the most was Pingdom, since I like the interface and it keeps a history of the changes on your site.
There is also a page grading option which gives you advice on things to change based on your specific situation. For example, when checking if I had properly GZIP’d a file I could look at Pingdom to see if it was recognising the changes.
Me testing various changes in relation to page speed, with Pingdom
Two other tools that allow you to do something similar are Google’s PageSpeed tool and GTmetrix. Both look at your particular situation and give you feedback on the things they want you to change. One of those recommendations is to…
Remove Unused CSS
If you’re anything like me, you’ve probably hacked away a lot at your WordPress theme in order to change its design to be something less ‘out of the box’. The theme I’m using originally came with a number of sliders and optional navigation menus, which I’ve never used. However, though I removed the code for them in the template files, I didn’t remove any references to those functions in the CSS.
Though the speed impact of a few KB is often minimal, if you really want to go all the way in your optimisation process, then this is another thing you can look at implementing.
Don’t Become Speed Obsessed
Though I think it’s important to pay attention to the speed of your website, it’s also important not to go overboard with your changes. Sometimes there are things you need, for usability or otherwise, that are going to slow down your site. I see speed optimisation as something you can do over a weekend for your WordPress blog, rather than a fortnight.
For example, one of the biggest drains on page speed for ViperChill is the Facebook like box in my sidebar. It is almost 50% of the filesize of my homepage. I could remove it, and speed up the site even further, but it’s something that I think is beneficial to growing my Facebook audience, so it’s more important to have it than not.
There are alternatives, as it’s clear after a quick Google search that a number of bloggers have had this issue, but they require the site visitor to perform two clicks to become a Facebook fan. In this case I would rather have a slightly slower site than the usability issues which come with extra clicks.
Do the necessary changes that are going to have a big impact, but don’t sweat the small stuff too much.
Advanced Speed Optimisation Guide
One of the greatest things about being involved in marketing for so long is that I have a large network of contacts who are far more knowledgeable about different subjects than me. One of the people I’ve worked with is David Steven-Jennings, a server admin who’s currently working in England, helping to keep a certain sporting website online which is very popular at the moment.
I reached out to him to see if he could give me some advice on the server side of things, for people who are more technically inclined and are looking for more advanced information on how to boost their website speeds.
Here’s what he had to say:
“Optimising a website at the server level can be a rather technical topic to discuss as there are so many things to consider. After giving it much thought, I decided on simply talking about 3 straightforward things you (or your server admin) can do to speed up your site. The cool part is that they don’t require lots of changes and will help any site.
Install a PHP accelerator
While PHP is a pretty fast scripting language, it suffers from the same problem as all scripting languages — each time a file is requested it has to be read, analyzed, compiled and then executed. This becomes a major issue when there is lots of traffic coming in. To solve this, install a PHP accelerator such as APC, eAccelerator or XCache. My personal preference is XCache as it’s still actively maintained. For those with access to the root WHM/cPanel account of their server, I believe that you can compile eAccelerator or XCache into PHP via EasyApache. For the rest of you, you’ll either need to get it from your distribution’s repositories or compile it yourself (both of which are out of the scope of this guide).
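As a rough idea of what enabling one of these looks like, here is an XCache configuration sketch. The directive names follow XCache’s documented settings, but the values are illustrative and paths and defaults vary by version, so check the documentation for the release you install:

```ini
; Loaded from php.ini (or a conf.d snippet) once the extension is built
extension = xcache.so

[xcache]
xcache.size  = 64M   ; shared memory reserved for the compiled-opcode cache
xcache.count = 4     ; number of cache segments, often matched to CPU cores
```

Restart Apache (or PHP-FPM) after the change, then check phpinfo() output to confirm the cache is active.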
As mentioned above, PHP has to do quite a bit before it can generate any output. PHP accelerators work by caching the compiled version in memory for later use. This allows for some good speed gains. I’ve seen XCache speed up execution times by as much as 5 or 6 times. In addition, by removing the need to compile, each PHP request will use less memory (sometimes as much as 50% less), which is always a bonus – the less memory used, the more requests that can be handled.
Mount caching directories into memory
I’m surprised this one doesn’t come up more often. While creating static cache files really lowers CPU usage, hard drive speeds are still slow. Should the server get hit with a lot of traffic, the site will begin to slow down, as Apache has to wait for the drives to find and return the file contents. Imagine a person having to always sift through a stack of papers to find something versus someone always remembering it. Everyone will love the dude with the good memory, no?
Using Linux’s tmpfs feature, directories can be mounted into memory in temporary filesystems. This removes disk speeds as a bottleneck and allows the contents to be accessed really, really quickly. How quickly? Well, as it often takes longer for the browser to download the file than for the server to find it in memory, it’s sometimes only noticeable when the site’s under load. For example, for a large WordPress site using WP Super Cache I managed to drop request times from averaging at 2 seconds per request at 180 reqs/second to about 0.9 secs just by mounting wp-content/cache. When the traffic levels idled and we continued testing, we were getting times of around 0.2 seconds a transaction but the customer couldn’t beat 0.6 as their interweb tubes weren’t as large as ours.
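A minimal sketch of how that tmpfs mount is set up follows. The cache path here assumes WP Super Cache under a typical document root, and the 256 MB cap is an illustrative figure – adjust both to your own setup:

```
# /etc/fstab entry: keep the WP Super Cache directory in RAM, capped at 256 MB
tmpfs  /var/www/html/wp-content/cache  tmpfs  defaults,size=256m,mode=0755  0  0
```

After adding the line, running `mount -a` as root activates it without a reboot. One caveat: tmpfs contents vanish on restart, which is fine for a cache that WordPress simply rebuilds.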
Turn on database query caching
Holy cow, some CMSes love to make MySQL queries. WordPress is a culprit of this — after you add a lot of the more popular plugins you’ve got a site that likes to bully MySQL servers. As the majority of queries involve selecting stuff, database self-defence 101 starts with caching (I’m sure you’ve noticed the trend by now). The first thing to do on any MySQL server is make sure query caching is enabled. This is as simple as adding the ‘query_cache_size’ setting to your my.cnf file or setting it inside MySQL. The trade-off is, once again, that you need to use RAM for the caching; however, it can be rather small (less than 128 MB) and the benefits are well worth it.”
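For reference, the relevant my.cnf snippet looks like this. The 64 MB figure is illustrative, in line with David’s “less than 128 MB” guideline – tune it to your own workload:

```ini
[mysqld]
query_cache_type = 1      ; 1 = cache SELECT results unless SQL_NO_CACHE is used
query_cache_size = 64M    ; RAM set aside for cached result sets
```

A restart of MySQL picks up the change; you can then watch the Qcache_hits status variable to see whether the cache is earning its keep.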
Thanks again to David for his advice, and I hope you all got a lot out of this guide overall. As I stated earlier, I don’t claim to be an expert on this topic, but my own changes have had a massive effect on the load times of my website. Anything you can do to improve load times (and thus pageviews and conversions) is always recommended, so definitely set aside some time to implement a number of these when you can.
The next ViperChill post is going live on Monday, where we have Ramsay sharing an epic post. If you have other recommendations for how to improve a website’s speed, I would love to hear them in the comments…