If you've learnt anything from the story of the tortoise and the hare then forget it! Your website needs to make that hare look like a weightlifter in a running race.
Frustration is poison to your website's traffic, and the biggest frustration for any web customer is speed, or lack thereof.
Impact of a slow website
We've all arrived on a slow website before and given it a chance to load. No doubt you, like everyone else, succumbed to the poison that is frustration and simply left before it finished loading.
How long do you think you waited for that website to load?
5 seconds? Maybe 10?
If you're like 40% of people browsing the web then you didn't even give it 3. That's right, 3 seconds. If you happened to be on a mobile device, that figure jumps to 50%.
"that's such an infinitesimal amount of time they cant seriously expect a
page to load that fast!" you exclaim in disgust and of course, you're right.
47% of people expect your page to load in 2 seconds or less. Yes you did
read that right, 2 seconds is less than 3 but what can I say? They're impatient.
Have you ever wondered what the attention span of someone browsing the web is? These figures suggest it's around 1 second. Your clients expect the page to load in 2 seconds, and when it doesn't they wait a mere second more before exclaiming "BOOOOORING" and going somewhere else.
"What about our loyal returning customers? Surely they wouldn't abandon us
for a competitor just because our site is slower?" I hear you ask.
Sorry to dash your hopes, but 52% of online shoppers say quick page loads are important to their loyalty to a site.
I'm afraid to say there's more bad news: what you've seen so far isn't the worst of it. The most catastrophic effect of a slow site is on SEO (search engine optimisation).
Slow websites damage search engine rankings
Search engines like Google, Bing and DuckDuckGo will all lower your ranking if you have a slow website. You may be wondering "how is that worse than losing tonnes of customers?" The simple answer is… you won't get any customers in the first place. No leads, no sales. Ouch.
So what have we learnt so far?
- People are impatient
- People have high expectations
- Even loyal customers are swayed by speed
- Search engines stack the odds of your site even being seen against you
- If you were as impatient as half your customers you wouldn't have read this far
How can we prevent your website, and your business, from losing out on leads and sales? Optimise, optimise and... you guessed it... cry. I was joking! Please stop crying. Maybe you should just optimise some more?
If you look around online for some help on improving your page speed you will find tonnes of articles banging on about tackling the "low hanging fruit" and then never telling you how to claw back that last second or two.
We clung to this principle for as long as possible when attempting to bring
down the 4MB beast that was the 5and3 site a few months ago. Unfortunately
it's not always as simple as it looks. Cutting the initial time is easy, but the more you optimise the harder it is to claw your way down to your target time.
Website analysis
"The first step to dealing with a problem is admitting that you have a problem" - Jase Robertson
Since websites don't talk (normally; I'm looking at you, Cleverbot), we had to rely on good old-fashioned detective work.
Thankfully there are many great tools for showing your website that it's not just "big boned" and could do with a diet and exercise regime.
- Chrome Dev Tools
- Web Page Test
- Google PageSpeed
I always like to start off with Web Page Test. The interface isn't great but it gives you an in-depth look at what is slowing your site down.
They have a handy grading system, which is usually overly harsh. Don't let it dishearten you. We will soon be showing it who's boss.
Even on a highly optimised website like 5and3 we still only score:
First Byte Time: B
Keep-alive Enabled: A
Compress Transfer: A
Compress Images: A
Cache static content: C
Effective use of CDN: X
This tells me a few important things about 5and3:
- Our TTFB (Time To First Byte) is over 600ms when it should be under 200ms.
This is down to how fast the server can respond.
- We need to look into what isn't being cached. It seems that this tool counts the JavaScript and font CDNs' bad server configurations against us: they don't provide expires headers, so we get a bad score
- We should probably use a CDN
All of these problems are yet to be resolved. Thankfully they should all be easy to fix once we get the time!
Although these scores aren't perfect they are a whole lot better than the original scores:
First Byte Time: F
Keep-alive Enabled: A
Compress Transfer: F
Compress Images: F
Cache static content: B
Effective use of CDN: X
So how did we recover from this score?
Website fonts
To kick things off we started by attacking our font stack. We were originally
downloading both the italic and the bold versions of our font (Museo slab 500).
Having those fonts available seemed logical. That is, until we went through the site and realised we weren't using them anywhere. Oops. Dropping them saved us around 600KB and took a second off our download time.
Another slight improvement to download time (with little to no effort on our
part) was letting someone else deal with providing the fonts.
We use a service called Fonts.com, and they offer a CSS method and two JavaScript methods of embedding their fonts directly. After some experimentation we implemented their new JavaScript font loader as it was noticeably faster.
Their new JavaScript version has the added advantage of being asynchronous. An asynchronous script downloads in parallel with the rest of the page instead of blocking it, which speeds things up noticeably.
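To give a rough feel for the difference (the loader URL below is a placeholder, not Fonts.com's actual embed code), an asynchronous script embed looks something like this:

    <!-- With async, the browser keeps parsing the page while the loader downloads -->
    <script src="https://fonts.example.com/loader.js" async></script>

    <!-- Without async, the same script blocks rendering until it has downloaded
         and executed -->
    <script src="https://fonts.example.com/loader.js"></script>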
Website images
Our next vector of attack was the images. Our site makes use of a lot of high resolution background images, most of which seemed to have slipped through the jaws of our compression process.
We sacrificed our images to the beast that is kraken.io
and most of them came back around 80% smaller. This saved at least 0.8 seconds
on downloads.
As for SVGs, we were originally embedding them using an SVG sprite sheet approach. We quickly realised, however, that the images included in the sprite sheet weren't used frequently enough to warrant this treatment. All of the unused sprites were bloating the size of our HTML file for no good reason.
Our solution to this was a good old-fashioned diet. You know the kind from the Middle Ages where they just cut you up into smaller pieces? You haven't heard of it? That may be because I just made it up. It does work wonders for SVG sprite sheets though.
With our sprite sheet suitably sliced, we embedded each chunk only into the page in which it is needed. This improved page loading time more than we anticipated.
Our chubby little HTML file with its embedded SVG sprite sheet was taking
several packets to be returned. Our new slimline model could be downloaded
in just a couple. Score.
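To sketch the idea (the icon name and shape here are made up purely for illustration), a per-page sprite defines only the symbols that page actually uses and then references them with use:

    <!-- Hypothetical per-page sprite: only the icons this page needs -->
    <svg xmlns="http://www.w3.org/2000/svg" style="display: none;">
      <symbol id="icon-dot" viewBox="0 0 24 24">
        <circle cx="12" cy="12" r="10"/>
      </symbol>
    </svg>

    <!-- Later in the page, reuse the symbol wherever it is needed -->
    <svg width="24" height="24"><use xlink:href="#icon-dot"></use></svg>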
The CSS
Our medieval diet also worked wonders on our CSS files. Since we are using Bootstrap behind the scenes to structure some of our elements, it was the first thing to face the purge.
We eliminated 19 unused Bootstrap components (that's more than a third of Bootstrap) and just about halved the size of our CSS.
After ensuring that all of our CSS was being compressed to within an inch
of its non-existent life we embedded it directly into the page to save on
HTTP requests.
Each HTTP request you can ditch has the potential to save you anywhere between 50ms and 500ms.
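In practice that just means the minified styles live in a style block in the document head rather than behind an extra request (the file name and selectors below are purely illustrative):

    <head>
      <!-- Before: one extra HTTP request before the page can be styled -->
      <!-- <link rel="stylesheet" href="/css/site.min.css"> -->

      <!-- After: the minified CSS is embedded directly in the page -->
      <style>body{margin:0;font-family:sans-serif}.hero{background:#222;color:#fff}</style>
    </head>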
The JavaScript
Just like our CSS files, our JavaScript files also faced the jaws of the compressor. We use a nifty little tool called UglifyJS in our development stack, which massively reduces file size.
Unlike the CSS, however, we don't embed the JavaScript in the page. Instead we opted to retrieve it asynchronously. This allows it to download at the same time as the page, netting us a saving of a few hundred milliseconds.
Developing and maintaining hundreds of lines of code in one file is craziness encapsulated in code, so we package all of our individual scripts into one easy-to-transfer chunk using RequireJS.
RequireJS helpfully provides all of the tools for compression and asynchronous
loading right out of the box.
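As a rough sketch (the file names and paths here are invented for illustration, not lifted from the 5and3 build), the asynchronous entry point and the r.js build config that bundles everything into one file look something like this:

    <!-- The async attribute plus data-main means RequireJS and the bundled app
         download without blocking the page -->
    <script async data-main="/js/main" src="/js/require.js"></script>

    // build.js - a minimal r.js optimiser config
    ({
        baseUrl: "js",
        name: "main",            // the entry module that pulls in everything else
        out: "dist/main.min.js", // one compressed, easy-to-transfer chunk
        optimize: "uglify2"      // minification via UglifyJS, built into r.js
    })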
Our final step was to review our Bower implementation and
get rid of the many, many packages that were being imported but never used.
This reduced our overall size by a fairly hefty chunk.
The web server
The server can actually do quite a lot to aid in the delivery of your site, provided you don't do something stupid like forget to check that GZip is working.
"What is GZip?" I hear you lament.
To understand GZip you must first understand magic. GZip is the cumulative
power of several dark and ancient rituals...
Or it's a very efficient file compression algorithm that will pretty much halve the size of any files on your website before sending them to the user.
Take your pick.
If you want to know more about enabling GZip, Mr Sexton has got you covered:
Enable gzip compression
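For the impatient, the general shape of it looks something like the following, assuming an Apache server with mod_deflate available (nginx has an equivalent set of gzip directives):

    # Compress text-based assets before sending them to the browser
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript image/svg+xml
    </IfModule>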
Another way to improve performance is to ensure the browser goes straight to where it needs to go rather than taking a meandering route. We like to call this redirection optimisation.
When someone navigates to a page on your site, their browser may have to hop
through several redirects before it gets to your page.
If you click on a link that goes to "http://example.com/hello" and the site is configured to require the "www." and a trailing slash, then your browser has to do this:
- "Look it's a new URL: 'http://example.com/hello'! Lets load the page."
- "Wait a minute they want 'www.' at the front. Let me just add that for you..."
- "Look it's a new URL: 'http://www.example.com/hello'! Lets load the page."
- "Oh they want a trailing slash? Sure no problem..."
- "Look it's a new URL: 'http://www.example.com/hello/'! Lets load the page."
- "All done loading, have a nice day."
You obviously can't help it if other sites link to you in this fashion, but you can ensure all of your internal links go like this:
- "Look, it's a new URL: 'http://www.example.com/hello/'! Let's load the page."
- "All done loading, have a nice day."
Much faster and much simpler don't ya' think?
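In markup terms, using the example URL above, the difference is simply which form of the URL your internal links point at:

    <!-- Slow: triggers two redirects before the page even starts loading -->
    <a href="http://example.com/hello">Say hello</a>

    <!-- Fast: links straight to the final URL, no detours -->
    <a href="http://www.example.com/hello/">Say hello</a>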
Beyond the low hanging fruit
Now that we've covered the basics, let's take this one step further and look at the "high clinging fruit" (I don't think that phrase will catch on somehow).
HTML optimisation
We have always compressed all of our images, scripts and CSS files. Our
HTML was feeling a little left out.
If you're using a templating engine like we do (the Smarty template engine) you can easily save yourself a few kilobytes by enabling its built-in HTML post-processor.
Lazy Sizes
Our biggest saving in this "advanced" category came from a wonderful little script called lazysizes. It allows us to only download the images and scripts that are needed for the portion of the page you are currently viewing.
Now when you land on our homepage your browser will only bother downloading a few hundred kilobytes of imagery rather than the megabytes it could potentially be for the full page.
This not only massively improves our download times but saves anyone on a
limited data plan money as well. Why waste your data downloading stuff you
aren't even seeing?
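Using it is refreshingly simple: include the script and swap src for data-src on the images you want loaded lazily (the image path below is just for illustration):

    <!-- lazysizes watches the viewport and only loads images as they scroll into view -->
    <script src="/js/lazysizes.min.js" async></script>

    <!-- data-src instead of src, plus the lazyload class, defers the download -->
    <img data-src="/images/studio-shot.jpg" class="lazyload" alt="Our studio" />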
Critical render
All of the steps listed above feed into our critical rendering path: the work the browser has to do before it can paint anything. Roughly speaking, your render order should be (see the sketch after this list):
- fonts requested first
- critical CSS inlined at the top
- anything required for building the DOM coming first in the header
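A loose sketch of that head ordering, reusing the illustrative URLs from earlier (an assumption about layout rather than our exact markup):

    <head>
      <!-- 1. Kick off the font download as early as possible -->
      <script src="https://fonts.example.com/loader.js" async></script>

      <!-- 2. Critical CSS inlined at the top so the first paint isn't blocked -->
      <style>body{margin:0;font-family:sans-serif}</style>

      <!-- 3. Remaining scripts load asynchronously so they don't hold up the DOM -->
      <script async data-main="/js/main" src="/js/require.js"></script>
    </head>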
The future is fast
There's one massive disadvantage to us having faster internet connections: developers have them too. As the speed of the internet has improved, the size of websites has ballooned.
This seems logical until you're stuck in the middle of nowhere on a GPRS data
connection trying to use one of these monstrously large websites.
We aren't the only ones who have realised this has to stop.
The AMP project
Google has recently backed a project called Accelerated Mobile Pages
(AMP). AMP is essentially a restricted way of
building a website which is optimised for speed.
This isn't one of Google's pipe dreams (*cough* Wave *cough*); it is already being used by huge news websites like the BBC for their mobile news pages.
Although AMP isn't a replacement for responsive web design, it is a way to let mobile users access your content at super speeds over slow connections. It also has the massive benefit of being promoted in Google's search results.
Polymer
Polymer is a project based around web components. It allows you to structure your code into distinct chunks.
"How does that improve performance?" We hear you snort contemptuously.
Well. As we know only downloading what your end user needs is important. Polymer
is the ultimate form of that as you will only be downloading the modules that you
can currently see on the page.
It has the added bonus of being completely asynchronous. This allows every component
of your site to be downloaded in parallel.
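For a flavour of how that looks (this is a made-up Polymer 1.x component, not one of ours), each component lives in its own import that the browser can fetch asynchronously:

    <!-- Only pages that use <my-card> need to import it -->
    <link rel="import" href="/elements/my-card.html" async>

    <!-- elements/my-card.html: a hypothetical component definition -->
    <dom-module id="my-card">
      <template>
        <style>:host { display: block; padding: 1em; }</style>
        <content></content>
      </template>
      <script>
        Polymer({ is: 'my-card' });
      </script>
    </dom-module>

    <!-- Used on the page like any other element -->
    <my-card>Hello there.</my-card>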
HTTP/2
HTTP/2 is the latest and greatest version of the HTTP protocol. It is built to be a drop-in replacement where the only noticeable difference to the end user will be the massive speed boost.
"The focus of the protocol is on performance; specifically, end-user perceived
latency, network and server resource usage. One major goal is to allow the use of
a single connection from browsers to a Web site"
Once this protocol becomes commonplace you can expect to see pages loading significantly faster.
In a test by HTTPWatch they found a 0.216 second improvement on Google's homepage just by swapping to HTTP/2.
Conclusion
We've significantly improved the performance of our website but there is still a way to go. We need to look into improving the TTFB (Time To First Byte), which will probably be accomplished by moving our site to a newer server.
Both of our remaining issues can be resolved by implementing a CDN.
In the future we are hoping to start implementing some of the newer technologies, like AMP pages, for our work projects and news articles.