Data gathered by Akamai on e-commerce sites shows that 40% of users will abandon a site if it fails to load within 3 seconds. Google will even penalise your site in search rankings if it's slow to load. Couple that with large numbers of users accessing sites from mobile data connections, and the time it takes for your site to initially render should be a very important consideration.
There are a number of ways to analyse how your page is being rendered, from directly in the browser with the Timeline feature in Chrome Developer Tools, to remotely with WebPagetest's visual comparison tool.
Optimising your page to load quickly is chiefly about understanding how the browser will download, parse, and interpret resources such as CSS and JavaScript. This is known as the critical rendering path. A fantastic tool for analysing the front-end performance of your site is Google's PageSpeed Insights.
An issue this tool frequently highlights is failure to optimise CSS delivery. This warning will be triggered when CSS has been included externally in the head of the document.
<!doctype html>
<html>
<head>
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <h1>Hello, world!</h1>
</body>
</html>
"Before the browser can render content it must process all the style and layout information for the current page. As a result, the browser will block rendering until external stylesheets are downloaded and processed, which may require multiple roundtrips and delay the time to first render." – Google PageSpeed Insights
As well as the time required to download and parse the stylesheet, browsers which don't yet support HTTP/2 incur the further overhead of the round-trip time needed to open a new connection to the web server. To solve this problem, PageSpeed recommends inlining CSS in the HTML document. This doesn't mean embedding the entire stylesheet in the document, as we would then still have to wait for the now-larger document to download and be parsed by the browser; instead, we inline only the CSS required to display the initial viewport when the page loads.
A great tool for this is Critical. Take a look at the source of the before and after examples and what PageSpeed Insights has to say about them.
Using Critical requires analysing the page with PhantomJS, a headless browser, which then extracts the styles required to show the above-the-fold content. This can be awkward to implement on a site with a large number of different entry points (e.g. the homepage, an article, a product page), as it would require pre-analysing each page type, storing the generated styles, and then delivering the correct one depending on where the user enters the site. It becomes even harder when you consider that each page type could need a different set of styles depending on what content has been added to it. Will all articles have full-width images? Will all products have a gallery? This kind of page examination with Critical can't be done at run-time either, as spinning up a PhantomJS instance for each page request would cause atrocious performance issues.
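If you do pre-generate styles per page type, the delivery side can be as simple as a lookup table. A minimal sketch, assuming a build step has already written one critical CSS file per page type (the page types and file paths here are purely illustrative):

```javascript
// Hypothetical manifest produced at build time: one pre-generated critical
// CSS file per page type. Page types and file paths are illustrative only.
var criticalManifest = {
  homepage: 'critical/homepage.css',
  article: 'critical/article.css',
  product: 'critical/product.css'
};

// Return the pre-generated critical CSS path for a page type, or null when
// that page type was never analysed (e.g. a new template added later).
function criticalCssFor(pageType) {
  return Object.prototype.hasOwnProperty.call(criticalManifest, pageType)
    ? criticalManifest[pageType]
    : null;
}
```

The null case is exactly the weakness described above: any page type or content permutation the build step didn't anticipate has no critical CSS to fall back on.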
With markup generated on the server side, we're given an opportunity to analyse what's been produced and make some performance optimisations. I've created a simple example for Node.js and Express, which you can find on GitHub. We're currently using this technique on lullabot.com, where we use React to render the initial markup for the page on the server. After the page is delivered and the client-side JavaScript has downloaded, React Router takes over, and navigation around the site is done with the History API and XHR calls. This removes the need for a full page load each time, and also means the site doesn't have to keep generating critical CSS for each new navigation action.
Below is a very simple example of rendering a React component to a string that can be delivered from Node.js — see https://github.com/mhart/react-server-example for more detailed examples.
// renderToString expects a React element, not a component, hence createElement.
var pageMarkup = ReactDOMServer.renderToString(React.createElement(MyReactApp));
The package we use to generate our critical CSS is PurifyCSS. It takes HTML and a stylesheet, then returns only the CSS which would actually be applied to that markup. Although the generated CSS won't be as optimised as Critical's output, it gives us a dynamically generated fallback for all types and permutations of pages on the site.
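To illustrate the idea behind PurifyCSS, here's a deliberately naive sketch that keeps only the rules whose selector name appears somewhere in the markup. The real library performs far more thorough analysis (handling combinators, at-rules, dynamically built class names and so on), so treat this purely as a conceptual illustration:

```javascript
// Naive illustration of the PurifyCSS concept: keep only the rules whose
// simple selector (tag, class or id name) appears in the markup. Ignores
// at-rules, comments and combinators; for illustration only.
function naivePurify(html, css) {
  // Split the stylesheet into "selector { body }" rules.
  var rules = css.match(/[^{}]+\{[^{}]*\}/g) || [];
  return rules
    .filter(function (rule) {
      var selector = rule.slice(0, rule.indexOf('{')).trim();
      // Strip a leading . or # and check the bare name against the markup.
      var name = selector.replace(/^[.#]/, '');
      return html.indexOf(name) !== -1;
    })
    .join('\n');
}

var html = '<h1 class="title">Hello, world!</h1>';
var css = '.title { color: red; } .sidebar { float: left; }';
naivePurify(html, css); // keeps the .title rule, drops .sidebar
```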
Taking the rendered markup from the code above, the example below runs it through PurifyCSS and then injects the result into our page template, loading the remaining CSS with the loadCSS technique.
var fs = require('fs');
var path = require('path');
var mustache = require('mustache');
var purify = require('purify-css');

var template = fs.readFileSync(path.join(__dirname, 'index.html'), 'utf8');
var markup = {
  externalCss: 'style.css',
  criticalCSS: ''
};

purify(pageMarkup, [markup.externalCss], {
  minify: true,
  output: false,
  info: false,
  rejected: false
}, function (purifiedOutput) {
  // Inject the generated critical CSS and render the final page.
  markup.criticalCSS = purifiedOutput;
  var html = mustache.to_html(template, markup);
});
<!doctype html>
<html>
<head>
  <style>
    {{&criticalCSS}}
  </style>
  <script>
    function loadCSS(e,n,o,t){"use strict";var d=window.document.createElement("link"),i=n||window.document.getElementsByTagName("script")[0],r=window.document.styleSheets;return d.rel="stylesheet",d.href=e,d.media="only x",t&&(d.onload=t),i.parentNode.insertBefore(d,i),d.onloadcssdefined=function(e){for(var n,o=0;o<r.length;o++)r[o].href&&r[o].href===d.href&&(n=!0);n?e():setTimeout(function(){d.onloadcssdefined(e)})},d.onloadcssdefined(function(){d.media=o||"all"}),d}
    loadCSS('/{{&externalCss}}');
  </script>
  <noscript><link href="/{{&externalCss}}" rel="stylesheet"></noscript>
</head>
<body>
  <h1>Hello, world!</h1>
</body>
</html>
Make sure to keep an eye on how much CSS is being inlined, as running this over a framework such as Bootstrap could still produce a large amount. Finally, whilst we haven't seen any performance issues with this technique in production (we also typically use Varnish in front of Node.js), at this point it would be advisable to run some benchmarks for your pages to ensure it's not causing any meltdowns!
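For full-page numbers, a load-testing tool pointed at the running server is most representative, but Node's built-in process.hrtime() is enough for a quick sanity check on the cost of the critical-CSS step itself. A minimal sketch, where extractCritical is a stand-in for your real PurifyCSS call:

```javascript
// Quick micro-benchmark of the critical-CSS step using only Node built-ins.
// `extractCritical` is a synchronous stand-in for the real PurifyCSS call.
function extractCritical(html, css) {
  return css; // placeholder work; swap in the real extraction here
}

// Returns the average milliseconds per extraction over `iterations` runs.
function timeCriticalCss(iterations) {
  var html = '<h1 class="title">Hello, world!</h1>';
  var css = '.title { color: red; }';
  var start = process.hrtime();
  for (var i = 0; i < iterations; i++) {
    extractCritical(html, css);
  }
  var diff = process.hrtime(start);
  // Convert [seconds, nanoseconds] to milliseconds, averaged per iteration.
  return (diff[0] * 1e3 + diff[1] / 1e6) / iterations;
}

console.log(timeCriticalCss(1000).toFixed(4) + 'ms per extraction');
```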