In the last installment we looked at how WebpageTest can be used to establish a performance baseline. Now it's time to dig a bit deeper and see what we can do about some common performance bottlenecks.

To do that we'll turn to a new tool, [Google PageSpeed Insights][1].

Before we dive in, recall what we said last time about the importance of the Speed Index -- the time it takes to get something on the screen. This is different from the time it takes to fully load the page. Keep that in mind when you're picking and choosing what to optimize.

For example, if PageSpeed Insights tells you "leverage browser caching" -- which means have your server set an Expires Header -- that's good advice in the broader sense, but it won't change your Speed Index number for first-time visitors. 
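For reference, setting those headers is only a few lines of server configuration. Here's a minimal sketch of what it might look like in Nginx (which is what Sifter runs on); the file extensions and the one-year lifetime are illustrative placeholders, not recommendations:

```nginx
# Sketch: serve static assets with a far-future Expires header so
# returning visitors can reuse them straight from the browser cache.
location ~* \.(css|js|svg|woff)$ {
    expires 1y;                         # emits Expires and Cache-Control: max-age headers
    add_header Cache-Control "public";  # let shared caches store these assets too
}
```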

To start with, we suggest focusing on the things that will get you the biggest wins on the Speed Index. That's what we'll do here.

## Google PageSpeed Insights

Now that we know how long it's taking to load the page, it's time to start finding the bottlenecks in that process. If you know how to read a waterfall chart then WebpageTest can tell you most of what you want to know, but Google's PageSpeed Insights tool offers a nicer interface and puts more of an emphasis on mobile performance improvements.

There are two ways to use PageSpeed. You can use [the online service][2] and plug in your URL or you can install the [PageSpeed Insights add-on for Chrome][3], which will add a PageSpeed tab to the Chrome developer tools. 

The latter is very handy, but lacks some of the features found in the online tool, most notably the checks on "[critical path][4]" performance (the rendering work that determines your Speed Index) and the mobile user experience analysis. For that reason we suggest using both. The online service does a better job of suggesting fixes for mobile and offers a score you can use to gauge your improvements (though you should go back to WebpageTest and re-run the same tests to make sure your Speed Index times have actually dropped).

The browser add-on, on the other hand, will look at other network conditions, like redirects, which can hurt your Speed Index times as well.

PageSpeed Insights fetches the page twice, once with a mobile user-agent and once with a desktop user-agent. It does not, however, simulate the constrained bandwidth of a mobile connection. For that you'll need to go back to WebpageTest. Complete details on what PageSpeed Insights does are available in Google's [developer documentation][5].

When we ran PageSpeed Insights on the Sifter homepage the service made a number of suggestions:

![Screenshot of initial run]

Notice the color coding: red is high priority, yellow is lower priority, and green is all the stuff you're already doing right. But those priorities are Google's suggestions, not hard and fast rules. As we mentioned above, one of the high priority suggestions is to add Expires Headers to our static assets. That's a good idea, and it will help speed up the experience of visiting the site again or loading a second page that uses the same assets. But it won't help first-time visitors and it won't change that Speed Index number for initial page loads.

Enabling compression, on the other hand, will. Adding GZip compression to our stylesheet and SVG icons would shave 154.8KB off the total page size. Fewer kilobytes to download always means faster page load times. This is especially true for the stylesheet, since the browser stops rendering the page whenever it encounters a CSS file. It doesn't start rendering again until it has completely downloaded and parsed that file, so anything we can do to decrease the size of the stylesheet will be a big win.
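On Nginx, enabling that compression comes down to a handful of directives. A sketch, with the MIME types chosen to match our stylesheet and SVG icons (Nginx gzips text/html by default, but every other type has to be listed explicitly):

```nginx
# Sketch: compress text-based responses on the fly before sending them.
gzip on;
gzip_types text/css image/svg+xml application/javascript;
gzip_min_length 1024;  # skip tiny responses where compression doesn't pay off
```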

Another thing that the online tool doesn't consider high priority, but that does show up in the browser add-on, is minimizing redirects.

To get a closer look at how redirects hurt your page load times, let's turn to a third performance testing tool: your browser's Network panel.

## The Network Panel

All modern web browsers have some form of developer tools and all of them offer a "Network" panel of some sort. For these examples we'll be using Chrome, but you can see the same thing in Firefox, Safari, Opera and IE. 

In this example you can see that the fonts.css file returned a 302 (moved temporarily) redirect rather than the file itself:

![Screenshot of Network Panel]

To find out more about what this redirect is and why it's happening, we'll select it in the network panel and have a look at the actual response headers.

![Screenshot of Network Panel response headers]

In this case you can see that the request was redirected to another CSS file.

This file is eating up time twice. First, it's on a different domain (our webfont provider's domain), which means there's another DNS lookup to perform. That's a big deal on mobile; [see this talk][6] from Google's Ilya Grigorik for an incredibly thorough explanation of why.

The second time suck is the redirect itself, which forces the browser to request the same resource again from a different location (and keep in mind that this resource is a CSS file, so it's blocking rendering throughout these delays). The second request succeeds, but there's definitely a performance hit.

Given all that, why still serve this file? Because it's an acceptable trade-off. Tungsten (the font being loaded) is an integral part of the design, and in this case there are other areas we can optimize -- like enabling server-side GZip compression -- that will get us some big wins. It may be that those wins get us close enough to the ideal one-second end of the spectrum that we're okay with the cost of loading the font.

This highlights perhaps the hardest aspect of improving performance -- nothing comes for free.

When it comes to page load times there is no such thing as too fast, but there can be such a thing as over-optimization. If we ditch the font we might speed up the page load a tiny bit, but we might also lose some of the less tangible aspects of the reading experience. We might get the page to our visitors 500ms faster, but they might also be less delighted with what we've given them. The right answer to what stays and what goes is a case-by-case decision.

For example, if you eliminate a JavaScript library to speed up your page, but without the library your app stops working, well, that would be silly. Moving that JavaScript library to a CDN and caching it with a far-future Expires Header? Now that's smart.

Performance is always a series of trade-offs. CSS blocks the rendering of the page, but no one wants to see your site without its CSS. To speed up your site you don't get rid of your CSS, but you might consider inlining some of it. That is, move enough of your critical CSS into the HTML document itself that the initial viewport renders properly, and then load the full stylesheet at the bottom of the page where it won't block rendering. Tools like Google's [PageSpeed Module][7] for Apache and Nginx can do this for you automatically, as sketched below.
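With the module's Nginx build (ngx_pagespeed), that critical-CSS inlining comes down to enabling a single filter. A sketch, assuming the module is already compiled in; the cache path is a placeholder:

```nginx
# Sketch: ngx_pagespeed rewrites pages as they pass through the server.
pagespeed on;
pagespeed FileCachePath /var/ngx_pagespeed_cache;  # placeholder path for its rewrite cache
# prioritize_critical_css inlines the CSS needed for the initial
# viewport and defers loading the rest of the stylesheet.
pagespeed EnableFilters prioritize_critical_css;
```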

The answer to performance problems is rarely to move from one extreme to the other, but to find the middle ground where performance, functionality and great user experience meet.

## What We Did

After running Sifter through WebpageTest we identified the two biggest wins -- enabling GZip compression and setting Expires Headers. The first means users download less data, so the page loads faster. The second means repeat views will be even faster, because common elements like stylesheets and fonts are already in the browser's cache.

We also removed some analytics scripts that were really only necessary on the particular pages we're testing, not the site as a whole.

For us the change meant adding a few lines to our Nginx configuration. One gotcha for fellow Nginx users: you need to add your GZip and Expires configuration to your application servers *and* your load balancing servers. Other than that snag, the changes hardly took any time at all.
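To make that gotcha concrete, here's the shape of it: the same directives from the sketches above have to appear in the configuration on both tiers. A fragment, with extensions and lifetimes again as placeholders:

```nginx
# Sketch of nginx.conf -- needed on the load balancer AND on each
# application server, per the gotcha above.
http {
    gzip on;
    gzip_types text/css image/svg+xml application/javascript;

    server {
        location ~* \.(css|js|svg|woff)$ {
            expires 1y;  # far-future Expires header for static assets
        }
    }
}
```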

The result? Our initial page load times as measured by WebpageTest dropped to under four seconds over 3G. That's a two-second improvement for mobile users with very little work on our end. For those with high-speed connections the Sifter homepage now gets very close to that magical one-second mark.

We were able to get there because we did the testing, identified the problems and were able to target the biggest wins rather than trying to do it all. Remember, don't test more, test smarter.



[1]: https://developers.google.com/speed/pagespeed/insights/
[2]: https://developers.google.com/speed/pagespeed/insights/
[3]: https://chrome.google.com/webstore/detail/pagespeed-insights-by-goo/gplegfbjlmmehdoakndmohflojccocli?hl=en
[4]: https://developers.google.com/web/fundamentals/performance/critical-rendering-path/
[5]: https://developers.google.com/speed/docs/insights/about
[6]: https://www.youtube.com/watch?v=a4SbDZ9Y-I4#t=175
[7]: https://developers.google.com/speed/pagespeed/module