Optimize pagespeed for SEO [Tools, Technologies].

For a few years now, I've been preaching that pages are too slow and that you'll scare away not only visitors but also Google. Often you hear a simple "but it loads fast enough, look". Sure, with a fiber optic line or 4G in the big city, a TTFB (time to first byte) of 800 ms and the resulting overall loading times are just about bearable, but what happens in the countryside or on the road with EDGE/GPRS?

So, in this post I'll talk a bit about common tools, CDNs, render-blocking CSS/JS and the new HTTP/2 protocol. Later there will be more to read about caching, lazy loading and so on.

Is pagespeed a Google ranking factor?

Back in 2010, Google announced that it would include load time as a ranking factor in its search algorithm. Then came PageSpeed Insights, a tool that lets you test in a simple way how Google evaluates your page. Four years later, Searchmetrics showed a correlation between ranking position and good speed. At 0.07, the correlation was quite low at the time, but since this factor directly influences the usability of a page, pagespeed indirectly becomes one of the most influential factors in search engine optimization.

Which pagespeed tools you can use to test your website

For beginners: Google's PageSpeed Insights

Google's pagespeed tool doesn't offer any selection options (e.g. regarding test location), but the resulting (mobile) score gives a good, rough insight into Google's rating system. Roughly speaking: with a score of 80+ you're not doing too badly.

In addition, Google directly shows you measures for optimization (the most common one will probably be the deferring of JavaScript). Very useful is the list of uncompressed images, which you can simply run through a tool like TinyPNG.

Also for beginners: Pingdom

Pingdom doesn't have many configuration options apart from the test location, but it is quite fast and shows a simple waterfall analysis of the loading process. Also quite practical is the breakdown of requests and total sizes by content type (content breakdown). Similar to Google's tool, optimization suggestions such as compression or image optimization are displayed here as well.

Link: Pingdom Website Speed Test

For advanced users: GTmetrix

GTmetrix from gt.net uses YSlow for its pagespeed analysis and, in the free version, provides suggestions similar to Pingdom's. After registering, you also get timings and videos of the page load. Unfortunately, the test only runs in a Firefox browser located in Canada, which means you can basically forget about local pagespeed (unless you operate in Canada).

For advanced users testing locally: Chrome Developer Tools

The often underestimated and forgotten secret weapon of every SEO is the set of developer tools already built into Chrome. With the key combination Ctrl+Shift+I you can examine every page in depth. Besides the HTML/CSS inspector, there is the Network tab, which is the relevant one for this article.

To start with: if your connection is lousy, or you sit in Australia and test a site hosted in Germany, you will not get reliable speed values. If, on the other hand, you test a German site from Germany and have a good connection, you get much more accurate data.

What's interesting in the DevTools is that you get a mobile view (adjustable to all sizes) and can throttle your network speed (EDGE, GPRS, etc.). You can also see things like the protocols used (HTTP 1.1 / 2), the initiators of files, waterfall analyses, status codes and much more.
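If you want the same numbers without clicking through the Network tab, a minimal sketch using the browser's Resource Timing API (plain JavaScript, pasted into the DevTools console) prints a rough waterfall; the output format is my own choice, not a DevTools feature:

    // Rough, console-based "waterfall": one line per loaded resource with its
    // protocol (e.g. "http/1.1" or "h2"), start offset and duration in ms.
    // nextHopProtocol may be empty for cross-origin files that don't send a
    // Timing-Allow-Origin header.
    for (const r of performance.getEntriesByType("resource")) {
      console.log(
        `${r.nextHopProtocol || "n/a"}  start ${Math.round(r.startTime)} ms  ` +
        `duration ${Math.round(r.duration)} ms  ${r.name}`
      );
    }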

 

For experts: Webpagetest.org

Probably the most advanced tool is webpagetest.org. Here, three tests are performed in succession in the Advanced Testing run, which put the loading performance through its paces. Features worth mentioning are:

  • Location, connection, browser and device selection
  • Waterfall and connection analysis
  • Exact subdivision of Before Render / Before on Load and After on Load
  • CPU, bandwidth and browser utilization display
  • Image analysis (compression)
  • Content Breakdown (similar to Pingdom)
  • Listing of all headers
  • Load setup videos

The disadvantage of the tool is probably its lack of clarity, since you don't get a fancy overview with clear optimization suggestions. For the techies among us who can make sense of the mass of data, however, it is a blessing. And it's free of charge.


Boosting at the highest level: Is a CDN necessary?

With a CDN (Content Delivery/Distribution Network), you can distribute your resources to cloud servers around the world and ensure delivery from the nearest server.

Just one small site, one local market

Then you probably don't need a CDN. Shame on me, I don't use a CDN either, but I don't focus on the Australian or Japanese market either.

Multiple markets, lots of volume

If you have visitors from Germany and the United States alone, the server location already makes it difficult to guarantee a solid TTFB for both. Then it's either buying a second server or thinking about delivery through a CDN.

You can easily test whether your values are in the green zone with keyCDN's performance tool. If the time to first byte is over 400 ms for a given location, you have a problem there and therefore a ranking disadvantage compared to your competitors.
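If you just want a quick check from your own browser instead of an external tool, the Navigation Timing API exposes the same value. A minimal sketch for the DevTools console (the 400 ms threshold is simply the one mentioned above):

    // TTFB of the current page, measured from the start of navigation.
    // responseStart already includes DNS lookup, TCP/TLS handshake and the
    // server's think time.
    const [nav] = performance.getEntriesByType("navigation");
    const ttfb = Math.round(nav.responseStart);
    console.log(`TTFB: ${ttfb} ms`, ttfb > 400 ? "-> too slow" : "-> okay");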

Complex but important: Analyze rendering and load elements later

Stack Overflow and other forums are full of questions on how to push Google's pagespeed score to 100/100. Tasks such as image optimization, compression and response time are usually still easy to solve. What most people despair over is the following:

Eliminate JavaScript and CSS resources that block rendering in content "above the fold" (visible without scrolling).

Google's info page on solving this (link) is more than sparse and only scratches the surface of the problem.

Perennial topic: loading JavaScript later

Even the smallest site today usually uses a framework like Bootstrap or Foundation, which in turn requires jQuery for its own JavaScript components. Here is a small example of the sizes that have to be loaded to use the two frameworks:

Foundation 6.3.1: 121 kB Foundation + 252 kB jQuery = 373 kB
Bootstrap 3.3.7: 37 kB Bootstrap + 95 kB jQuery = 132 kB

If you do not load these resources later (deferred), the browser has to download the whole amount before the page is fully rendered. The question Google rightly asks is: do you really need 100-300 kB of JavaScript to show the user the first screen?

Mostly the answer is: no. That's why you should concentrate on the following magic term:

Above the fold - prioritize visible content and define Critical CSS

I can't repeat it often enough: happy visitors, happy Google. That's why Google's optimization suggestion about render-blocking elements is not some torture measure by the company but actually relevant for the user.

Why should the visitor have to load a whole page, including script junk and images that appear 4000px further down, if his viewport can only display 360x560px to begin with? Exactly: it's nonsense. So you proceed as follows (simplified):

  1. You look at the page in different viewports (easy going).
  2. You decide which elements to prioritize based on what is visible without scrolling, and find out in which files they are defined (phew, already more difficult).
  3. You move the code of the prioritized elements (CSS, JS) inline into the HTML source so that it is loaded with the DOM (holy shit!).
  4. You load the rest afterwards (a minimal sketch follows below).
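One way steps 3 and 4 can look in practice, as a minimal sketch in plain JavaScript: everything needed for the first screen is assumed to be inlined in the head, and the rest is only attached once the load event has fired. The file names are placeholders, not a prescription.

    // Loads a non-critical stylesheet or script only after the first render.
    function loadLater(kind, url) {
      window.addEventListener("load", () => {
        if (kind === "css") {
          const link = document.createElement("link");
          link.rel = "stylesheet";
          link.href = url;
          document.head.appendChild(link);
        } else {
          const script = document.createElement("script");
          script.src = url;
          script.async = true; // never block rendering again
          document.body.appendChild(script);
        }
      });
    }

    loadLater("css", "/css/full.css");           // placeholder: the non-critical CSS
    loadLater("js", "/js/vendor-bundle.js");     // placeholder: jQuery + framework JS

If you would rather not script this yourself, the defer attribute on script tags already achieves much of the same for JavaScript.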

The solution: HTTP/2

The Hypertext Transfer Protocol (HTTP) version 1.1 has been the prevailing web standard for 18 years now. It's about time to refresh it a bit. HTTP/2 is the successor and supports, among other things, multiplexing and server push.

Why is this the solution?

Quite simply: the issues with critical CSS and all the sequential, late loading of resources stem from the limitations of HTTP 1.1. Google's SPDY laid the foundation for HTTP/2 in 2012, and with the new protocol you can by now almost do without the measures mentioned above. Nevertheless, it is still difficult to keep pagespeed fast enough at an international level without a CDN.

Who or what supports the HTTP/2 protocol?

Since modern browsers support the new protocol only in conjunction with an SSL certificate, you should switch to HTTPS anyway. If you handle customer data on the site (registrations, orders, etc.), an HTTPS page is not only a trust signal from the user's point of view, but also from that of Google. For more than three years now, an SSL certificate has been a ranking factor (albeit a weak one).

If you switch to HTTP/2 and a visitor's browser doesn't support it, don't worry: the browser simply falls back to the old 1.1 protocol.

Browser support for HTTP/2 - Source: Caniuse.com

Why HTTP/2 is faster

With HTTP 1.1, the browser handles requests one after the other per TCP connection and only opens a handful of connections in parallel. The result is the classic waterfall that you know from the DevTools console or the speed tools mentioned above: if there are too many files, the browser loads them sequentially instead of simultaneously. The new protocol, however, can transfer all data in parallel over a single TCP connection (called multiplexing). You can test a speed comparison between the old and the new protocol here: Akamai HTTP/2 Demo
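Whether your own page already arrives over the new protocol is easy to check without an external tool; a minimal sketch for the DevTools console reads the negotiated protocol from the navigation entry:

    // "h2" means the page itself came in over HTTP/2, "http/1.1" over the old protocol.
    const [nav] = performance.getEntriesByType("navigation");
    console.log(`Negotiated protocol: ${nav.nextHopProtocol}`);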


Conclusion?

For me, there was never really any doubt about whether pagespeed is relevant for Google. A top pagespeed is still a worthwhile endeavor for everyone. Therefore: switch to HTTP/2, use a CDN (if necessary) and don't make your visitors wait any longer. If you absolutely want a Google pagespeed score of 100, you'll of course have to put in a little more effort. But be warned: the real fun starts at 99/100, when Google's own analytics.js shows up as a caching problem.

Soon there will be something to read about caching, but that's enough for now.

Chenqui.