Optimize pagespeed for SEO

For some years now, I have been preaching that websites are too slow and that this drives away not only visitors but also Google. You often hear a simple "But it loads fast enough, look". Sure, with a fiber-optic line or 4G in the big city, a TTFB (time to first byte) of 800ms and the overall loading times are just about tolerable, but what happens in rural areas or on the road with EDGE/GPRS? So in this article I'll take a look at common tools, CDNs, render-blocking CSS/JS and the new HTTP/2 protocol. Caching, lazy loading etc. will follow in a later article.

Is pagespeed a Google ranking factor?

As early as 2010, Google announced that it would include loading time as a ranking factor in its search algorithm. This was followed by the PageSpeed Insights tool, with which you can easily test how Google rates your site. Four years later, Searchmetrics demonstrated a correlation between ranking position and good loading speed. At 0.07, the correlation was quite low at the time, but since this factor has a direct influence on usability, pagespeed is indirectly one of the most influential factors in search engine optimization.

Which pagespeed tools you can use to test your site

For beginners: Google's PageSpeed Insights

Google's PageSpeed tool has no configuration options (e.g. for test location), but the resulting (mobile) score gives a rough but useful insight into Google's rating system. Roughly speaking: with a score of 80+, you can't be doing too badly. In addition, Google shows you concrete measures for optimization (the most common item will probably be the deferring of JavaScript). Quite useful is the list of uncompressed images, which you can simply run through a tool like TinyPNG.
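Since deferring JavaScript comes up so often, here is a minimal sketch of what that suggestion usually boils down to; the file names are placeholders, not from the original article. The defer attribute lets the browser download the script in parallel while it keeps parsing the HTML, and only executes it once parsing is done (execution order between deferred scripts is preserved):

```html
<!-- Render-blocking: the parser stops here until the script is downloaded and executed -->
<script src="/js/jquery.js"></script>

<!-- Deferred: downloaded in parallel, executed after HTML parsing, order preserved -->
<script src="/js/jquery.js" defer></script>
<script src="/js/theme.js" defer></script>
```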

Also for beginners: Pingdom

Apart from the test location, Pingdom does not offer many configuration options, but it is quite fast and shows a very simple waterfall analysis of the loading process. The content breakdown, which shows how requests and total sizes are distributed across content types, is also very practical. Similar to Google's tool, optimization suggestions such as compression or image optimization are displayed here as well. Link: Pingdom Website Speed Test

For advanced users: GTmetrix

GTmetrix, made by gt.net, uses YSlow (among other tools) for its pagespeed analysis and, in the free version, provides suggestions similar to Pingdom's. After registering, you can also get timings and videos of the page load. Unfortunately, it only tests with a Firefox browser located in Canada, which means you can basically forget about local pagespeed (unless you're targeting the Canadian market).

For local, advanced testing: Chrome Developer Tools

The often underestimated and forgotten secret weapon of every SEO is the set of developer tools already built into Chrome. With the key combination Ctrl+Shift+I you can examine any page in depth. In addition to the HTML/CSS inspector, there is the Network tab, which is the relevant one for this article. First of all: if your connection is poor, or you are based in Australia and are testing a site hosted in Germany, you will not get reliable speed values. However, if you test a German site from Germany and have a good connection, you can get fairly accurate data. The interesting thing about the DevTools is that you can use a mobile view (adjustable to all sizes) and also throttle your network speed (EDGE, GPRS etc.). You can also see things like the protocols used (HTTP 1.1 / 2), file initiators, waterfall analysis, status codes and much more.


For the experts: Webpagetest.org

Probably the most advanced tool is webpagetest.org. Three tests are carried out in succession in the Advanced Testing run, which put the loading performance through its paces. Features worth mentioning are:
  • Location, connection, browser and device selection
  • Waterfall and connection analysis
  • Exact breakdown into Before Render / Before onLoad / After onLoad
  • CPU, bandwidth and browser utilization display
  • Image analysis (compression)
  • Content Breakdown (similar to Pingdom)
  • Listing of all headers
  • Videos of the page load
The disadvantage of the tool is probably its lack of clarity, since you don't get a fancy overview with clear optimization suggestions. For the techies among us who can actually work with the data, however, it is a blessing. And it's free of charge. Great success!

Boosting at the highest level: Is a CDN necessary?

With a CDN (Content Delivery/Distribution Network), you distribute your resources to cloud servers around the world and ensure that they are delivered from the server nearest to the visitor.

You only have one small site and you serve the market where your server is located?

Then you probably don't need a CDN. Shame on me, I don't use a CDN either, but I don't focus on the Australian or Japanese market either.

Multiple markets, lots of volume

If you have visitors from both Germany and the United States, a single server location makes it difficult to guarantee a solid TTFB for everyone. In this case, you will either have to set up a second server or consider delivery via a CDN. You can easily check whether your values are in the green zone with keyCDN's Performance Tool. If the time to first byte in a given region is over 400ms, you have a problem there and therefore a ranking disadvantage compared to your competition.
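If you just want a quick number without an external tool, the browser's own Navigation Timing API reports the TTFB of the page you are currently on. A minimal sketch, assuming a modern browser; the 400ms threshold simply mirrors the rule of thumb above:

```html
<script>
  // Log the TTFB of the current page using the Navigation Timing API
  window.addEventListener('load', function () {
    var nav = performance.getEntriesByType('navigation')[0];
    if (!nav) { return; } // older browsers don't expose Navigation Timing Level 2
    // responseStart = ms from the start of the navigation (incl. DNS/TCP/TLS) to the first byte
    var ttfb = Math.round(nav.responseStart);
    console.log('TTFB: ' + ttfb + 'ms' + (ttfb > 400 ? ' (above the 400ms rule of thumb)' : ''));
  });
</script>
```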

Complex but important: Analyze rendering and reload elements

Stack Overflow and other forums are full of questions on how to push Google's pagespeed score to 100/100. Tasks such as image optimization, compression and response time are usually easy to solve. What most people despair over is the following:
Eliminate JavaScript and CSS resources that block rendering in content "above the fold" (visible without scrolling).
Google's info page on how to solve this (Link) is more than sparse and merely scratches the surface of the problem.
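To make the warning concrete, this is the kind of head section that triggers it; every stylesheet and synchronous script here has to be fetched and processed before the browser can paint anything (the file names are placeholders for illustration):

```html
<head>
  <!-- Each of these resources blocks rendering until it is downloaded and parsed -->
  <link rel="stylesheet" href="/css/bootstrap.css">
  <link rel="stylesheet" href="/css/theme.css">
  <script src="/js/jquery.js"></script>
  <script src="/js/bootstrap.js"></script>
</head>
```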

A perennial topic: deferred JavaScript loading

Even the smallest site today usually uses a framework like Bootstrap or Foundation, whose own JavaScript library in turn requires jQuery. Here is a small example of the sizes that have to be loaded just to use the two frameworks:
  • Foundation 6.3.1: 121kB Foundation + 252kB jQuery = 373kB
  • Bootstrap 3.3.7: 37kB Bootstrap + 95kB jQuery = 132kB
If these resources are not deferred, the browser has to download this entire amount before it can render the body. The question that Google rightly asks is: do you really need 100-300kB of JavaScript to display the first screen to the user? The answer is usually no, which is why you should concentrate on the following magical term:

Above the fold - prioritize visible content and define Critical CSS

I can't repeat it often enough: happy visitors, happy Google. That's why Google's optimization suggestion about render-blocking elements is not a torture measure dreamed up by the company, but genuinely relevant for the user. Why should the visitor have to load an entire page, including script bloat and images that appear 4000px further down, if their viewport can only display 360x560px to begin with? Exactly, that's nonsense. So you proceed as follows (simplified):
  1. You look at the page in different viewports (easy going)
  2. You decide which elements to prioritize based on what is visible without scrolling, and find out in which files they are addressed (phew, already more difficult)
  3. You move the code of the prioritized elements (CSS, JS) into the HTML source code so that it is loaded with the DOM (holy shit!)
  4. You load the rest afterwards (see the sketch below)
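A minimal sketch of steps 3 and 4, assuming the critical rules have already been extracted into an inline style block; the preload/onload swap is a common pattern for fetching the remaining stylesheet without blocking rendering, and the file names are placeholders:

```html
<head>
  <!-- Step 3: critical CSS for the above-the-fold area, inlined so it arrives with the HTML -->
  <style>
    /* only the rules needed to render the first viewport */
    header, .hero { margin: 0; background: #fff; }
  </style>

  <!-- Step 4: the full stylesheet is fetched without blocking rendering -->
  <link rel="preload" href="/css/main.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
<body>
  <!-- ...content... -->
  <!-- Non-critical JavaScript is deferred and loaded at the end of the body -->
  <script src="/js/jquery.js" defer></script>
  <script src="/js/app.js" defer></script>
</body>
```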

The solution: HTTP/2

Hypertext Transfer Protocol (HTTP) version 1.1 has been the prevailing web standard for 18 years now. It's about time to freshen it up a bit. HTTP/2 is the successor and supports, among other things, multiplexing and server push.

Why is this the solution?

Quite simply: the whole issue with critical CSS and the sequential, late loading of resources stems from the limitations of HTTP 1.1. Google's SPDY laid the foundation for HTTP/2 in 2012, and with the new protocol you can already almost do without the measures mentioned above. Nevertheless, it remains difficult to keep pagespeed acceptable at an international level without a CDN.

Who or what supports the HTTP/2 protocol?

As modern browsers only support the new protocol in conjunction with an SSL certificate, you should switch to HTTPS either way. If you handle customer data on the site (registrations, orders, etc.), an HTTPS site is not only a trust signal from the user's point of view, but also from Google's: for a good three years now, an SSL certificate has been a ranking factor (albeit a weak one). And if you switch to HTTP/2 and a visitor's browser does not support it, don't worry: the browser will simply fall back to the old 1.1 protocol. Browser support for HTTP/2 - Source: Caniuse.com

Why HTTP/2 is faster

With HTTP 1.1, the browser had to establish a separate TCP connection for each request. This results in the classic waterfalls that you know from the Dev Console or from the speed tools mentioned above: if there are too many files, the browser loads them sequentially instead of simultaneously. The new protocol, however, can transmit all data in parallel over a single TCP connection (so-called multiplexing). You can test a speed comparison between the old and the new protocol here: Akamai HTTP/2 Demo
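If you want to check which protocol your resources are actually delivered over, the Resource Timing API exposes the negotiated protocol per resource. A small sketch (run it in the browser console or as an inline script; cross-origin resources may report an empty value unless the server sends Timing-Allow-Origin):

```html
<script>
  // Log the protocol (http/1.1, h2, ...) each loaded resource was delivered over
  window.addEventListener('load', function () {
    performance.getEntriesByType('resource').forEach(function (entry) {
      console.log(entry.nextHopProtocol || 'unknown', entry.name);
    });
  });
</script>
```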


Conclusion?

I have never really had any doubts about whether pagespeed is relevant for Google or not. Top pagespeed remains a worthwhile endeavor for everyone. Therefore: switch to HTTP/2, use a CDN (if necessary) and don't keep your visitors waiting. If you absolutely want a Google PageSpeed score of 100, you will of course have to put in a little more effort. But be warned: at 99/100 at the latest, when Google's own analytics.js is flagged for a caching problem, the real fun begins. There'll be more about caching here soon, but that's enough for now. Chenqui.