Google Speed. Why You Shouldn't Worry About Google PageSpeed Insights

On November 12, Google quietly updated PageSpeed Insights, changing almost everything in it. This is a big change for the entire website-building industry, and it looks likely to set off a wave of panic and hype. This article analyzes the changes and what they will bring us.

What is PageSpeed Insights

Just a few words for those not in the know. For 8 years, PageSpeed Insights has been the main site speed meter: you enter a page address and get a score on a scale from 0 to 100, along with recommendations for improvement.

Of course, there are many other good speed-test tools out there. But since this one comes from Google, which has stated that site speed affects rankings in the SERPs, this score seems the most important to most people, especially to customers and bosses. As a result, almost everyone tries to raise the PageSpeed score of their projects, and the metric has become one of the most important in the industry.

What changed?

In short, everything. The old PageSpeed has been swept aside and replaced with scoring and analytics from Lighthouse, an open-source site auditing tool that is, among other things, built into Google Chrome.

The cardinal difference in approach is that points are now awarded not for following rules but for speed. Page loading is evaluated along several time characteristics: how soon after the start of loading something is visible, when you can already click, how much everything stutters while loading, and when everything has finished loading. These characteristics are compared with those of the best sites and converted into points. We will analyze this in more detail below; for now, the principle itself is what matters.

There are still recommendations, as before, but they now carry a completely different weight. Recommendations are in no way directly tied to the score, and it is by no means guaranteed that implementing them will improve the situation (though it can easily worsen it if done thoughtlessly).

Panic is inevitable

It is now the night of the 13th, and everything is relatively quiet. Only a couple of specialized resources have posted short notes about the update, and only a couple of clients have written excited letters about the strange behavior of PageSpeed Insights. It feels like the calm before the storm.

Right now the tool is clearly unstable: scores for the same page fluctuate within a 20-point range, and sometimes it complains that it cannot fetch the page being evaluated. Some sites, in its opinion, are unavailable altogether, even though they are actually doing fine.

It is obvious that a lot of people will soon rush to check the scores of their projects, hitting the service with an international Habr effect. Everything will work haphazardly, fail, and frighten people with jumping scores.

It's not easy, but try to relax and stay calm. The first thing to remember is that the PageSpeed Insights update does not affect the ranking principles of search results in any way. Second, it will take at least two weeks for the update to be tested, corrected, and brought to stable operation. Do not make sudden moves; you may have to roll them back later.

Reflections and Forecasts

There are many positives in these changes. The dominance of the old PageSpeed Insights with its forced recommendations caused quite a bit of trouble. First, any recommendation can be close to useless in your specific situation. Second, it can be implemented at the expense of more important things, such as page generation time. But there was no choice: all this had to be done to get a pretty score.

For example, what is the point of the recommendation to minify a page's HTML code? On average, this operation takes about 100 ms, and that delay is dozens of times greater than any possible benefit from the reduced page size. The only way it can be beneficial is if you serve pre-minified pages from a cache.

In any project of recent years, a great deal of effort has been spent on image optimization, minification and bundling of resources, and deferred JavaScript execution that would not break anything. More often than not, this diverted focus from the essence: the speed of the site for visitors. The Internet was full of examples of sluggish sites with great scores, as well as fast sites with poor ones.

Now this tinsel will gradually fall off. In first tests, the scores with and without minification and bundling of resources are practically identical. What really matters is how fast the server responds and how much heavy content is on the page. All the bells and whistles, social network widgets, interactive maps, chats, and luxurious pop-ups, will inexorably hit the rating, no matter how you wrap them.

It is likely that all this will lead to genuinely fast sites and to an understanding of how to make them. At least, I really want to believe so.

New metrics

And for the most persistent, a detailed analysis of the new metrics that affect the score. There are 6 in total, and they carry different weights in the final score. Let's go through them in decreasing order of importance.

1. Time to Interactive

This is the most important characteristic, and the hardest one: the timestamp when the page is fully ready for user interaction. This moment comes when:
  • the page is displayed
  • event handlers are registered for most visible elements
In effect, the page should be rendered, no longer stuttering, and ready to respond to actions.

2. Speed Index

Indicates how quickly page content becomes visible. The Speedline module is used for the evaluation.

This is the time when the page in the browser stops changing visually. A frame-by-frame comparison of the page view is used to determine.

3. First Contentful Paint

An indicator of the time interval between the start of page loading and the appearance of the first image or block of text.

4. First CPU Idle

This parameter indicates the time at which the page's main thread becomes free enough to handle input. This moment comes when:
  • most of the elements on the page are already interactive
  • the page responds to user actions in a reasonable amount of time
  • response to user actions is less than 50 ms
The Russian translation of this metric loses some of its essence. In the original it is called First CPU Idle, that is, the first time the processor goes idle. But even that is not entirely accurate: it refers to the moment during loading when the page can already, for the most part, respond to actions, even though it continues to load.

5. First Meaningful Paint

This parameter shows the time after which the main content of the page becomes visible. This moment comes when:
  • the largest change in page appearance has occurred
  • fonts have loaded
6. Estimated Input Latency

This is the least significant characteristic. It shows the time in milliseconds the page needs to respond to user actions during the busiest 5-second window of the page load. If this time exceeds 50 ms, users may feel that your site is lagging.

Each metric is compared with the indicators of all assessed sites. If yours is better than 98% of the sites, you get 100 points; if it is better than 75% of the sites, you get 50 points.

At first glance, these metrics are very true to life, and it will be almost impossible to cheat them with dirty manipulations that do not actually speed up the site.

So far, the evaluation principle for the mobile version of a site remains a mystery. Most likely the principle is the same, but the scores are often significantly lower. It is not clear what virtual mobile device configuration the tests are run on.

Sergey Arsentiev

Increase site loading speed in Google Page Speed with your own hands

To achieve high positions in search, you need the Google Page Speed service to rate your site highly.

I have noticed that it is site speed in Google that is crucial, all other factors being equal, such as external links, internal optimization, and domain age.

Therefore, for normal SEO, this is a very important step that needs to be done after creating a site.

Well, we start by checking the current values.

You may personally feel that your site flies, but that still means nothing to Google.

Google has its own criteria for checking site speed, so go here: https://developers.google.com/speed/pagespeed/insights/

Enter the website address and click check.

Enter the address with http or https, and at the same time make sure that the site is running in normal mode (not in maintenance mode) and allows indexing of all important elements of the site.
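If you prefer to script these checks instead of pasting addresses into the web form, the same data is available through the PageSpeed Insights API (v5). A minimal sketch; for regular automated use, Google asks you to attach an API key:

```python
from urllib.parse import urlencode

# Endpoint of the PageSpeed Insights API v5.
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build a request URL for checking one page (strategy: 'mobile' or 'desktop')."""
    return API + "?" + urlencode({"url": page_url, "strategy": strategy})

print(psi_request_url("https://example.com"))
# Fetching this URL (e.g. with urllib.request.urlopen) returns a JSON report;
# the overall score sits under lighthouseResult -> categories -> performance -> score.
```

This only builds the request; actually fetching it is a network call, so run it from a machine with access to the site being tested.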

The closer the score is to 100, the better.

The norm is 80 points and above.
If you have more, great.
That is, if your result is 90 points, then straining by any means to reach 100 is not a very rewarding task.

So if you have 80 points and above, then don't worry, but rejoice: most projects have 40-50 points before download speed optimization.

I personally think that for projects with rich functionality, even 70 points on Google Page Speed is quite acceptable.

Well, if you have less than 70 points, let alone 20-30, then you should seriously think about optimizing the site's speed for Google.

I will start with the simplest methods and end with complex ones that suit the most meticulous site owners who need 100 out of 100.

Enable compression

The easiest way to speed up the loading of a site for Google is to enable GZIP compression. This is a special on-the-fly data archiving mechanism supported by 99.99% of hosts.
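To get a feel for the magnitude of the savings, you can compress a chunk of typical markup yourself; a quick sketch using Python's standard gzip module (repetitive HTML compresses extremely well):

```python
import gzip

# Repetitive markup, the kind GZIP shines on.
html = ("<div class='item'><p>Hello, world!</p></div>\n" * 200).encode("utf-8")
compressed = gzip.compress(html, compresslevel=9)

print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
```

Real pages are less repetitive than this toy input, but text assets still typically shrink several-fold.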

If your hoster does not support it, change hosters.

You can enable GZIP compression in WordPress using the WP Fastest Cache caching plugin, and in OpenCart using the NitroPack plugin.

If you do not want to install plugins, you can make the changes directly in the .htaccess file, which is located in the root folder of the site.

Add the code generated by these plugins there. At the end of the article, I will post these codes in full from my own sites, so as not to scatter pieces at each point here, otherwise you will get confused.
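For orientation, a typical mod_deflate block of the kind such plugins generate looks like this (a generic snippet, not the exact code from my sites):

```apache
<IfModule mod_deflate.c>
    # Compress the common text formats on the fly; images are already compressed.
    AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml
    AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```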

Reduce server response time

It depends on your hosting provider's servers, their load, and how resource-hungry your project is.

Sometimes this indicator alone can negate all your SEO optimization efforts.

It is important to check this site speed parameter in Google Page Speed at different times of the day or night: it is quite possible that at that particular moment the hosting server is simply overloaded with requests.

If everything is mostly fine at other hours, then you can ignore the occasional one-time spike for the time being.

If this indicator is consistently bad, then it makes sense:

  • install caching modules (I recommend installing them for every project in any case!) to reduce the load on weak hosting and the time it takes to generate site pages.
  • think about moving to a host with faster servers. This is usually enough in most cases for growing projects.
  • if you have some money to spare, then place the order (20% discount with the promo code "moytop"). This will make the site very stable and fast.
  • the most difficult way is internal optimization of the site's scripts (up to a complete redesign of the entire site). I recommend doing this last, since it is very difficult and finding a competent specialist for sane money is almost impossible.
Use browser cache

This is also an easy way to quickly raise your Google Page Speed score.

As with GZIP compression, this is solved by adding code to .htaccess. You can add it manually or with the help of the same caching plugins.

At the end of the article, I will give the entire code; you can copy it and try adding it to your file manually via FTP.

Better yet, use the proven plugins that I already suggested above.
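Again for orientation only, browser caching is usually enabled with a mod_expires block along these lines (a generic snippet; the plugins generate something similar):

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Static assets rarely change, so let browsers keep them for a while.
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```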

Optimize Images

The most profitable optimization method that everyone can easily do with their own hands.

The task is to reduce the quality of the images on the site so that they take up less space and load faster.

It is most reasonable to do this through Google Page Speed itself.
But you have to do it manually, that is, you will have to tinker!

For more than a year I have been looking for a decent program that can automatically compress pictures using the Google algorithm and preserve the folder structure.

Alas, I did not find such a program. I went through all sorts of viewers like XnView and IrfanView, programs like Caesium, FileOptimizer, ImBatch, and ImageOptim, and services like TinyPNG, etc. No automation anywhere.

If anyone knows such a program - write in the comments.

Primary requirements:

  • compression according to the Google algorithm (so that the ready-made pictures, once downloaded, meet Google's download speed requirements)
  • minimal loss of quality
  • work with folder structure preservation
  • removal of metadata and other garbage
  • automatic detection of input and output format

Since no such program exists for now, you will have to do it by hand.
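Until such a program turns up, a small script can at least take over walking the folder tree. A sketch: jpegoptim here is just one common CLI optimizer (an assumption on my part, and it does not use "the Google algorithm"), so swap in whatever tool you trust:

```python
import os
import subprocess

def collect_images(root, exts=(".jpg", ".jpeg", ".png")):
    """Walk the tree under root and return image paths, folder structure intact."""
    return [
        os.path.join(dirpath, name)
        for dirpath, _dirs, files in os.walk(root)
        for name in files
        if name.lower().endswith(exts)
    ]

def optimize_jpegs(root):
    # --strip-all also removes metadata, one of the wishlist items above.
    for path in collect_images(root, exts=(".jpg", ".jpeg")):
        subprocess.run(["jpegoptim", "--max=85", "--strip-all", path], check=False)
```

The in-place rewrite keeps the folder structure by construction; keep backups of the originals before running anything like this.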

We download the archive of optimized images from Google and unpack it.

And then we spend a long, dreary time replacing the old files with the new ones via FTP on the site.

Alas, JPG images lose quality significantly.
With PNG, the quality remains very good, almost like the original, and the size shrinks by 2-3 times. So if you have a lot of PNG files, that is a very big saving.

I would recommend saving the original files before updating: you may not like everything about the "Google" reduction, and then you can quickly restore individual files. Also, compare pictures in a viewer before uploading, so as not to end up with a frankly ugly image.

There is no way around it: to get a high site speed score in Google, you will have to sacrifice the quality of some pictures.

Minify CSS + HTML + JS

In this case, Google Page Speed, by analogy with pictures, asks you to remove everything superfluous from layout files, scripts, and page code: spaces, comments, and so on.

The goal is for the file itself to "weigh" less, so that the site loads faster.

When using caching plugins, this is all done automatically on the fly. This is the most convenient way, since it does not change your code but creates separate minified CSS and HTML files, which are served to Google, and everyone is happy.
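For a concrete picture of what minification actually does, here is a toy CSS minifier. This is purely illustrative: real plugins use proper parsers and handle strings, selectors, and hacks that this regex sketch would mangle:

```python
import re

def minify_css(css: str) -> str:
    """Strip comments and collapse whitespace; a toy, not production code."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten around punctuation
    return css.strip()

print(minify_css("body {\n  color: red;  /* brand color */\n}"))  # body{color:red;}
```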

If you minify files in some service, then keep in mind that later it will be problematic to make any changes in such truncated files.

You can download the reduced files "from Google" in the same place as the pictures; they will be in the same archive.

Whether to re-upload the original files along with them is up to you; in any case, save a backup of the old files.

Optimize the loading of visible content

In most cases, there is nothing you can do about this short of rebuilding the site all over again.

Therefore, I would try not to obsess over this item and deal with the other site speed factors in Google instead.

Remove render-blocking JavaScript and CSS

A complex item that requires knowledge and practice.

Of course, you can add the async attribute to the tag loading each JS file, or use Google's extsrc= script, but from my observations, in 90% of cases this either does nothing or breaks the site!

We need a JavaScript specialist who will shake up the entire site and analyze all the scripts so that they can be loaded asynchronously without damaging any functionality.
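For reference, these are the two standard ways to make an external script non-blocking (the paths here are made up for illustration):

```html
<!-- async: fetch in parallel, execute as soon as it arrives (order not guaranteed) -->
<script async src="/js/analytics.js"></script>

<!-- defer: fetch in parallel, execute in document order after parsing finishes -->
<script defer src="/js/widget.js"></script>
```

Which of the two is safe depends on what the script touches, which is exactly why a specialist has to look at each one.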

And since in WordPress, for example, 9 out of 10 scripts usually come from external plugins, you are unlikely to achieve anything there: they are loaded from their own folders and updated three times a week.

And the changes you made will simply be wiped out.

Many of you have probably used the wonderful service from Google: PageSpeed Insights. Do you want to get the coveted 100 out of 100?

Picture to attract attention

And only a small matter remains.

So here are my test results. Take any site; for example, I took a free ready-made responsive site template, transferred it to my hosting, and started testing. The result of the first test (link to the site):
  • speed for mobile - 79/100;
  • speed for computer - 93/100;
Not bad, huh?

It complains about:

Be sure to fix:
Remove render-blocking JavaScript and CSS from the top of the page.
Number of blocking CSS resources per page: 3. They slow down the display of content.
All content at the top of the page is displayed only after the following resources have been downloaded. Consider deferring the loading of these resources, loading them asynchronously, or inlining their most important components directly in the HTML code.
We do a little trick: transferring the styles from the file into the page code.
Was: a link tag pointing to an external CSS file.

It became a style block right in the page:

article, aside, details, figcaption, figure, footer, header, hgroup, nav, section { display:block; } /* and other styles */
And, hooray, we get higher results (link to the site):

  • speed for mobile - 99/100;
  • speed for computer - 99/100;
And it complains only about one thing: Fix it if possible:
Minify HTML
Compressing HTML code (including inline JavaScript code or CSS) allows you to reduce the amount of data to speed up loading and processing. But that is handled by compressing the code, and is not relevant to our topic.
And let's not forget that we never actually solved the problem described above:
All content at the top of the page is displayed only after the following resources have been downloaded. Consider deferring the loading of these resources, loading them asynchronously, or inlining their most important components directly in the HTML code. However much the styles weighed in the file, they weigh exactly the same in the page code!

And now the most important question: is it a bug or a feature?
Thank you!

PageSpeed Insights (PSI) reports on the performance of a page on both mobile and desktop devices, and provides suggestions on how that page may be improved.

PSI provides both lab and field data about a page. Lab data is useful for debugging performance issues, as it is collected in a controlled environment; however, it may not capture real-world bottlenecks. Field data is useful for capturing true, real-world user experience, but has a more limited set of metrics.

Performance Score

At the top of the report, PSI provides a score which summarizes the page's performance. This score is determined by running Lighthouse to collect and analyze lab data about the page. A score of 90 or above is considered fast, and 50 to 90 is considered average. Below 50 is considered to be slow.

Real World Field Data

When PSI is given a URL, it will look it up in the Chrome User Experience Report (CrUX) dataset. If available, PSI reports the First Contentful Paint (FCP) and the First Input Delay (FID) metric data for the origin and potentially the specific page URL.

Classifying Fast, Average, Slow

PSI also classifies field data into 3 buckets, describing experiences considered fast, average, or slow. PSI sets the following thresholds for fast / average / slow, based on our analysis of the CrUX dataset:

       Fast           Average            Slow
FCP    [0, 1000ms]    (1000ms, 2500ms]   over 2500ms
FID    [0, 50ms]      (50ms, 250ms]      over 250ms

Generally speaking, fast pages are roughly in the top ~10%, average pages are in the next 40%, and slow pages are in the bottom 50%. The numbers have been rounded for readability. These thresholds apply to both mobile and desktop and have been set based on human perceptual abilities.
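Expressed in code, the bucketing is a pair of threshold checks (values taken from the table above):

```python
def classify_fcp(ms: float) -> str:
    """Bucket a First Contentful Paint value (milliseconds)."""
    return "Fast" if ms <= 1000 else "Average" if ms <= 2500 else "Slow"

def classify_fid(ms: float) -> str:
    """Bucket a First Input Delay value (milliseconds)."""
    return "Fast" if ms <= 50 else "Average" if ms <= 250 else "Slow"

print(classify_fcp(900), classify_fid(300))  # Fast Slow
```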

Distribution and selected value of FCP and FID

PSI presents a distribution of these metrics so that developers can understand the range of FCP and FID values for that page or origin. This distribution is also split into three categories: Fast, Average, and Slow, denoted with green, orange, and red bars. For example, seeing 14% within FCP's orange bar indicates that 14% of all observed FCP values fall between 1,000ms and 2,500ms. This data represents an aggregate view of all page loads over the previous 30 days.

Above the distribution bars, PSI reports the 90th percentile First Contentful Paint and the 95th percentile First Input Delay, presented in seconds and milliseconds respectively. These percentiles are chosen so that developers can understand the most frustrating user experiences on their site. These field metric values are classified as fast/average/slow by applying the same thresholds shown above.

Field data summary label

An overall label is calculated from the field metric values:

  • Fast: If both FCP and FID are Fast.
  • Slow: If either FCP or FID is Slow.
  • Average: All other cases.
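The three rules reduce to a small function; a sketch, assuming the per-metric labels have already been computed:

```python
def overall_label(fcp: str, fid: str) -> str:
    """Combine the FCP and FID categories into the summary label."""
    if fcp == "Fast" and fid == "Fast":
        return "Fast"
    if fcp == "Slow" or fid == "Slow":
        return "Slow"
    return "Average"

print(overall_label("Fast", "Average"))  # Average
```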
Differences between Field Data in PSI and CrUX

The difference between the field data in PSI and the Chrome User Experience Report on BigQuery is that PSI's data is updated daily for the trailing 30-day period, while the dataset on BigQuery is only updated monthly.

Lab Data

PSI uses Lighthouse to analyze the given URL, generating a performance score that estimates the page's performance on several metrics, including First Contentful Paint, Speed Index, and Time to Interactive.

Why do the field data and lab data contradict each other? The Field data says the URL is slow, but the Lab data says the URL is fast!

The field data is a historical report about how a particular URL has performed, and represents anonymized performance data from users in the real-world on a variety of devices and network conditions. The lab data is based on a simulated load of a page on a single device and fixed set of network conditions. As a result, the values ​​may differ.

Why is the 90th percentile chosen for FCP and the 95th percentile for FID?

Our goal is to make sure that pages work well for the majority of users. By focusing on the 90th and 95th percentile values for our metrics, we ensure that pages meet a minimum standard of performance under the most difficult device and network conditions.

Why does the FCP in v4 and v5 have different values?

v5 FCP reports the 90th percentile, while v4 FCP reports the median (50th percentile).

What is a good score for the lab data?

Any green score (90+) is considered good.

Why does the performance score change from run to run? I didn't change anything on my page!

Variability in performance measurement is introduced via a number of channels with different levels of impact. Several common sources of metric variability are local network availability, client hardware availability, and client resource contention.

More questions?

If you've got a question about using PageSpeed Insights that is specific and answerable, ask your question in English on Stack Overflow.

If you have general feedback or questions about PageSpeed Insights, or you want to start a general discussion, start a thread in the mailing list.


Hello, dear blog readers! Today's post is dedicated to the wonderful Google Page Speed tool, which lets you test any page and explore ways to speed up its loading.

It is no secret that search engines, especially Google, have lately been paying serious attention to this aspect, so fast websites get a certain advantage in ranking and higher positions in search results.

Google Page Speed Insights not only detects and points out all the reasons why a page does not load fast enough, but also offers specific ways to fix them, and some of the problems found can be easily eliminated by the service itself automatically.

What optimization tools does Google offer to speed up websites?

In principle, for each of my projects I devote a lot of time to page optimization activities that contribute to fast loading. The only one I never got around to was this blog, which is essentially a visual aid for novice webmasters who are not devoid of healthy ambition and are determined to reach their goal.

But since I am a perfectionist by nature, I could not accept that an informational web resource carrying, so to speak, knowledge to the masses would lag behind in any aspect. Therefore, it was decided to make every effort to optimize it, following, among other things, the recommendations of Page Speed.

Therefore, I immediately turn to the description of this tool so that you can evaluate its functionality and put it into practice for the benefit of your web resources.

Previously, Page Speed Insights could be used as browser extensions, and download links were present on a special Google page. Moreover, in order to use it in Mozilla Firefox, it was first necessary to install Firebug, where Page Speed was present as its add-on:


Now it is no longer possible to use Google's own browser extensions, although similar plugins, in a slightly different interpretation, are available in the official Chrome and Firefox extension galleries. In addition, there is a fully functional online service from Google with exactly the same functionality and no fewer features. If you go to the Developers section, you will see a link to the Page Speed tool there:

By the way, the same PageSpeed Tools page gathers all of Google's suggestions for speeding up sites. It seems that the developers of the "empire of good" have taken the optimization and acceleration of the entire Internet space seriously, since in Developers you will find a link to the open-source optimization library (Integrate the PageSpeed Optimization Library).

The open-source nature of this project, which works on the basis of the mod_pagespeed module installed on Apache servers (which, by the way, make up the vast majority), means that anyone can improve or update it.

That is, in this way it is quite realistic to create an acceleration tool that is close to perfect at the expense of the Internet community. How all this will look in practice, time will tell. Naturally, in order to make a full contribution and achieve success in this good cause, you need to be a fairly skilled programmer.

If you look at the screenshot above (the information is circled in green), you will see a link for installing the latest version of the ready-made module. This may well interest server owners, who can install it and provide automatic acceleration for the sites hosted there.

Moreover, there are two modifications of the Page Speed module: one directly for Apache servers, and one for the Apache + Nginx bundle, where the latter plays the role of a proxy server:


By the way, the option of sharing Apache and Nginx is the most in-demand in modern conditions and is used by most advanced hosters (including, for example, Sprinthost), since it makes effective use of resources. So, if you are interested in this method of automatically accelerating sites, you can either take a hand in it yourself or lean on your hosting provider.
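If you do have access to the server configuration yourself, enabling the module is, in the simplest case, a couple of directives (a minimal sketch; real deployments tune many more filters):

```apache
<IfModule pagespeed_module>
    ModPagespeed on
    # CoreFilters is the default, conservative set of rewriters
    # (minification, image recompression, cache extension, etc.).
    ModPagespeedRewriteLevel CoreFilters
</IfModule>
```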

Google's Page Speed recommendations (ways to increase site page loading speed)

Above, we considered the possibility, so to speak, of a global increase in the speed of loading websites, which requires specific knowledge (it is likely that this information will also be useful to someone, at least in the future).

But for most readers, that is, ordinary webmasters, an option that is understandable and accessible "here and now" is more relevant. The PageSpeed Insights service fully meets these conditions, and we will talk about it in more detail. To analyze a specific page of a site, enter its URL on this page:



After the analysis, you will see Google's score, both for mobile devices and for PC, which is especially valuable. In this example, the home page of this blog got 76 points out of 100 for mobile phones: an average result that can be significantly improved by following the recommendations below.

Moreover, for each resource that slows down page loading, links are given to the relevant sections (to get them, click the "How to fix?" link), where you will find descriptions of the actions needed to eliminate the shortcomings.

But an important remark must be made here. You need to analyze pages of different types to reduce the loading time of the entire site. For example, for a standard WordPress blog, the level of optimization in this respect can be completely different for the home page, category pages, static pages, and posts.

Above I gave the example of the home page, which displays post announcements, but for one of the articles Page Speed gave a much lower rating:


This happened because posts are usually much more voluminous and include many different kinds of resources and content (images, videos, scripts). Therefore, it is better to start checking optimization with them. By the way, from this point of view, start from the score that Page Speed gives for mobile phones, since achieving a decent loading speed on such devices takes considerably more effort.

But even different post pages can differ in quality. This happens, among other things, because some of them run additional scripts that slow down loading. For example, the optimization level of one of these web pages caused me not just surprise but real shock:


It is quite obvious that 62 points out of a possible 100 is not a result to strive for. And all because on this and a number of other post pages I have the SyntaxHighlighter plugin script, which produces a beautiful highlighted presentation of code.

There may be several such extensions on the site. In each case, you need to decide for yourself how important the plugin is for your resource in terms of the functionality that it provides. And depending on this, you must either remove it or take steps to optimize it, which will help speed up page loading.

The above especially applies to WordPress, the most popular CMS in the world, used by the vast majority of webmasters. This multifunctional engine is good in every way, but it needs constant tuning to reduce server load and page loading times. Owners of web resources running on WP should take note of this. And the presence of a service like Page Speed is most welcome here.

Be sure to read all these articles, at least by skimming them via the links above; you may find a lot of useful things for yourself. In the future, I plan to return to the problem of accelerating web resources more than once, so you can subscribe so as not to miss fresh publications on this topic.

Finally, watch this helpful video to learn why Page Speed's tips are only recommendations and why you shouldn't follow them fanatically.