Continuing our series of tips on improving your Google Core Web Vitals, using news.amomama.com as the example, we decided to talk a little about monitoring these metrics, and about how monitoring stopped us worrying about the performance impact of one feature or another.
Instead of waiting up to 28 days for feedback from field data, we now monitor performance almost in real time and respond immediately to the factors that affect it.
I would like to remind you that AmoMama is a completely ‘organic’ product and one of our main sources of traffic is Google, so we need to constantly “keep our finger on the pulse” to ensure that our performance always remains in the green zone.
This article comes in two episodes. The first covers monitoring tools and their pros and cons; in the second, we will talk about the performance measurement tool that works best for us and how you can set it up yourself. We will also share how you can simulate user interactions and get almost-real field Core Web Vitals data.
What are lab and field data?
Let’s remind ourselves what Google considers lab data and what field data is. Briefly, by lab data Google means the results of a run on one specific device and only for the first viewport; for example, Lighthouse by default emulates a Moto G4 on slow 4G.
But with the release of Google Chrome v103, the company also shipped a new version of Lighthouse that can measure lab data beyond the first screen and even simulate field data in the lab. Under the hood, this simulation is still powered by a fairly alpha version of Lighthouse user flows, but it already helps a lot when we test Core Web Vitals; more on that later.
Let’s return to more complex things, such as field data, which actually plays a very important role in how your site ranks in search results. What is it, and how do you work with it correctly?
Field data is gathered by monitoring all users who visit a page and recording a specific set of performance metrics for each of them.
Because field data is based on visits from real users, it reflects the actual devices, network conditions, and geographic locations of your audience. That is very difficult to reproduce or predict with lab measurements, which are collected in a controlled environment with a predefined device and network parameters.
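Whatever you collect those field measurements with, Google buckets every value into "good", "needs improvement", or "poor" using published thresholds (for example, LCP counts as good at 2.5 s or less and poor above 4 s). Here is a minimal sketch of that classification for your own collected field data; the threshold table below follows Google's published values for LCP, FID, and CLS:

```javascript
// Core Web Vitals thresholds as published by Google:
// [upper bound of "good", lower bound of "poor"]
const THRESHOLDS = {
  LCP: [2500, 4000], // milliseconds
  FID: [100, 300],   // milliseconds
  CLS: [0.1, 0.25],  // unitless layout-shift score
};

// Classify a single field measurement into the zone Google reports.
function rate(metric, value) {
  const [good, poor] = THRESHOLDS[metric];
  if (value <= good) return 'good';
  if (value <= poor) return 'needs-improvement';
  return 'poor';
}

console.log(rate('LCP', 2100)); // → 'good'
console.log(rate('CLS', 0.3));  // → 'poor'
```

In the browser, the open-source web-vitals library is the usual way to capture these raw values from real users before beaconing them to your own backend.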
Managing field data is quite difficult for us: we run more than one website in more than one location, and in some of those locations the vast majority of users have outdated mobile devices, outdated browser versions, and, as a bonus, a poor Internet connection: fast 3G or slow 3G. But we cannot turn off or block our website for such locations, because our main task is to provide high-quality, interesting content to every user, whatever their location and connection. So, to develop and maintain the main website across all these locations, we set out to build clear, high-quality monitoring of field data that would help the site stay fast and smooth for every user consuming our entertainment content.
Google Core Web Vitals measurement and monitoring tools, their advantages and disadvantages
First of all, let’s talk about how we came to the best solution in our case.
Let’s classify the tools Google offers for measuring and tracking field and lab data:
PageSpeed Insights is a powerful tool that lets you immediately view field and lab data for a specific URL and for the whole origin, which is its huge advantage.
At the same time, I would like to highlight some drawbacks:
- You need to check field data constantly (and it arrives with a delay) and, if necessary, keep a manual report, for example in Excel, which is very inconvenient;
- There is no historical data, only the current state of the key metrics.
Field data:
Lab data:
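One way around the "no history" problem is to query the PageSpeed Insights API on a schedule and store the snapshots yourself. A minimal Node sketch, assuming you run it from a cron job and persist the output wherever suits you; the `runPagespeed` endpoint and `strategy` parameter belong to the public PSI v5 API, and the key is optional at low request volumes:

```javascript
// Build the PageSpeed Insights v5 API request URL for a page.
function buildPsiUrl(pageUrl, strategy = 'mobile', apiKey = '') {
  const endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';
  const params = new URLSearchParams({ url: pageUrl, strategy });
  if (apiKey) params.set('key', apiKey);
  return `${endpoint}?${params}`;
}

// Fetch one snapshot: `loadingExperience` carries the field (CrUX) data,
// `lighthouseResult` carries the lab run. Requires Node 18+ for global fetch.
async function fetchSnapshot(pageUrl) {
  const res = await fetch(buildPsiUrl(pageUrl));
  const body = await res.json();
  return {
    fetchedAt: new Date().toISOString(),
    field: body.loadingExperience?.metrics,
    labPerformanceScore: body.lighthouseResult?.categories?.performance?.score,
  };
}

// Example (needs network access):
// fetchSnapshot('https://news.amomama.com/').then(console.log);
```

Storing each snapshot with its timestamp gives you exactly the historical view that the PageSpeed Insights UI lacks.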
Chrome UX Report (CrUX) is a publicly available dataset of real user experience data on websites. It measures the field performance of all core Google metrics.
Unlike lab data, CrUX data is collected from real users. Using this data, we can understand the distribution of real user experience across our websites. The reporting is based on data from BigQuery: the CrUX dataset on BigQuery includes detailed Core Web Vitals data for all key metrics and is available as monthly reports at the origin level. To generate a report for your site, you can use this link.
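Besides BigQuery, the same dataset is exposed through the CrUX API, which is free and returns the latest 28-day rolling aggregate per origin or URL. A minimal sketch, assuming you have an API key from the Google Cloud console (`CRUX_API_KEY` is a placeholder):

```javascript
const CRUX_ENDPOINT = 'https://chromeuxreport.googleapis.com/v1/records:queryRecord';

// Build the POST body for an origin-level CrUX query.
function buildCruxRequest(origin, formFactor = 'PHONE') {
  return {
    origin,
    formFactor,
    metrics: [
      'largest_contentful_paint',
      'first_input_delay',
      'cumulative_layout_shift',
    ],
  };
}

// Query the API; the response contains per-metric histograms and p75 values.
// Requires Node 18+ for global fetch.
async function queryCrux(origin, apiKey) {
  const res = await fetch(`${CRUX_ENDPOINT}?key=${apiKey}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildCruxRequest(origin)),
  });
  return res.json();
}

// Example (needs network access and a real key):
// queryCrux('https://news.amomama.com', process.env.CRUX_API_KEY).then(console.log);
```

Because the API window rolls daily instead of monthly, it reacts to regressions noticeably faster than the BigQuery monthly tables, though still nothing like real time.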
What are the advantages and disadvantages of this tool in our case:
Advantages:
- Provides a clear picture of Core Web Vitals for the month and past periods;
- Ability to create flexible reports;
- Convenient and clear to work with through Data Studio;
- Automation of report exporting.
Disadvantages:
- Reports for the previous month appear only on the second Tuesday of the next month;
- Pulling data directly from BigQuery can get quite expensive;
- Slow to reflect changes in metrics in either direction.
Despite the drawbacks that we found for ourselves, we can say with confidence that this tool should be used for monitoring real field data.
The ability to retrospectively view changes in metrics is cool.
And the CrUX report looks like this:
Search Console helps to identify groups of pages on our website that need attention, based on real (field) data from CrUX. URL performance metrics are grouped by status, metric type, and URL group (groups of similar web pages). This is the main tool used by our business stakeholders, because it gives a fairly clear and understandable picture: it is immediately obvious which zone the website is in (green, yellow, or red) and how that affects SEO.
For example,
Thanks to such a simple tool, the business does not need to wade through a ton of metrics to see whether we meet Google’s requirements or not.
By entering the Search Console, it is completely clear what the situation is :)
Advantages:
- Clarity and transparency;
- Without unnecessary words and text, it is clear what the situation with Core Web Vitals on the site is.
Disadvantages:
- You need to wait up to 28 days for the graph to update, according to Google’s documentation. In practice, though, you can see the graph change in around two weeks (that is just our experience);
- Data arrives with a delay of two days; PageSpeed Insights gives slightly fresher data;
- Unfortunately, Search Console is unsuitable for development work, for finding new areas of performance improvement, or for responding quickly to problems.
Chrome DevTools and the Web Vitals extension — honestly, I would not like to dwell on this topic, because these tools are must-haves for localizing performance bugs during development.
But after analyzing all the tools Google offers, we did not find a solution that would help us respond quickly to problems and monitor field data in near real time, without a long wait. We even tried using the PageSpeed Insights API and exporting metrics to Grafana via the Prometheus Exporter for Google Pagespeed Online Metrics, but the API turned out to be buggy and did not give us the desired result, so we decided to choose another solution for ourselves…
What have we chosen for ourselves and how do we use it?
After detailed research, we finally chose a tool that lets us track lab data, simulate field data, and build the monitoring we need in Grafana — Sitespeed.io.
Sitespeed.io is a set of Open Source tools that makes it easy to monitor and measure the performance of your website.
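The detailed setup will come in the next episode, but to give a taste of why we liked it: sitespeed.io is driven by CLI options that can also live in a JSON config file, and it can push every run's metrics straight into a time-series backend that Grafana reads. A minimal, hedged config sketch — option names follow the sitespeed.io documentation, and `my-graphite-host` is a placeholder for your own Graphite instance:

```json
{
  "browsertime": {
    "iterations": 3,
    "browser": "chrome"
  },
  "graphite": {
    "host": "my-graphite-host"
  }
}
```

Feeding each scheduled run into Graphite like this is what turns one-off measurements into the near-real-time dashboards we were missing.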
We liked the approach of the developers of this wonderful tool:
“Measuring performance shouldn’t be hard: you should be able to have full control of your metrics, own your data and you should be able to do it without paying top dollars. That’s why we created sitespeed.io.”
And as our team is very respectful of open-source, we just couldn’t ignore it!
A step-by-step installation guide will be provided in the next episode of this article.
Keep in touch and see you in the second part…