How BuzzFeed optimized their Frontend
Hey Everyone!
Today we’ll be talking about
How BuzzFeed Optimized their Frontend
Core Web Vitals (LCP, FID, CLS) and why they're crucial to ranking high on Google Search
Improving Observability for Core Web Vitals with Synthetic Monitoring and Real User Monitoring
The specific changes BuzzFeed made to improve their CLS scores
Tech Snippets
A fantastic video on how the PageRank algorithm works
How Firefox protects against browser fingerprinting
What Programmers Should Know about SSDs
We also have a solution to our last Bloomberg interview question and a new question from Google.
How BuzzFeed optimized their Frontend
In June of 2021, Google made a big change to their search algorithm with the Page Experience Update.
With this update, Google would start using their Core Web Vital metrics as a factor in their page rankings. Sites with poor Core Web Vitals would rank lower in Google search results.
Core Web Vitals are a set of standardized metrics that measure the quality of the user experience a website delivers. They currently focus on 3 metrics:
Largest Contentful Paint (LCP) - how many seconds does a website take to show the user the largest content (text or image block) on the screen? A good LCP score would be under 2.5 seconds.
First Input Delay (FID) - how much time does it take from when a user first interacts with the website (click a link, tap a button, etc.) to the time when the browser is able to respond to that interaction? A good FID score is under 100 milliseconds.
Cumulative Layout Shift (CLS) - How much does a website unexpectedly shift during its lifespan? A large paywall popping up 10 seconds after the content loads is an example of an unexpected layout shift that causes a negative user experience. You can read about how the CLS score is calculated here. Google defines two fractions for each shift, the impact fraction and the distance fraction, and multiplies them to get that shift's score.
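As a back-of-the-envelope sketch of that last calculation (this is just the arithmetic from Google's definition, not the browser's implementation):

```python
def layout_shift_score(impact_fraction: float, distance_fraction: float) -> float:
    # impact fraction: share of the viewport area affected by the shift
    # distance fraction: greatest move distance / viewport's largest dimension
    return impact_fraction * distance_fraction

# e.g. a shift touching 50% of the viewport, with elements moving by 25%
# of the viewport's largest dimension:
print(layout_shift_score(0.5, 0.25))  # 0.125
```

Note that a single shift like this one already exceeds the 0.1 "good" threshold on its own.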
BuzzFeed is a digital media company that covers pop culture, movies, TV shows, etc., and they get a significant part of their traffic from Google Search (more than 100 million visits per month). Having their articles rank high on the Google search results page is critical to their business.
Edgar Sanchez is a software engineer at BuzzFeed, and he wrote a great 3 part series on how BuzzFeed fixed their Core Web Vitals to meet Google’s standards. More specifically, how BuzzFeed fixed their CLS score (their LCP and FID scores had already met Google’s standards).
Here’s a Summary
When Google announced that they’d be factoring Core Web Vitals into search rankings, engineers at BuzzFeed took note.
They checked their Largest Contentful Paint (LCP), First Input Delay (FID) and Cumulative Layout Shift (CLS) scores.
Their LCP and FID scores were fine. However, their CLS score was very poor.
Only 20% of visits to BuzzFeed were achieving a “good” experience (a CLS score of less than 0.1). In order to pass Google’s Core Web Vitals assessment, 75% of visits need a CLS score of less than 0.1.
The first step in addressing this issue was to improve Observability over CLS, so engineers could figure out the cause of the issue.
To increase Observability, BuzzFeed used two tools.
Synthetic Monitoring - Use Calibre to create a testing environment and run CLS tests several times a day.
Real User Monitoring - Add analytics to the frontend that measure how much layout shift real users are actually experiencing.
BuzzFeed started with Synthetic Monitoring.
They broke their web pages down into independently testable layers to help make tests more consistent.
Content Layer - just the page content. So, the article, any quizzes, interactive embeds etc.
Feature Layer - Include everything above (page content) but also include complementary units like a comment section, polls, trending feed, etc.
Full Render Layer - Include everything above (features + content) but also include ads.
They loaded a couple hundred pages into Calibre and ran tests to figure out what was causing the CLS issues.
With Synthetic Monitoring, they were quickly able to zero in on some of the causes of the issues.
They took a data-driven approach to prioritizing optimizations, and looked at which page types/units achieved the highest volume of page views. Engineers optimized CLS on those pages first.
However, even after solving the biggest issues that were apparent from their tests in Calibre, BuzzFeed was still unable to bring their CLS scores within Google’s threshold.
Therefore, they turned to Real User Monitoring (RUM).
With RUM, BuzzFeed would lean on their massive audience (more than 100 million visits per month), their analytics pipeline and the Layout Instability API.
The Layout Instability API provides 2 interfaces for measuring and reporting layout shifts so you can send that data to your backend server.
BuzzFeed has an in-house analytics pipeline that they use for keeping track of various types of real user monitoring data, so they hooked the pipeline up with the Layout Instability API.
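As an illustrative sketch of what an aggregation step in such a pipeline might do (the session-window rules below follow Google's published CLS definition; the data shape and code are assumptions, not BuzzFeed's actual pipeline):

```python
def cumulative_layout_shift(shifts):
    """Aggregate (timestamp_ms, value) layout-shift entries into a CLS score.

    Per Google's definition, CLS is the largest "session window" sum:
    a window groups shifts less than 1s apart, capped at 5s total.
    Shifts caused by recent user input are assumed to be filtered out already.
    """
    best = 0.0
    window_sum = 0.0
    window_start = prev_ts = None
    for ts, value in sorted(shifts):
        new_window = (
            window_start is None
            or ts - prev_ts >= 1000      # a gap of 1s or more ends the window
            or ts - window_start > 5000  # windows are capped at 5s
        )
        if new_window:
            window_start = ts
            window_sum = 0.0
        window_sum += value
        prev_ts = ts
        best = max(best, window_sum)
    return best

# Two quick shifts early on form one window; a later shift starts a new one.
print(cumulative_layout_shift([(0, 0.05), (500, 0.05), (3000, 0.1)]))  # 0.1
```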
The data travels from the frontend through various filters before being stored in BigQuery (Google’s data warehouse). From there, engineers can run analyses or export the data to analysis tools like Looker and Data Studio.
Engineers looked at page volume and CLS scores to figure out which areas to target and where they should spend their time optimizing.
Optimizations
Here are some of the common issues BuzzFeed solved that resulted in improvements to their CLS scores
Correct Image Sizing - All images should have width/height attributes.
Static Placeholder for Ads - BuzzFeed has ads that will change dimensions depending on which ad is being served. They looked at the most common ad sizes and created static placeholders for them so the page wouldn’t change suddenly once an ad was loaded.
Static Placeholders for Embedded Content - BuzzFeed frequently embeds content from other websites (tweets, TikTok videos, YouTube videos, etc.). This was the hardest issue to solve: many embeds have no fixed dimensions, so sizing a static placeholder accurately is difficult. An embedded tweet, for example, can vary dramatically in height depending on its text and whether it contains an image or video.
BuzzFeed engineers solved this by gathering embed dimensions from all their pages and collecting them in their analytics pipeline and eventually in BigQuery.
Now, when a page is requested, the rendering layer will check BigQuery for the dimensions of the embedded content and add correctly-sized placeholders for the content.
As new pages get published, the dimensions of any third party embeds on those pages will be loaded into BigQuery.
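A minimal sketch of that lookup at render time (the dictionary stands in for the BigQuery-backed store; the URL, data shape, and fallback size are all illustrative assumptions):

```python
# Hypothetical in-memory stand-in for the BigQuery-backed dimension store.
EMBED_DIMENSIONS = {
    "https://example.com/tweet/123": (550, 680),
}
FALLBACK = (550, 400)  # assumed default when an embed hasn't been measured yet

def embed_placeholder(embed_url: str) -> str:
    width, height = EMBED_DIMENSIONS.get(embed_url, FALLBACK)
    # Reserving the space up front means the page doesn't shift
    # when the third-party embed finishes loading.
    return (f'<div class="embed-placeholder" '
            f'style="width:{width}px;height:{height}px"></div>')

print(embed_placeholder("https://example.com/tweet/123"))
```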
Result
With these changes, BuzzFeed was able to achieve ~80% of all page views having a good CLS score. This is a massive improvement from their starting point of ~20% of page views.
For more details, you can read the full series here.
Tech Snippets
How PageRank Works - PageRank is the algorithm created by Larry Page that formed the foundation for Google Search. This is a fantastic video that goes into exactly how PageRank works. It's from the YouTube channel called Reducible, which is like a 3Blue1Brown for CS and algorithms. I'm a really big fan of his content and highly recommend checking it out.
Firefox's Protection against Fingerprinting - Many websites are moving away from cookies and towards different techniques to track their users. Browser fingerprinting is where a website will use the various web APIs to generate unique information about each of their users and then use this information to identify them.
Canvas fingerprinting is a popular way of doing this, where a website will use the HTML5 canvas element to identify a user's device.
Mozilla wrote a really interesting post on the steps they take to protect user privacy and prevent websites from fingerprinting Firefox users.
What Programmers should know about SSDs - Emmanuel Goossaert is a Senior Engineering Manager at Booking.com and he wrote a great series of blog posts on how SSDs work. It is a 6 part series, with the final part (linked above) being a summary of the entire series. You can view the entire series here.
Interview Question
Given a positive integer n, write a function that computes the number of trailing zeros in n!
Here’s the question in LeetCode.
Previous Question
As a refresher, here’s the previous question
You are given an array of intervals where intervals[i] = [start_i, end_i].
Merge all overlapping intervals, and return an array of the non-overlapping intervals that cover all the intervals in the input.
Here’s the question in LeetCode.
Solution
We can solve this question by first sorting the input list of intervals.
The way we’ll sort the list is by the first integer in each interval (the start bound). This will allow us to quickly check whether any two adjacent intervals overlap.
After we sort the input list, we’ll have to iterate through it and check if any intervals overlap.
If any two intervals do overlap, then we’ll merge the two intervals into one interval.
We can check if two intervals overlap by checking if the start bound of the later interval is smaller than (or equal to) the end bound of the earlier interval.
If so, we merge the two intervals.
We can do that by replacing the first interval’s ending bound with the max of (the first interval’s ending bound, the second interval’s ending bound).
Then, we delete the second interval.
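As a sketch, the approach above in Python:

```python
def merge_intervals(intervals):
    # Sort by start bound so any overlapping intervals end up adjacent.
    intervals = sorted(intervals, key=lambda iv: iv[0])
    merged = []
    for start, end in intervals:
        # Overlap: this interval starts before (or when) the last merged one ends.
        if merged and start <= merged[-1][1]:
            # Extend the last interval's end bound to cover both.
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    return merged

print(merge_intervals([[1, 3], [2, 6], [8, 10], [15, 18]]))
# [[1, 6], [8, 10], [15, 18]]
```

Sorting dominates the running time, so this is O(n log n) overall with O(n) extra space for the output.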