PageSpeed vs reCAPTCHA and YouTube: Will Google Ever Get on the Same Page?

With the Core Web Vitals Report now in Google Search Console, and an impending algorithm update coming, I decided it was time to revisit Detailed Image to see if there were some relatively easy “wins” when it came to improving our Google PageSpeed.

With about 10 days of tweaks and changes, I was able to get most of our important pages to score 90 or above on mobile. Most of those pages started in the 20–70 range. A simple page like our Privacy Policy now scores up around 100 (scores fluctuate over time, as explained here):

Google PageSpeed - Detailed Image

The specific fixes vary by site. There were some easy wins – lazy loading images, deferring JavaScript, adjusting how fonts load – and other wins that were much harder to come by. The two worst culprits were third-party scripts from none other than Google itself: reCAPTCHA v3 and YouTube.
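For reference, each of those easy wins is a small markup or CSS change. A minimal sketch (file and font names are illustrative):

```html
<!-- Lazy load below-the-fold images (native browser support) -->
<img src="product.jpg" loading="lazy" alt="Product photo">

<!-- Defer non-critical JavaScript until the document has parsed -->
<script src="site.js" defer></script>

<!-- Show fallback text immediately while a web font loads -->
<style>
  @font-face {
    font-family: "BodyFont";
    src: url("bodyfont.woff2") format("woff2");
    font-display: swap;
  }
</style>
```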

What Core Web Vitals Gets Right…and What it Doesn’t

Similar to my gripes a few years back during our HTTPS migration: the end goal of a faster web is a noble one, but Google executes it with little consideration for real-world implications.

Google’s tools, particularly Lighthouse, are fantastic. And how PageSpeed is now calculated – using actual speed metrics instead of a checklist of rules like “minify CSS” and “use a CDN” – is a big step forward. However, many of the suggestions are difficult or borderline unrealistic for large, established sites to accomplish.

The WebP image format that Google created and is pushing is a perfect example. Sure, if you’re building a new site you can create all of your images in both WebP and an established format (JPG, GIF, PNG), use the picture element to provide a fallback for browsers that don’t support WebP, and avoid CSS background images. WebP absolutely is better compression, with comparable images coming out significantly smaller.
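The fallback pattern mentioned above looks like this (file names are illustrative):

```html
<picture>
  <!-- Browsers that support WebP use this source -->
  <source srcset="photo.webp" type="image/webp">
  <!-- Everything else falls back to the JPG -->
  <img src="photo.jpg" alt="Product photo">
</picture>
```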

What about a store like ours with 2,000+ products, each with multiple photos and 10+ sizes for each photo? PHP can convert them, but in my experience every image still needs to be saved again as WebP (real-time conversion slows things down even more than just serving a JPG). And even if we solve that problem, we’re left with CSS background images that we made an integral part of the site long before WebP existed. Solving that problem is messier. Oh, and by the way, Safari on both macOS and iOS doesn’t yet support WebP (although it will soon).

We decided to pass on WebP for now. Thankfully, we were able to get solid improvements elsewhere with easier-to-implement solutions.

Even more frustrating is that the worst PageSpeed offenders are often Google’s own third-party libraries. Your Google rankings depend on a high PageSpeed score, but the primary thing holding you back is something Google itself has sold you on as a necessary tool for your business. One would hope they could get their departments on the same page and ensure that the code they instruct site owners to install doesn’t hurt PageSpeed. Unfortunately, that clearly isn’t the case, and we’re left compromising either user experience or the effectiveness of those libraries to improve PageSpeed.

And, in general, many of the suggestions break old browser support and result in more code, messier code, and code that’s more difficult to maintain.

Enough ranting, let’s get to some solutions.

Improving PageSpeed for reCAPTCHA v3

We only added reCAPTCHA v3 this year. It is a brilliant piece of software that runs in the background and provides you with a score each time the user submits a form. This all happens without the user needing to solve a puzzle or check a box stating that they are human.
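For context, the standard v3 flow looks roughly like this: you load Google’s api.js, then request a token to send along with the form. YOUR_SITE_KEY and the hidden `recaptcha-token` field are placeholders:

```html
<script src="https://www.google.com/recaptcha/api.js?render=YOUR_SITE_KEY"></script>
<script>
  grecaptcha.ready(function () {
    grecaptcha.execute("YOUR_SITE_KEY", { action: "login" }).then(function (token) {
      // Send the token to your server, which asks Google for the 0.0-1.0 score
      document.getElementById("recaptcha-token").value = token;
    });
  });
</script>
```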

If you install the code per Google’s instructions though, you’ll pay a price with PageSpeed. This was by far our worst offender because it is loaded on every single page. Using the defer attribute didn’t improve things enough, so it became clear that we needed to delay loading the script even longer.

The problem, of course, is that if you delay it by a few seconds you might miss a user login and not get a score. The solution that I came up with was to wait to load the script until the user “took action” on the page, such as scrolling or moving their mouse. I wrote the following script, which ensures that the library is only loaded once:

<script type="text/javascript">
	var reCAPTCHALoaded = false;
	function loadReCAPTCHA() {
		if (!reCAPTCHALoaded) {
			var element = document.createElement("script");
			element.src = "https://www.google.com/recaptcha/api.js?render=YOUR_SITE_KEY"; // your v3 site key
			document.body.appendChild(element);
			reCAPTCHALoaded = true;
		}
	}
	window.addEventListener("load", function() {
		// Load once the user "takes action" on the page
		["scroll", "mousemove", "touchstart"].forEach(function(event) {
			window.addEventListener(event, loadReCAPTCHA);
		});
	});
</script>

There has still been a slight drop-off in the percentage of form submissions that return a score. If your browser auto-fills and submits your login information, for instance, it’s possible to submit the form without really interacting with the page. We already weren’t relying on reCAPTCHA as our only signal (some legitimate visitors block external scripts), so this isn’t a huge deal for us. If you absolutely need a score for every submission, this solution might not work for you.

Improving PageSpeed for YouTube Embeds

The absolute easiest win, which takes only a few seconds to implement and has no downside, is to lazy load the YouTube iframe.
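With native lazy loading, that’s a one-attribute change (VIDEO_ID is a placeholder):

```html
<iframe src="https://www.youtube.com/embed/VIDEO_ID"
        loading="lazy" width="560" height="315"
        title="Product video" allowfullscreen></iframe>
```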

Lazy loading alone wasn’t quite good enough for us, though. Pages with multiple videos were still seeing their PageSpeed affected, even when they probably shouldn’t have been. I considered a solution similar to the reCAPTCHA one – use CSS to put a placeholder where the videos are, and only start loading them once the user takes action on the page. That’s perfectly viable, but for us it ended up being more work than the solution I settled on.

We instead show a thumbnail of the video with a CSS generated play button:

YouTube PageSpeed fix

When the user clicks anywhere on the image, the video plays in an overlay (functionality that was already programmed into our cart):
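A minimal sketch of the thumbnail approach, assuming you swap in the real iframe on click rather than using an overlay like ours (VIDEO_ID is a placeholder; img.youtube.com serves each video’s own thumbnails):

```html
<div class="video-facade" data-video-id="VIDEO_ID">
  <img src="https://img.youtube.com/vi/VIDEO_ID/hqdefault.jpg" alt="Video thumbnail">
</div>
<script>
  document.querySelectorAll(".video-facade").forEach(function (facade) {
    facade.addEventListener("click", function () {
      var iframe = document.createElement("iframe");
      // autoplay=1 starts playback immediately, since the user already clicked
      iframe.src = "https://www.youtube.com/embed/" +
        facade.dataset.videoId + "?autoplay=1";
      iframe.allow = "autoplay";
      facade.replaceWith(iframe);
    });
  });
</script>
```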

YouTube PageSpeed fix

And now the page score is much improved:

Google PageSpeed - Detailed Image

Wrapping Up

Overall I’m thrilled with how much our PageSpeed improved. I notice it myself when browsing the site. Every user on a modern browser – which is almost all of our users – will have a better experience. It did, however, come with some compromises and custom programming, and the current “best practices” are messy at times. But it is possible, even for a site like ours that’s been around for over a dozen years, to make meaningful improvements in a reasonably short amount of time.