You might be surprised to learn that most of the page speed recommendations you have implemented, or found in guides on many reputable websites, are almost pointless.
General recommendations such as:
- Browser caching
- Optimising images
- Gzip compression
- Minify CSS
- Minify HTML
will generally yield almost no noticeable difference in page load speeds when benchmarking with tools such as WebPageTest and GTMetrix. The one page speed recommendation that can make a significant difference is reducing external requests.
Please don’t confuse the point above with the scores from Google’s PageSpeed Insights or Lighthouse. This guide demonstrates how external on-page requests can impact the full load times shown in various testing tools.
Most webmasters underestimate the impact external requests have on a website. Large enterprise websites suffer the most, which is why large websites such as Macy’s, John Lewis and Best Buy almost always show fairly poor page load speeds when tested with popular tools such as WebPageTest or GTMetrix.
To demonstrate how external on-page requests can impact a website’s page load speed, we prepared a side-by-side comparison. On the left is the John Lewis homepage as optimised by us, with all unnecessary external requests removed; on the right is the original, untouched version of the John Lewis homepage.
We were able to cut the page load time by 9.5 seconds by removing 170 external requests that are not required for the page to function correctly.
Note: Nothing else on the website has changed, everything has remained identical.
So, how do you remove external requests?
Most page speed tools have an advanced feature called a ‘blacklist’. This lets you enter a list of domains you would like to blacklist, in other words, preventing them from loading and simulating the removal of those requests.
Within GTMetrix, when you run a page speed report, you will find ‘Reduce DNS lookups’ under the YSlow tab.
Once you open this tab, you will find a long list of domains.
The next step requires a bit of technical knowledge: we need to identify which domains are required to render the page and leave those out of the blacklist. The first domain, the website’s own, should obviously not be included. For John Lewis specifically, we also did not blacklist certain external domains such as johnlewis.scene7.com, the domain John Lewis uses to load static assets such as images. If we blocked it, images on the page would not load correctly.
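This pruning step can be sketched in code. Below is a minimal sketch, assuming you have copied the ‘Reduce DNS lookups’ domains into a plain list; the domain names and the allowlist here are illustrative, not taken from a real report:

```python
# Build a blacklist from the domains a page requests, skipping any domain
# the page needs in order to render (an allowlist you compile by hand).

def build_blacklist(domains, allowlist):
    """Return blacklist entries in the protocol + domain + /* format."""
    entries = []
    for domain in domains:
        # Keep the site's own domain and any allowlisted asset hosts.
        if domain in allowlist:
            continue
        entries.append(f"https://{domain}/*")
    return entries

# Illustrative domains only; a real report will list far more.
domains = [
    "www.johnlewis.com",        # the page itself - must load
    "johnlewis.scene7.com",     # static assets (images) - must load
    "ads.example-tracker.com",  # third-party tracker - safe to block
    "cdn.example-widget.net",   # chat widget - safe to block
]
allowlist = {"www.johnlewis.com", "johnlewis.scene7.com"}

for entry in build_blacklist(domains, allowlist):
    print(entry)
```

Running this prints one ready-to-paste entry per blocked domain, leaving the two allowlisted hosts untouched.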
Once you have gone through the list, you will need to add a protocol at the beginning of each domain and a trailing slash with an asterisk (/*). The final list should look similar to this:
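For example (these entries are illustrative; your own report will list different domains):

```
https://ads.example-tracker.com/*
https://cdn.example-widget.net/*
https://fonts.example-cdn.org/*
```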
Using GTMetrix, you can paste the list above directly into the URL Blacklist field.
It’s important to note that some domains are indeed compulsory, for example, Google tracking and other important tracking requests. It’s your responsibility to review all external requests and blacklist only those that are not compulsory, so the test simulates the page accurately.
Never underestimate the impact external requests can have on a website. That said, this trick alone won’t significantly improve your Google PageSpeed Insights score; you will need to combine it with other page speed optimisations to achieve a high, green score.