I’m using Google Lighthouse to calculate a performance score. One of the criteria is caching static assets such as images and scripts.
I don’t have control over all of these assets, but for the ones I do control, the cache lifetime has been set to 30 days. However, Lighthouse still flags them as an issue, even though it reports them as having a 30d cache.
What do I need to do to rectify this?
3 Answers
Lighthouse will warn you to serve static assets with an efficient cache policy if your score for that audit is below 90. It also lists all of your static assets in the details summary, whether or not they pass individually.
Since you do not have control over some of your static assets, your score for this audit is evidently below 90, which is why the assets that do pass are still listed in the details summary.
You can verify this by saving your results as a JSON file, opening it in any text editor, and searching for the section containing "uses-long-cache-ttl".
The score underneath will likely be less than 90.
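Roughly, that entry looks something like the following in a recent report (this is only a sketch: exact field names vary between Lighthouse versions, the URL and numbers are placeholders, and newer reports record audit scores on a 0–1 scale, so the 90 threshold appears as 0.9):

```json
"uses-long-cache-ttl": {
  "id": "uses-long-cache-ttl",
  "title": "Uses efficient cache policy on static assets",
  "score": 0.56,
  "details": {
    "items": [
      {
        "url": "https://example.com/static/logo.png",
        "cacheLifetimeMs": 2592000000
      }
    ]
  }
}
```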
You can learn more about this particular audit by visiting this link:
https://developers.google.com/web/tools/lighthouse/audits/cache-policy
I also had a 30-day cache policy, and what fixed this for me was adding the public and no-cache values to the Cache-Control header.
I only figured this out while comparing Firebase Hosting against my old host, which was IIS. The IIS-hosted site was passing even though it had a shorter max-age value. In Chrome’s Network developer tools I saw that the Cache-Control header set in my IIS web.config included public and no-cache, but my firebase.json didn’t set those values. Once I added them, I’m passing again!
Why this passes is a mystery to me, but see if adding those values and testing again works for you.
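For reference, here is a minimal sketch of the relevant part of firebase.json, assuming Firebase Hosting with a 30-day max-age; the source glob is only an example, and the header value simply mirrors the public/no-cache combination described above rather than being a recommendation:

```json
{
  "hosting": {
    "headers": [
      {
        "source": "**/*.@(js|css|jpg|jpeg|gif|png|svg|webp)",
        "headers": [
          {
            "key": "Cache-Control",
            "value": "public, no-cache, max-age=2592000"
          }
        ]
      }
    ]
  }
}
```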
In my case, to fix the "Serve static assets with an efficient cache policy" warning in Lighthouse, I had to increase the max-age value to 97 days:
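As a sketch, 97 days works out to 8,380,800 seconds (97 × 86,400), so the corresponding header entry, written here in the same firebase.json header-entry style as above purely for illustration (any server can send the equivalent Cache-Control value), would be:

```json
{
  "key": "Cache-Control",
  "value": "max-age=8380800"
}
```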
My version of Lighthouse is 5.7.0