
How to implement an efficient cache policy to improve SEO

Hey guys, just been doing some SEO tests on my website below:

Something Familiar | CreativeStudio

One of the places for improvement is "Serve static assets with an efficient cache policy", which is:

"When a browser requests a resource, the server providing the resource can tell the browser how long it should temporarily store or cache the resource. For any subsequent request for that resource, the browser uses its local copy rather than getting it from the network."
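For context, this caching behavior is driven by the `Cache-Control` response header. A minimal sketch of what that looks like (the header value and the parsing function here are illustrative, not anything Webflow exposes):

```python
# A Cache-Control header like this on static assets (CSS, JS, images)
# tells the browser it may reuse its local copy instead of re-fetching:
CACHE_CONTROL = "public, max-age=31536000, immutable"  # one year, in seconds

def parse_max_age(header: str) -> int:
    """Extract the max-age directive (seconds) from a Cache-Control header."""
    for directive in header.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            return int(directive.split("=", 1)[1])
    return 0  # no max-age directive found

print(parse_max_age(CACHE_CONTROL))  # → 31536000
```

On a host you control, you'd set a header like this on static assets yourself (e.g. in your server config); on managed hosting, the platform decides it for you, which is what this thread is about.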

I’ve seen that you can improve this when you aren’t hosting on Webflow, but we are. And quite honestly it’s pissing me off that I’m paying so much for hosting if there are things I can’t amend to improve SEO. In other words, hosting with Webflow is hindering my SEO.

Anybody know how I can implement something to sort this out?


Can you point to where this is documented as a detriment to SEO by Google or Bing? If one of them is penalizing my sites I would love to know about it. If you are getting recommendations from a service, they are just that. What tests were you running?

It came from Google’s official speed test, where it’s a recommended improvement for my site. However, looking into it further, it seems that I am unable to implement any code that would help me leverage browser caching.

There’s a forum discussion here that also alludes to this. So I’m wondering how I can improve it if I’m hosted with WF but restricted in certain areas.

Lighthouse recommendations are just that: recommendations. Any site using Google Fonts will normally get that warning.

I would be more concerned about serving up 11 MB of data on my page. And just for fairness: even Google fails that test on
