
Excluding a page from robots.txt generates an error in Google Lighthouse SEO report


I have a page in my Webflow project that I don’t want crawled by search engines, and that isn’t linked from any other page. So I have added this custom code in the `<head>` tag of the page:
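The snippet itself isn’t quoted in the post, but a typical head-level directive for keeping a page out of search results (a sketch of what was likely used, not necessarily the exact code) is:

```html
<!-- Tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex">
```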

It works fine, but I just noticed that it affects the Google Lighthouse SEO report score, which flags this error:

Do you know how to fix this error?

The website url is if you want to have a look.

These two are unrelated. Read the error message again. Fix your disallow string.

See: The Web Robots Pages (https://www.robotstxt.org/)
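For reference, a valid robots.txt requires a `User-agent` line before each `Disallow` rule; a rule with no user agent is what makes Lighthouse report the file as invalid. Assuming the page path is `/hidden-page` (a made-up example), the robots.txt field in Webflow’s project settings would contain:

```
User-agent: *
Disallow: /hidden-page
```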

You’re right, the issue was coming from the project settings: I had missed the User-agent line in the robots.txt field! Thanks!