PageSpeed Insights says robots.txt is not valid

Here is the content I saved and published in the SEO robots.txt input field:

User-agent: *
Allow: /

Here’s the robots.txt that was created by Webflow:

User-agent: *
Allow: /

Sitemap: https://%%PUBLISH_URL_REPLACEMENT%%/sitemap.xml

There’s an error: the sitemap link contains a template variable instead of the actual domain.
Any hints on how to fix this?
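While waiting on a fix, you can at least detect the bug automatically. A minimal Python sketch (the helper name is mine, not a Webflow API) that checks whether Webflow's template variable leaked into the published robots.txt:

```python
def has_unreplaced_placeholder(robots_txt: str) -> bool:
    """Return True if Webflow's template variable leaked into robots.txt."""
    # "%%PUBLISH_URL_REPLACEMENT%%" is the literal placeholder seen in this thread
    return "%%PUBLISH_URL_REPLACEMENT%%" in robots_txt

# Example input reproducing the broken file from this thread
broken = """User-agent: *
Allow: /

Sitemap: https://%%PUBLISH_URL_REPLACEMENT%%/sitemap.xml
"""

print(has_unreplaced_placeholder(broken))  # → True
```

In practice you would fetch `https://yourdomain/robots.txt` and pass the response body to this check, e.g. from a monitoring script, so you notice if a republish reintroduces the placeholder.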


This is happening across all the sites I manage as well. A huge issue.

Having this issue as well.

I am also having the same issue.

This looks like an internal Webflow issue. Open a ticket with support.

Same problem here. Did this ever get fixed, or did you find any sort of workaround?

I am experiencing the same issue with my website (educart.co) and am unsure how to resolve it.

Looks good here, except for the duplicate Sitemap entry.

User-agent: *
Disallow:

Sitemap: https://www.educart.co/sitemap.xml

Sitemap: https://www.educart.co/sitemap.xml

https://www.educart.co/robots.txt
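The duplicate Sitemap entry above is harmless to most crawlers but easy to flag programmatically. A small sketch (the function name is mine) that lists any Sitemap URLs repeated in a robots.txt body:

```python
from collections import Counter

def duplicate_sitemaps(robots_txt: str) -> list[str]:
    """Return Sitemap URLs that appear more than once in robots.txt."""
    sitemaps = [
        line.split(":", 1)[1].strip()          # keep everything after "Sitemap:"
        for line in robots_txt.splitlines()
        if line.lower().startswith("sitemap:")  # directive is case-insensitive
    ]
    return [url for url, count in Counter(sitemaps).items() if count > 1]

# Example reproducing the file quoted above
sample = """User-agent: *
Disallow:

Sitemap: https://www.educart.co/sitemap.xml

Sitemap: https://www.educart.co/sitemap.xml
"""

print(duplicate_sitemaps(sample))  # → ['https://www.educart.co/sitemap.xml']
```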