Can't remove entry from robots.txt

Hi, I'm experiencing some odd behaviour when editing the robots.txt field on a project.

Under the SEO settings, I added a custom sitemap.xml and then added a link to it in the robots.txt field, per the instructions: 'Sitemap: [sitemap-url.xml]'

However, when I check the robots.txt file itself, it now has the sitemap link twice. OK, so maybe Webflow adds the sitemap link automatically. So I deleted my line in the robots.txt field and republished, and nothing changed. But when I type something else, say “Allow: /”, it updates, and still keeps the second sitemap link.
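For anyone following along, the published file ends up looking something like this (placeholder URL, not the actual site's):

```txt
# Published robots.txt after adding the sitemap line manually:
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap.xml
```

One line comes from the robots.txt field, and the other is appended by Webflow on publish.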

Am I doing something wrong here, or is there something weird going on? Why can't I remove the custom entry?

Here is the current robots.txt:
https://www.insightonline.co.nz/robots.txt

Hey, looks like you might have solved this issue. Any insights you can offer me? I’m running into the same issue of having the sitemap display twice.

Webflow automatically adds a sitemap entry to your robots.txt file, whether or not you've set a sitemap manually, so there's no need to add it to the file yourself. If you're stuck with two entries, try resetting the robots.txt file and the sitemap to their defaults, hit save, publish, and then change them again; that might fix the behavior.
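In other words, the robots.txt field should contain only your directives, something like this (a sketch, not the exact defaults):

```txt
# Leave the Sitemap line out of the robots.txt field entirely;
# Webflow appends it automatically when the site is published.
User-agent: *
Allow: /
```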

Did anyone find a solution for this?

I did the same thing – I added my sitemap manually to the SEO > Indexing > robots.txt field, so my robots.txt is showing errors with two entries. The duplicate isn't removed when I delete the line from the field.
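If you want to confirm the duplicate before and after republishing, a quick check like this works (the sample text is hypothetical, standing in for whatever your published file contains):

```python
# Check a robots.txt body for duplicate "Sitemap:" entries.
# Replace robots_txt with the contents fetched from your live site.
robots_txt = """\
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap.xml
"""

# Collect every Sitemap directive, case-insensitively.
sitemap_lines = [
    line.strip() for line in robots_txt.splitlines()
    if line.lower().startswith("sitemap:")
]

# True when the same Sitemap URL appears more than once.
has_duplicates = len(sitemap_lines) != len(set(sitemap_lines))
print(has_duplicates)
```

Once the duplicate is gone, `has_duplicates` should come back `False` against the live file.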

How do you reset the robots.txt file?

@codyyan - Have you tried to follow the docs? → https://help.webflow.com/hc/en-us/articles/33961355371667-Create-a-sitemap-in-Webflow#how-to-auto-generate-a-sitemap