
Can't remove entry from robots.txt

Hi, I am experiencing some odd behaviour when editing the robots.txt field on a project.

Under the SEO settings, I added a custom sitemap.xml and then added a link to it in the robots.txt field as per the instructions: 'sitemap: [sitemap-url.xml]'
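For reference, this is the standard form of the directive I used (the URL below is just a placeholder, not my actual sitemap address):

```
Sitemap: https://example.com/sitemap.xml
```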

However, when I check the robots.txt file itself, it now has the sitemap link twice. OK, so maybe it adds the sitemap link automatically. So I deleted my line from the robots.txt field and republished, but nothing changed. Yet when I type something else, say "Allow: /", it updates, and still keeps the second sitemap link.

Am I doing something wrong here, or is there something weird going on? Why can't I remove the custom entry?

Here is the current robots.txt: