Critical robots.txt Error

Our robots.txt field is populated properly, but the rendered file includes random characters and duplicates our sitemap. Is anyone else having this issue?

[Screenshot of the rendered robots.txt, 2022-01-27]

It looks like a copy-paste issue is introducing stray characters into your robots.txt.
I suggest clearing the robots.txt field and saving the changes.
Then type the same text again by hand, without copying it from anywhere else, and see if that fixes it.
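If you want to confirm that pasted text is the culprit before retyping everything, a quick check for invisible or non-ASCII characters can help. This is just an illustrative sketch (the `find_stray_chars` helper and the sample line are my own, not part of any platform's tooling); it flags characters like non-breaking spaces that rich-text editors often smuggle in:

```python
# Minimal sketch: detect invisible/non-ASCII characters that often
# sneak in when text is pasted from rich-text editors or web pages.
import unicodedata

def find_stray_chars(text):
    """Return (index, char, unicode_name) for every non-ASCII character."""
    return [
        (i, ch, unicodedata.name(ch, "UNKNOWN"))
        for i, ch in enumerate(text)
        if ord(ch) > 127
    ]

# Example: a robots.txt line pasted with a non-breaking space (U+00A0)
pasted = "User-agent:\u00a0*"
print(find_stray_chars(pasted))   # flags the NO-BREAK SPACE at index 11

# A clean, hand-typed line reports nothing
print(find_stray_chars("User-agent: *"))  # []
```

Paste the suspect robots.txt content into the function; an empty list means the text itself is clean and the problem lies elsewhere.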


Hi Yigit,

Thanks for your help. I tried that, and it got rid of the weird characters. However, I am still getting the double sitemap. I've tried deleting the field entirely and republishing, but that doesn't delete the file.


Good to hear you got rid of the weird characters.

For the double sitemap issue, did you delete the sitemap line in your robots.txt?

Yes, we deleted everything in our robots.txt and we still have the issue.

Assuming you've published the changes you made to robots.txt, this shouldn't happen.
Next, I'd check whether there is another robots.txt file in your root folder that might be causing this.
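One way to see exactly what crawlers see is to fetch the live robots.txt and count the sitemap declarations. A minimal sketch (the `sitemap_entries` helper and the example.com URL are placeholders, not anything site-specific):

```python
# Hedged sketch: parse a robots.txt body and count duplicate Sitemap entries.
from collections import Counter
from urllib.request import urlopen  # only needed for the live fetch below

def sitemap_entries(robots_txt):
    """Return a Counter mapping each declared sitemap URL to its count."""
    return Counter(
        line.split(":", 1)[1].strip()
        for line in robots_txt.splitlines()
        if line.lower().startswith("sitemap:")
    )

# To check a live site, swap in your own domain (placeholder shown):
# body = urlopen("https://example.com/robots.txt").read().decode("utf-8")

body = """User-agent: *
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap.xml
"""
dupes = {url: n for url, n in sitemap_entries(body).items() if n > 1}
print(dupes)  # any URL listed here is declared more than once
```

If the live file shows two identical Sitemap lines but your field only contains one (or none), the duplicate is being injected somewhere outside that field.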

I suspect the double sitemap entry in robots.txt comes from your use of a custom sitemap. It's really not an issue, as the bots will just follow the link. It won't hurt anything, so you can safely ignore it. I like to put poems in mine.


Thanks again, Yigit. I checked with our lead backend engineer, and he confirmed that there is no robots.txt file in our root folder or anywhere else. The plot thickens…
