I’m not sure what you mean by “support is unable to help”; if you’ve removed the custom robots.txt and republished, and the change still isn’t going live, that’s a major bug they should be chasing down.
In the meantime, I’d try these things:
Remove it again, save it again, republish it again.
Try changing it to spaces, save and republish, and see if that fixes it.
Even after removing the robots.txt rules and republishing, Webflow is not updating the actual robots.txt file. I agree this is a major bug (at least it’s happening on my end).
I had already tried resaving and republishing many times. It’s not working.
The support team replies about once a day with the same redundant answers.
Totally not happy with the way Webflow and their team are working.
I also tried the following:
User-agent: *
Allow: /
The customer support representative actually asked me to remove that and resave it.
Yes, that would be super difficult.
My guess is some form of caching or CDN issue.
I can see that your bettermode.webflow.io site was published about 1h 30m ago, and that /robots.txt is still showing the old content:
User-agent: *
Disallow: /
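If it helps with debugging, here’s a quick way to see exactly what’s being served and whether the response headers hint at a stale CDN copy. This is just a sketch using Python’s requests library, with the staging URL from above; adjust it for your own domain.

```python
import requests

# Fetch the live robots.txt from the published staging domain mentioned above.
url = "https://bettermode.webflow.io/robots.txt"

resp = requests.get(url)
print("Status:", resp.status_code)

# Caching-related headers can hint at whether a CDN is serving a stale copy.
for header in ("cache-control", "age", "etag", "last-modified", "x-cache"):
    if header in resp.headers:
        print(f"{header}: {resp.headers[header]}")

print("--- body ---")
print(resp.text)

# If this still prints "Disallow: /" after a republish, the old file is
# still being served, which would fit the caching/CDN guess.
```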
Look, this is a terrible hack, but it might just get you out of the deep end for now.
Let’s see if adding a 301 redirect will at least prevent googlebot from accessing that bad data…
Try adding this to your redirect table, and republish everything.
In the test I just did, Webflow accepts it, and the redirect works. I’m hoping that if you then resubmit your site in Google Search Console, Google will decide the robots.txt no longer exists, and that will resolve your issue.
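As a quick sanity check once you’ve republished, you could confirm that /robots.txt now answers with a 301 instead of the old rules. A minimal sketch, again assuming Python’s requests library and the staging URL from above:

```python
import requests

# Check that /robots.txt now returns a 301 redirect instead of the stale rules.
url = "https://bettermode.webflow.io/robots.txt"

# Don't follow the redirect; we want to see the raw status code and target.
resp = requests.get(url, allow_redirects=False)

print("Status:", resp.status_code)                 # expecting 301 if the redirect took effect
print("Location:", resp.headers.get("Location"))   # where the redirect points

if resp.status_code == 301:
    print("Redirect is live; robots.txt no longer serves the old Disallow rules.")
else:
    print("Still serving robots.txt directly; the redirect hasn't taken effect yet.")
```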