Need advice on robots.txt - it is hiding my website

Hey

So I added this to my robots.txt settings thinking I was doing the right thing, but it just complicated things:

User-agent: *
Disallow: /

It turns out this is hiding my entire site from search engines. How do I undo the effect?
I was thinking of turning off sitemap generation and adding a custom
sitemap manually, following this guide: Sitemap | Webflow University

But I want to be cautious and not overcomplicate things. Any advice on
the best approach to this situation?

Remove the /.
The "most" correct, most widely supported allow-all for robots.txt is:

User-agent: *
Disallow: 

However, you should just be able to delete any robots.txt settings in your Webflow site config, republish, and get the same result.
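If you want to check the effect of a robots.txt before publishing, Python's standard-library parser can evaluate it locally. This is just an illustrative sketch (example.com is a placeholder), showing that an empty Disallow permits everything while Disallow: / blocks everything:

```python
from urllib.robotparser import RobotFileParser

# Allow-all: an empty Disallow value permits every path for every bot.
allow_all = RobotFileParser()
allow_all.parse("User-agent: *\nDisallow:".splitlines())
print(allow_all.can_fetch("*", "https://example.com/any/page"))  # True

# Block-all: "Disallow: /" hides the whole site from compliant crawlers.
block_all = RobotFileParser()
block_all.parse("User-agent: *\nDisallow: /".splitlines())
print(block_all.can_fetch("*", "https://example.com/any/page"))  # False
```

The same check works against a live site if you use `set_url()` plus `read()` instead of `parse()`.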

If you only care about Googlebot and other major crawlers that support the Allow directive, you can use Allow, as in:

User-agent: *
Allow: /

Thanks for the code. I will add it.

Just to clarify: this one allows all robots universally, making the site visible to all search engines?


User-agent: *
Disallow: 

while this one specifically targets Googlebot, making the site visible to Google but not to other search engines?

User-agent: *
Allow: /

After adding the robots.txt with the empty Disallow, I should say yes to auto-generating the sitemap, correct? Does the robots.txt use the sitemap to do its job?

Correct.

Correct.

I would autogen the sitemap, but sitemap.xml and robots.txt are separate things. The relationship actually runs the other way: a robots.txt can point crawlers to the sitemap via a Sitemap: directive, so autogen is best. I'm not sure what happens if robots.txt and sitemap.xml conflict; it depends on the robot.
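For reference, a robots.txt that allows everything and advertises the sitemap would look like this (example.com is a placeholder for your own domain):

```txt
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

The Sitemap line is optional and independent of the allow/disallow rules; crawlers that don't recognize it simply ignore it.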