Hey, @khalada, I recommend taking a different route than modifying robots.txt. Simply pop this code into the <head> of each page you’d like to affect, and search engines won’t index/follow them:
Here’s the code: <meta name="robots" content="noindex, nofollow">
From the Designer, open the settings of the page(s) you’d like to keep Google from indexing, and paste the code into the custom code area for the <head> tag (scroll down a fair way in the page settings to find it).
You can use a special HTML tag to tell robots not to index the content of a page, and/or not to scan it for links to follow. Source info on the Robots <META> tag: The Web Robots Pages
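For context, here’s roughly where the tag ends up in a page’s markup once Webflow injects it (a minimal sketch — the title and body content are placeholders, not anything Webflow generates):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- Tells compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
<body>
  <p>Page content…</p>
</body>
</html>
```

Note that this only works for crawlers that honor the convention — it’s a request, not an enforcement mechanism.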
This doesn’t exclude the page from the sitemap.xml that Webflow generates and submits to Google (it includes all pages in the sitemap). Is there a way to manage that? Or should I even be concerned about it? When I’ve hosted off-site, I’ve manually edited the sitemap.xml, but I’m not sure what to do for a Webflow-hosted site.