Question about SEO, sitemap, and robots.txt

Hi all, a client I am working with wants a few of their pages not to be indexed by search engines, so for those pages I have added `<meta name="robots" content="noindex">` inside the head tag. They are also asking for a robots.txt file, but according to this Webflow forum thread, that shouldn't be necessary: https://discourse.webflow.com/…/how-to-hide-a…/27237. I am wondering: 1) should I still add a robots.txt file, and 2) if I did add one, what would be the correct way of applying

```
User-agent: *
Disallow: /page
```

to multiple pages? Sorry, this question is a bit all over the place; thanks in advance for any help!
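For reference, a robots.txt group can list one `Disallow` line per path under a single `User-agent` record; the paths below are placeholders, not your actual page slugs:

```
# Applies to all crawlers
User-agent: *
# One Disallow line per path you want to block from crawling
Disallow: /page-one
Disallow: /page-two
Disallow: /some-folder/
```

Note that `Disallow` only blocks crawling; a page blocked in robots.txt can still appear in search results if other sites link to it, which is why the `noindex` meta tag is the recommended way to keep a page out of the index (and the page must remain crawlable for Google to see that tag).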

Block Search Indexing with ‘noindex’ | Google Search Central

Robots.txt Introduction & Guide | Google Search Central
