Robots.txt | HELP PLEASE

Hey, anybody able to help me with this? I’m trying to set up robots.txt, but I’m not sure if I’ve done it correctly… I don’t want the following pages to be ignored:

User-agent: *
Disallow: /cookies-consent/
Disallow: /test-pagina/
Disallow: /pop-ups/pop-up-page/
Disallow: /socials-feed/socials-feed-page/
Disallow: /reviews/review-page/
Disallow: /ticket-filters/ticket-filter-page/
Disallow: /401/
Disallow: /404/
Disallow: /search/

Thanks in advance!

If you don’t want your pages to be ignored — in other words, if you want those pages to be crawled — you should remove all the Disallow: lines for those paths and change your robots.txt to:

User-agent: *
Allow: /

or

User-agent: *
Disallow:

Please be aware that either of these will allow your entire site to be crawled by robots.
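As a quick sanity check that both variants really do allow everything (a sketch using Python’s standard-library `urllib.robotparser`; the URL is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# An empty Disallow value means "allow everything" -- same effect as Allow: /
rules = """\
User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Any path on the site is crawlable under these rules
print(rp.can_fetch("*", "https://example.com/pop-ups/pop-up-page/"))  # True
```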

My bad — I want these pages not to be crawled by robots. My question is whether my robots.txt is set up correctly, especially for the CMS pages.

User-agent: *
Disallow: /cookies-consent/
Disallow: /test-pagina/
Disallow: /pop-ups/pop-up-page/
Disallow: /socials-feed/socials-feed-page/
Disallow: /reviews/review-page/
Disallow: /ticket-filters/ticket-filter-page/
Disallow: /401/
Disallow: /404/
Disallow: /search/
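One way to self-check whether these rules do what you intend (a sketch with Python’s standard-library `urllib.robotparser`; the hostname is the staging domain mentioned below):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /cookies-consent/
Disallow: /test-pagina/
Disallow: /pop-ups/pop-up-page/
Disallow: /socials-feed/socials-feed-page/
Disallow: /reviews/review-page/
Disallow: /ticket-filters/ticket-filter-page/
Disallow: /401/
Disallow: /404/
Disallow: /search/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

base = "https://dockfec.webflow.io"
# The listed pages are blocked...
print(rp.can_fetch("*", base + "/search/"))               # False
print(rp.can_fetch("*", base + "/pop-ups/pop-up-page/"))  # False
# ...while the rest of the site stays crawlable
print(rp.can_fetch("*", base + "/"))                      # True
```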

For example, to disallow all the CMS pages in the pop-ups collection (https://dockfec.webflow.io/pop-ups/*/), which one would I use?

  • Disallow: /pop-ups/pop-up-page/
  • Disallow: /pop-ups/*/
    or something else?
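If the goal is to block every CMS item in the collection, a single folder prefix should cover it: a Disallow path matches every URL that starts with it, so `Disallow: /pop-ups/` blocks the whole folder and neither the single-page path nor a wildcard is needed. A sketch (note that Python’s `urllib.robotparser` does plain prefix matching and does not expand `*`, so the wildcard variant is left out here; assume every CMS item lives under /pop-ups/):

```python
from urllib.robotparser import RobotFileParser

# Assumption: every CMS item in the collection lives under /pop-ups/
rules = """\
User-agent: *
Disallow: /pop-ups/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

base = "https://dockfec.webflow.io"
print(rp.can_fetch("*", base + "/pop-ups/pop-up-page/"))   # False
print(rp.can_fetch("*", base + "/pop-ups/any-cms-slug/"))  # False
print(rp.can_fetch("*", base + "/reviews/"))               # True
```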

Indexing (Sitemap.xml & Robots.txt) — I’ve made a new forum topic with all the info, because I had two open tickets which I think concern the same problem. Please help me out with this one…

Thanks in advance.