Best Practice for Disallow vs. Noindex on the 404 Page

Based on another Webflow post, I started researching disallowing the 404 page in robots.txt vs. adding noindex in the meta data.

After a few hours, I came across sources that said disallowing the 404 page isn’t a guarantee it won’t be indexed, and that disallowing it could possibly cause crawl issues for the entire site.

Adding <meta name="robots" content="noindex"> and <meta name="googlebot" content="noindex"> should accomplish no indexing. Since we can now add head code to each page, would this method be safer/better than disallow? When added in Webflow, that meta data shows up toward the bottom of the 404 page’s head code.
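For reference, here’s a minimal sketch of what the rendered head of the 404 page might look like once those tags are added through Webflow’s per-page custom code (the surrounding tags are illustrative, not Webflow’s exact output):

<head>
  <meta charset="utf-8">
  <title>Page Not Found</title>
  <!-- Webflow-generated styles and scripts render above -->
  <!-- per-page custom head code is injected near the bottom of <head> -->
  <meta name="robots" content="noindex">
  <meta name="googlebot" content="noindex">
</head>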

Noindex and nofollow are fine for 404s; Google’s documentation on blocking search indexing with noindex covers this in detail.

Regarding disallow/robots.txt: if you’re using noindex and nofollow, a Disallow rule isn’t necessary for your 404, and it can even work against you, as the sketch below shows.
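A rule like the one below (assuming the 404 page is served at /404; the path on your site may differ) blocks crawling of that URL entirely, so Googlebot never fetches the page and never sees the noindex tag on it:

# robots.txt — counterproductive for a noindexed 404 page
User-agent: *
Disallow: /404    # blocks crawling, so the page's noindex meta tag is never read

And if Google already knows the URL from links elsewhere, it can still surface it in results despite the Disallow, which matches what the sources above describe.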

As for the placement in the header, Google will see the tag wherever it appears within <head>.

tl;dr: If you want to prevent a single page from appearing in Google results, add noindex and nofollow to that page’s header:

<meta name="robots" content="noindex, nofollow" />

@mbrannon47 Before using Webflow I was using “noindex” in the meta. Initially, Webflow didn’t have per-page meta available when hosting with Webflow. I feel better using “noindex” over “disallow” (and I hate seeing the errors in webmaster tools), so I’ll be switching back to the “noindex” method.
