Is there a way to hide/disable a page without deleting it entirely?
i.e. I want to publish my website, but a few pages aren’t ready or are more experimental in nature. Is there a way I can disable these pages from being published so they won’t be indexed by search engines or built into my site map?
In Webflow, no, not yet. This topic has been discussed before and, as I recall, the Webflow team said they'd think about it. The code you need to add to pages so they won't be crawled by search engines has to go in the header, which in Webflow would affect all pages. So I don't know of a solution for you right now.
Vote for this feature, because the menu could then automatically react to visible/hidden pages, so there would be no need to manually remove the pages from menus.
Also, I realized that even if the pages are not linked, they still get published, and search engines do find their content and may display it, even lorem ipsum placeholder text (not a preferred option). On the other hand, it is handy to have them as "raw material" for building the site while it is already published.
+1 for this feature request: being able to unpublish or hide a page would be great!
I would like to keep a page that I have to periodically reactivate.
The workaround for now is to use the library and put the blocks of the page you need to unpublish in there.
I was designing a blog for my site last month, and Google started indexing and showing my test posts and templates everywhere. Even worse, the blog is now included in the sitelinks results!
I now have to delete all my work to get it unpublished, because it has already started getting traffic.
At the moment, per-page control of indexing options is not yet available, but it is possible to prevent search engines from indexing certain pages on a site using a custom domain by adding robots.txt rules on the SEO tab of the site settings.
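For example, a robots.txt like the following asks crawlers to skip specific pages (the page paths here are placeholders; use your own page slugs):

```txt
# Ask all crawlers to skip the unfinished pages
# (example paths; replace with your own page slugs)
User-agent: *
Disallow: /experimental-page
Disallow: /draft-blog-post
```

Note that robots.txt only discourages crawling; it does not guarantee that a URL already discovered by Google will be removed from the index.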
Hi @cyberdave. Originally I had the robots.txt with a disallow rule, but Webmasters showed that only one of the two pages in there was actually not coming up in search. I know this is not within your control, but I still need the pages hidden regardless. I changed over to `<meta name="robots" content="noindex, nofollow" />` after reading an article suggesting that is the better way: Google has to be able to read the page first in order to honor the noindex, and blocking with robots.txt can confuse Google's crawler (I can't find the article now). Would it be better to revert back to robots.txt until then? I've just had to keep submitting the page to the temporary-hide section in Webmasters every three weeks, because we can't risk the page being available.
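For reference, the meta tag mentioned above goes in the page's head section; a minimal sketch:

```html
<!-- Placed in the page's <head>: tells compliant crawlers
     not to index this page and not to follow its links -->
<head>
  <meta name="robots" content="noindex, nofollow">
</head>
```

One caveat: for the tag to take effect, the page must not also be blocked in robots.txt, otherwise the crawler never fetches the page and never sees the noindex directive.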
The Webflow version of the site is already set to noindex. "When updates are made, publish changes to the staging version of the site on the webflow.io domain and when ready." Will do this, thanks.