Warning: Webflow Spam Protection Feature Can Wipe Your Site from Google Search

Hey everyone,

I want to alert other Webflow users about a serious issue with the “Bots are being blocked” spam protection feature that was added to the Forms settings. Enabling this setting can dramatically harm your site’s visibility in Google Search — to the point where your site may completely disappear from search results.

Here’s what happened in my case:

  • I had an older Webflow site that was ranking on the first page of Google.
  • A few months ago, I turned on the new spam protection option (“Bots are being blocked”) under Form settings.
  • Since then, the site has vanished from Google’s search results — it no longer shows up even for keywords mentioned on the site that previously returned it on the first results page.
  • In the browser console, I noticed multiple 401 errors from this URL:
    https://challenges.cloudflare.com/cdn-cgi/challenge-platform/

These errors come from Cloudflare’s bot challenge system — injected by Webflow when this spam protection is turned on. The issue is that Googlebot cannot get past these challenges, and treats your pages as unreachable or broken, leading to complete deindexing.
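If you want to sanity-check the “Googlebot can’t reach the page” theory yourself, a rough way is to fetch your published URL with Googlebot’s published user-agent string and look at the HTTP status. This is only a sketch, not a real Googlebot fetch (Google’s crawler is verified by reverse DNS, not by user-agent), and the URL below is a placeholder:

```python
import urllib.error
import urllib.request

# User-agent string Google publishes for its desktop crawler.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_status(url: str, user_agent: str = GOOGLEBOT_UA) -> int:
    """Return the HTTP status a client sending this user-agent would see."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses raise; the status code is still on the error.
        return err.code

def is_crawlable(status: int) -> bool:
    """Google can only index a page whose fetch itself succeeds (2xx)."""
    return 200 <= status < 300

# Example (requires network; replace with your own published URL):
#   status = fetch_status("https://your-site.webflow.io/")
#   print(status, is_crawlable(status))
```

If a fetch like this returns 200 for your pages, the 401s you see in the console are coming from the separate challenge requests to challenges.cloudflare.com, not from your page delivery itself.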

Important Notes:

  • As soon as I disabled the “Bots are being blocked” setting, the 401 errors disappeared immediately.
  • I reported this issue to Webflow Support months ago, but there’s still no fix or official warning.
  • I haven’t added any custom code — this behavior is entirely caused by the built-in feature.

Recommendation:

Avoid using the “Bots are being blocked” spam protection feature until Webflow addresses this issue. If you’ve already enabled it and seen a drop in search visibility, I strongly recommend disabling it and submitting a reindex request in Google Search Console.

This is a serious SEO risk, and many users might not even realize it’s happening. Hopefully, Webflow will release a proper fix or at least display a warning in the UI before enabling the feature.

Anyone else experienced this? Would be great to hear if others are seeing the same pattern.

Hey Eli, do you have any actual reference indicating problems?

Turnstile’s 401 errors are a fundamental part of its design, so don’t concern yourself with those. Yes, they’re ugly, because browsers can’t tell whether they are real errors or errors by design.

Regarding Googlebot, Turnstile cannot block it. Turnstile is not a WAF solution; it’s only used as a Turing test to validate form submissions, and Googlebot does not submit forms.
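To make the distinction concrete: Turnstile’s check happens when a form is submitted, not when a page is crawled. Per Cloudflare’s documented siteverify endpoint, the server handling the form POST sends the widget’s token plus a secret key to Cloudflare and only then accepts or rejects the submission. This is an illustrative sketch of that flow, not Webflow’s actual implementation, and the token/secret values are placeholders:

```python
import json
import urllib.parse
import urllib.request

# Cloudflare's documented verification endpoint for Turnstile tokens.
SITEVERIFY_URL = "https://challenges.cloudflare.com/turnstile/v0/siteverify"

def build_siteverify_payload(token: str, secret: str) -> bytes:
    """Encode the form body siteverify expects: the site's secret key plus
    the cf-turnstile-response token the widget added to the form POST."""
    return urllib.parse.urlencode({"secret": secret, "response": token}).encode()

def verify_turnstile_token(token: str, secret: str) -> bool:
    """Ask Cloudflare whether a submitted form's Turnstile token is valid.

    Note this runs only for form submissions; ordinary page loads (including
    a crawler fetching the HTML) never pass through this check."""
    req = urllib.request.Request(
        SITEVERIFY_URL, data=build_siteverify_payload(token, secret)
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return bool(json.load(resp).get("success"))
```

Because the gate sits on the submission path, a crawler that merely requests the page HTML never triggers it.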

I can’t guess what has caused your de-indexing, but Google Search Console should have information for you if the bot is unable to index your site, or if you’ve been flagged for anything.

It’s extremely unlikely this has anything to do with Turnstile. If such a thing were possible, more than half of all Webflow sites would have disappeared from Google.

Thanks for your reply. I understand how Turnstile is meant to work, but I can only share what actually happened in my case.

I made no changes to my site at all in the past few months, except for enabling the “Bots are being blocked” feature about 1-2 months ago. Shortly after, my site, which was previously ranking on page 1 of Google, completely disappeared from the first few pages of search results.

As soon as I disabled the feature, the 401 errors disappeared immediately. I’ve now requested reindexing and will monitor if the site recovers.

I’m not saying this will happen to everyone, but in my case, there’s a clear timeline correlation between enabling this feature and losing organic visibility.

Just sharing my experience so others can be aware and keep an eye on it. Hope it helps!

I understand the correlation you’re seeing, but it’s not logically possible, so be careful about jumping to incorrect conclusions. If there were any problems here, the Internet and this forum (for Webflow-specific sites) would have imploded long ago.

Correct, and again this is part of Turnstile’s operation. Turn it off, and you won’t see that Turing-test activity in the console log. Those 401s have nothing to do with page delivery; they’re just part of the bot detection that blocks form-submission spam.

What did you find in GSC?

Following your theory, if there was some kind of Googlebot blocking happening on the page requests, you’d see those errors directly in your GSC reports.

Hey @Eli11

Thanks for flagging this. The “Bots are being blocked” feature protects Webflow Forms from spam submissions. It only applies to form submissions, and shouldn’t affect Googlebot crawling. The 401 errors are part of Cloudflare’s form verification process. Our team is already addressing your ticket and will provide a detailed response there. If anyone else is seeing unexpected behavior, please contact support with details.


Thanks @Jessie_Oh for your reply. You’re probably right that Google crawlers can still access the page, but I believe the repeated 401 errors may still impact how Google treats the site — in my case, visibility dropped right after enabling the feature, without any other site changes. I’ve since disabled it and reindexed, and will share the results.

Additionally, I noticed another issue with this spam protection feature. When “Bots are being blocked” is enabled, animations in Safari become choppy — but only on pages that include a form. I tested this on a very simple page with one basic animation, and the issue occurred every time the feature was turned on.

This makes me wonder if the bot challenge script injected into the page is causing performance issues beyond just form submissions.

Hopefully, this feedback helps Webflow’s team investigate further.
