Our website was built with Webflow and it’s been a great tool.
Until this week, all of our regular pages were indexed by Google and appeared in search listings; getting them indexed was easy.
However, we recently created some blog articles in the Webflow CMS, and when we published them, Google Search Console returned the following error:
Crawl allowed? error | No: blocked by robots.txt
We couldn’t tell whether the indexing failure was specific to something funky in the CMS directory, so to rule that out, we built a new, separate directory.
Then we created a test page and placed it inside that new directory. When we went to Google Search Console and requested indexing for the new URL, the same error appeared:
Crawl allowed? error | No: blocked by robots.txt
So it appears that nothing inside a directory is getting indexed by Google.
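For anyone who wants to double-check the same thing, the live robots.txt can be fetched directly to see exactly which rules Google is reading. Here's a minimal Python sketch, with example.com standing in for our actual domain:

```python
# Print the robots.txt that crawlers actually see.
# "example.com" is a placeholder for the real domain.
from urllib.request import urlopen

with urlopen("https://example.com/robots.txt") as resp:
    print(resp.read().decode("utf-8"))
```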
Yesterday, in the SEO section of the Webflow dashboard, our settings looked like this:
Disable Webflow Subdomain Indexing: yes
User-agent: *
Disallow: /
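As we understand it, that `Disallow: /` rule tells every crawler to skip every path on the site, which would explain the Search Console error. A quick sketch with Python's standard-library robots.txt parser (the URLs are placeholders) shows the effect:

```python
# Check what "User-agent: *" + "Disallow: /" means to a crawler.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /",  # "/" matches every path, so the whole site is blocked
])

# Googlebot is covered by the "*" wildcard, so nothing may be fetched.
print(parser.can_fetch("Googlebot", "https://example.com/blog/some-article"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/"))                   # False
```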
However, today we changed it to this:
Disable Webflow Subdomain Indexing: yes
User-agent: *
Allow: /
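If we're reading the parser right, the same check against the live file should now pass once the new rules are published (again, the domain and article path are placeholders):

```python
# Fetch and parse the published robots.txt, then test a CMS URL.
from urllib import robotparser

parser = robotparser.RobotFileParser("https://example.com/robots.txt")
parser.read()  # downloads and parses the file Google will see

# Should print True once "Allow: /" (or no blanket Disallow) is live.
print(parser.can_fetch("Googlebot", "https://example.com/blog/some-article"))
```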
Is this simply a matter of needing to be patient and waiting for Google to recognize these changes to our robots.txt file?
Thanks for any feedback or insights.