Robots Blocked From Crawling our Site

Hi everyone,

We are currently experiencing issues with our sites, which seem to be blocking bots from crawling despite our efforts to let them in!

We thought we had solved the issue by updating our robots.txt file to allow all bots:

User-agent: *
Allow: /
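As a quick local sanity check (separate from GSC's live test, which this can't reproduce), those two lines can be run through Python's standard-library robots.txt parser to confirm the syntax really does allow all crawlers. `example.com` below is just a placeholder for your own domain:

```python
from urllib import robotparser

# The exact rules from our robots.txt
rules = """\
User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Both checks should return True if the rules parse as intended
print(rp.can_fetch("Googlebot", "https://example.com/"))          # True
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))  # True
```

If this prints True but GSC still reports the URL as blocked, the problem is more likely serving or caching (e.g. the file GSC actually fetched) than the rules themselves.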

When inspecting the robots.txt file online, it looks like it should work as normal.

However, judging by all the live tests we’re conducting in Google Search Console, bots still seem to be blocked (we’ve been testing for over six hours now).

Is anyone else experiencing the same issue, or has anyone run into this in the past?

Many thanks,

GSC should tell you if it’s being robot-blocked.
If so, there’s a menu option to get it to re-assess.

Thanks Michael,

The issue seems to be fixed now.

Good to know that GSC takes a while to update, even when testing URLs live…!

Have a great day!

Many thanks,