Hi everyone,
We are currently experiencing issues with our sites: bots seem to be blocked from crawling, despite our efforts to allow them.
We thought we had solved the issue by updating our robots.txt file to allow all bots:
User-agent: *
Allow: /
When inspecting the robots.txt file with an online tester, it looks like it should work as expected: https://www.protex.ai/robots.txt
However, judging by the live tests we’re conducting in Google Search Console, bots still seem to be blocked (we’ve been testing for over six hours now).
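For what it’s worth, here is a quick local sanity check (using Python’s stdlib robots.txt parser) showing that the rules themselves do permit crawling — so the file contents shouldn’t be the problem, though this obviously isn’t a substitute for Google’s live test:

```python
from urllib import robotparser

# The robots.txt rules from the post, parsed locally (no network needed)
rules = """\
User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Under these rules, Googlebot is allowed to fetch any path
print(rp.can_fetch("Googlebot", "https://www.protex.ai/"))          # True
print(rp.can_fetch("Googlebot", "https://www.protex.ai/any/page"))  # True
```

Since the rules check out, the block may be coming from somewhere else entirely, e.g. a firewall/CDN rule, a `noindex` header, or how the robots.txt file is being served.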
Is anyone else experiencing the same issue, or has anyone run into this in the past?
Many thanks,
Thibault