Robots.txt is blocking Google Indexing

Hello!

Is anyone having issues with their Robots.txt file?

I’m trying to do the following:

1 - Upload the sitemap to Google, but it says it couldn’t fetch the sitemap
2 - Run the site through Google’s Rich Results Test, but it says the site is blocked by robots.txt

I’ve tried 2 things:

  • I’ve tried to manually index each URL in Google’s sitemap app, and it still says it is being blocked by the robots.txt file.
  • I’ve also tried to delete the text from the SEO tab on the Webflow project, and that didn’t work either.

At this point I don’t know what else to do.

Help! :frowning:

The “User-agent: *” means this section applies to all robots. The “Disallow: /” tells the robot that it should not visit any pages on the site. This includes your sitemap.

So if you want your site to be crawlable, you need to update the robots.txt file.
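
If it helps to see why, here’s a minimal sketch using Python’s standard-library robots.txt parser. The domain is the one from this thread, and “Googlebot” is just an example user agent:

# Minimal sketch: why "User-agent: *" plus "Disallow: /" blocks everything.
from urllib.robotparser import RobotFileParser

blocking_rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(blocking_rules)

# Every URL on the site is disallowed, including the sitemap itself.
print(rp.can_fetch("Googlebot", "https://www.grupoinmobiliariofa.com/"))
print(rp.can_fetch("Googlebot", "https://www.grupoinmobiliariofa.com/sitemap.xml"))
# Both print False.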

Thanks a lot Jeff! I’ve read various guide articles on robots.txt files, and they all advise that “Disallow: /” will help search consoles crawl. So what would be the solution? Would it be:

Allow: /

or

Disallow:

Thanks again for your advice!

User-agent: *
Disallow:
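
You can sanity-check that locally with the same standard-library parser (a sketch only; the agent name is arbitrary):

from urllib.robotparser import RobotFileParser

# An empty "Disallow:" disallows nothing, so crawling is permitted.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])

print(rp.can_fetch("Googlebot", "https://www.grupoinmobiliariofa.com/sitemap.xml"))
# Prints True.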

Thanks Again Jeff! Unfortunately, it still doesn’t work :frowning:

I’ve run another test in the Rich Results Tester and it still says the site is being blocked by the robots.txt. I also ran another sitemap test, and it still isn’t able to fetch the sitemap. :confused:

Stop relying on the tools for a bit; they can be serving cached results. The source of truth is the file itself, which was correct at my last check, and Google’s tools will update accordingly if you leave it alone.

https://www.grupoinmobiliariofa.com/robots.txt
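
If you want to check the live file yourself instead of waiting on Google’s testers, here’s a quick sketch (the /sitemap.xml path is an assumption; substitute your actual sitemap URL):

from urllib.robotparser import RobotFileParser

# Fetch the live robots.txt over HTTP and test a URL against it.
rp = RobotFileParser("https://www.grupoinmobiliariofa.com/robots.txt")
rp.read()

print(rp.can_fetch("Googlebot", "https://www.grupoinmobiliariofa.com/sitemap.xml"))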

Sounds good Jeff. Thanks a lot!!!
