Page Cannot Be Indexed: Blocked by robots.txt

Hi Guys,

Just like the title says, my page cannot be indexed. I am new to this and I want to learn. What can I do to fix this and get my page indexed? Here is the error I am receiving and the robots.txt I currently have set up.



Here is my site's Read-Only link: Webflow - Cody's Cookbook


URL is www.codyscookbook.com

Hi Cody, your setup looks fine;

https://www.codyscookbook.com/sitemap.xml

https://www.codyscookbook.com/robots.txt

You really don’t need to specify a robots.txt file at all, but your configuration is fine there.
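For reference, a typical permissive robots.txt looks like the sketch below (the Sitemap line is optional, and the URL is just your sitemap location):

```txt
User-agent: *
Disallow:

Sitemap: https://www.codyscookbook.com/sitemap.xml
```

An empty Disallow line permits everything; a single "Disallow: /" blocks the entire site.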

My guess is that at one point you had Disallow: / by accident, Google picked that up, and now it just needs to refresh, which may take a while. In Google Search Console you can request re-indexing.
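If you want to double-check the robots.txt logic locally, Python's standard-library robotparser can simulate what a crawler is allowed to fetch. The rules below are a sketch (an allow-all file versus an accidental Disallow: /); swap in your real file's lines:

```python
from urllib import robotparser

# Simulate a permissive robots.txt (an empty Disallow allows everything)
rp_open = robotparser.RobotFileParser()
rp_open.parse([
    "User-agent: *",
    "Disallow:",
])
print(rp_open.can_fetch("Googlebot", "https://www.codyscookbook.com/"))  # True

# By contrast, "Disallow: /" blocks every URL on the site
rp_blocked = robotparser.RobotFileParser()
rp_blocked.parse([
    "User-agent: *",
    "Disallow: /",
])
print(rp_blocked.can_fetch("Googlebot", "https://www.codyscookbook.com/"))  # False
```

This only tests the rules themselves; Google may still be serving a cached copy of an older robots.txt on its end.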

Hi Michael,

First off, thank you very much for responding to my query. I am glad the sitemap and robots.txt look acceptable. Google Search Console will not accept my sitemap, and re-indexing keeps giving me the same result: it's blocked by robots.txt. Will I need to wait a few days and try again? Even before I configured a robots.txt file, it would not pass a live URL inspection.

Are you certain you’ve added the right sitemap xml to the right GSC property?
The URLs must match, so you want to add the full thing;

https://www.codyscookbook.com/sitemap.xml

If you try to add the webflow.io one to your codyscookbook.com property, it likely won’t work.

Before you created the robots.txt, were you getting the robots error then as well?

You might try running the GSC tester tool on your verified property; it might have an old one cached that’s blocking you.

I did submit the correct sitemap and here is what I received back:

And yes, before I created the robots.txt today, I was getting the same exact error for the past several days. If you mean the robots.txt tester, funnily enough that is also not working. It never pops up with a page where I can test it like this guy can; instead, it shows a property search and then confirms that I am a verified owner.

Often with GSC, I have to tell it to retry once after the initial sitemap.xml submission.
I think there’s a button just below, off-screen from your screenshot?

It does sound like there’s something weird going on with your setup; you should definitely be able to run the robots.txt testing tool.

Just in case there is some kind of client-side dependency, I’d try clearing your cache and turning off browser add-ons, especially any ad blockers.

Outside of that, I’d wonder if something’s off with the verification. These days I create domain properties exclusively in GSC, because it seems to handle both the www. and naked-domain versions more reliably.