Just like the title says, my page cannot be indexed. I am new to this and I want to learn. What can I do to fix this and get my page indexed? Here is the error I am receiving and the robots.txt I currently have in place.
You really don’t need to specify a robots.txt file at all, but your configuration is fine there.
My guess is that at one point you had Disallow: / by accident, Google picked that up, and now it just needs to refresh, which may take a while. In Google Search Console you can request re-indexing.
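For reference, here is roughly what the two cases look like (illustrative only, not your actual file): a permissive robots.txt versus the accidental blanket block Google may have cached.

```
# Permissive — every crawler may fetch everything
User-agent: *
Disallow:

# Accidental blanket block — this single rule blocks the whole site
User-agent: *
Disallow: /
```

If Google last saw the second version, URL inspection will keep reporting "blocked by robots.txt" until it re-fetches the file.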
First off, thank you very much for responding to my query. I am glad the sitemap and robots.txt look acceptable. Google Search Console will not accept my sitemap, and re-indexing keeps giving me the same result: that it's blocked by robots.txt. Will I need to wait a few days and try again? Even prior to configuring a robots.txt file, it would not pass a live URL inspection.
And yes, before I created the robots.txt today, I was getting the exact same error for the past several days. If you mean the robots.txt tester, that is also not working, funnily enough. It never pops up with a page where I can test it like this guy can. Instead, it shows a property search and then tells me I am a verified owner.
Often with GSC, I have to tell it to retry once after the initial sitemap.xml submission.
I think there’s a button just below, off-screen from your screenshot?
It does sound like there’s something weird going on with your system, you should definitely be able to run the robots.txt testing tool.
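While you sort that out, you can sanity-check what crawlers actually see by pulling your live robots.txt with Python's built-in parser. This is just a rough sketch, and example.com is a placeholder for your own domain:

```python
from urllib.robotparser import RobotFileParser

# Assumption: swap example.com for your real domain.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# Check whether Googlebot would be allowed to crawl a couple of URLs.
for url in ("https://example.com/", "https://example.com/some-page"):
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")
```

If that reports "allowed" but GSC still says blocked, it points to Google working from a stale cached copy rather than a problem with the file itself.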
Just in case there is some kind of client-side dependency, I’d try clearing your cache and turning off browser add-ons, especially any ad blockers.
Outside of that, I’d wonder if something’s off with the verification. These days I create domain properties exclusively in GSC, since a single domain property covers both the www. and naked-domain versions and GSC seems happier that way.
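For what it’s worth, domain properties are verified with a DNS TXT record rather than an HTML file or meta tag. GSC gives you a token, and the record you add at your DNS provider ends up looking something like this (the token below is just a placeholder):

```
example.com.  3600  IN  TXT  "google-site-verification=abc123placeholdertoken"
```

Once that resolves, the domain property should verify and cover every protocol and subdomain variant of the site.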