
Robots.txt file blocking website

A robots.txt file seems to be blocking the site I've just built from being indexed by Google.

I don’t have any robots.txt code in the custom code.

But when I use Webmaster Tools, it can't 'fetch' any of the pages, says a robots.txt file is blocking them, and reports that the site has 'serious health issues' as a result. And I can't submit the sitemap.

If you do a Google search for '' organically, the result says "A description for this result is not available because of this site's robots.txt".

Does anyone know why this robots.txt file would be showing?

PS I’ve tried entering:

User-agent: *

into the robots.txt text box under the SEO tab, but this doesn't seem to have fixed it.
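For reference, a robots.txt that allows all crawlers typically pairs the `User-agent` line with an empty `Disallow` directive (a sketch of the conventional allow-all file, not a Webflow-specific requirement):

```txt
# Allow all crawlers to fetch every page
User-agent: *
Disallow:
```

By contrast, `Disallow: /` would block every page, which is what a staging or pre-launch robots.txt often contains.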

The live site:

You just have to wait for Google to re-index; the previous version of the robots.txt probably blocked it.

You can expedite this by submitting your sitemap in Google Search Console.


Thanks Sam! It says there's an error whenever I try to submit the sitemap, but I'll just wait then.

One further question: should I delete the:

User-agent: *

I have in the SEO robots.txt text box, or just leave it there?

What’s the error?

That’s disallowing nothing, so it wouldn’t affect your site.
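You can confirm this yourself with Python's standard-library robots.txt parser. This is a local sketch (the `example.com` URL is just a placeholder) comparing the empty `Disallow:` against `Disallow: /`:

```python
from urllib import robotparser

# "Disallow:" with no path blocks nothing.
allow_all = robotparser.RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])

# "Disallow: /" blocks the entire site.
block_all = robotparser.RobotFileParser()
block_all.parse(["User-agent: *", "Disallow: /"])

print(allow_all.can_fetch("Googlebot", "https://example.com/page"))  # True
print(block_all.can_fetch("Googlebot", "https://example.com/page"))  # False
```

So leaving `User-agent: *` with an empty `Disallow` in the SEO tab is harmless.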


Thank you so much for this Sam!

The Webmaster error went away overnight, and I've deleted the 'User-agent: * Disallow' entry.