
Googlebot blocked by robots.txt

Hi! My Search Console says Googlebot is being blocked by my robots.txt, even though the file allows access to it. Does anyone have the same issue?
Could you help me?
File link:

There is something wrong with your robots.txt.
If you want Googlebot to crawl all of your website's pages, you can update it as follows:

User-agent: *
Allow: /


For example, see this website's robots.txt (clipartkey):

Keep in mind that if you want traffic from Google Search, don't block Googlebot; otherwise it's hard to get Google traffic.
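If you want to double-check a robots.txt before deploying it, here's a minimal sketch (not from this thread) using Python's standard-library parser; the example URL is just a placeholder:

```python
from urllib import robotparser

# The permissive robots.txt suggested above.
lines = ["User-agent: *", "Allow: /"]

rp = robotparser.RobotFileParser()
rp.parse(lines)

# With "Allow: /", Googlebot (and any other bot) may fetch any URL.
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))  # True
```

This only tests the file's rules locally; the live URL test in Search Console additionally checks what Google actually fetched from your server.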


Hi! Thanks for the reply!

So this is what my txt file says:
User-agent: *

Which basically means any bot is allowed to crawl. I checked, and that's exactly what is allowed. Every other crawler checker on the net says the file grants access, but the live URL test on Google says the bot is blocked.

You are wrong.
Disallow means "not allowed": bots may not access the listed paths.
Allow means "allowed": bots may access the listed paths.
User-agent: * means the rules apply to all bots.

The "Disallow: /" tells the robot that it should not visit any pages on the site.
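To make the difference concrete, here's a small sketch (again using Python's urllib.robotparser, with a hypothetical URL) contrasting "Disallow: /" with an empty Disallow rule:

```python
from urllib import robotparser

# "Disallow: /" blocks every path for the matched bots.
blocking = robotparser.RobotFileParser()
blocking.parse(["User-agent: *", "Disallow: /"])
print(blocking.can_fetch("Googlebot", "https://example.com/page"))  # False

# An empty "Disallow:" blocks nothing, i.e. everything is allowed.
permissive = robotparser.RobotFileParser()
permissive.parse(["User-agent: *", "Disallow:"])
print(permissive.can_fetch("Googlebot", "https://example.com/page"))  # True
```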

Please check the article carefully.

Oh, all right. I've updated it to:

Is this all right?
Again, thanks so much 🙂

Well done. Hope your site does well in Google search.