Published site can’t be fetched by Google (GoDaddy domain)

So I have just published my client’s site to a custom domain.
It is now published to two destinations: one on Webflow and one on the GoDaddy domain.

I am trying to submit the XML sitemap to Google Search Console. After following every guide I could find exactly, I managed to get one of them fetched by Google; however, the other still can’t be fetched.

I really couldn’t pinpoint what went wrong despite my efforts.
Any help would be great!

Here is my site Read-Only:

Hi @KevinWong, thanks for your post about the sitemap.

I took a look at the custom domain and was prompted with a password-protected page:

Shared with CloudApp

Google is unable to index the project while the site is password-protected.

Have you tried disabling the password protection and then submitting the sitemap?
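This isn’t Webflow or Google tooling, but the idea can be sketched as a quick self-check in Python (the URL in the commented example is a placeholder): Search Console can only read a sitemap that an unauthenticated client receives with HTTP 200, so a password-protected page (typically 401 or 403) will always surface as “Couldn’t fetch”.

```python
from urllib import request, error

def fetch_status(url, timeout=10):
    """HTTP status an unauthenticated crawler (like Googlebot) would see."""
    req = request.Request(url, headers={"User-Agent": "sitemap-check"})
    try:
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as e:
        return e.code

def google_can_fetch(status):
    # Only a plain 200 is readable; 401/403 (password protection),
    # 404, and 5xx all surface as "Couldn't fetch" in Search Console.
    return status == 200

# Example (placeholder URL):
# print(fetch_status("https://example.com/sitemap.xml"))
```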

Thanks in advance,


Hi @cyberdave , thanks for the quick response.

All the tests in the screenshot were conducted while the password was disabled, so I don’t think that affected it.

However, I have disabled the password again as a precaution.

I rechecked, but nothing has changed.

Hi @KevinWong, thanks for your reply.

I checked the sitemap URL directly and the sitemap shows up:

Shared with CloudApp

I also ran it through an XML sitemap validator and it passed:

Shared with CloudApp
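The basic well-formedness check such a validator performs can be sketched offline with Python’s standard library (the sample XML below is a placeholder, not the actual sitemap): the root must be a `urlset` element in the sitemaps.org namespace, with `loc` entries inside.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_urls(xml_text):
    """Parse sitemap XML and return its <loc> values, or raise if malformed."""
    root = ET.fromstring(xml_text)
    if root.tag != f"{{{SITEMAP_NS}}}urlset":
        raise ValueError("root element is not <urlset> in the sitemap namespace")
    return [loc.text for loc in root.iter(f"{{{SITEMAP_NS}}}loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
</urlset>"""
```

Calling `sitemap_urls(sample)` returns the list of page URLs; a non-sitemap document (for example, an HTML error page served at the sitemap URL) raises instead.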

I would try removing the global canonical tag on the SEO tab, as the default domain is already set on the hosting tab of project settings:

Shared with CloudApp

After removing the global canonical tag, republish the site and then try again. If that turns out not to be the problem, you can re-add it, but I would wait until the issue is resolved.
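To confirm what a republished page actually serves, a canonical tag can be detected with a few lines of standard-library Python (a sketch only; the HTML in the examples is a placeholder, not the real page):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def find_canonical(html):
    """Return the canonical URL declared in the page, or None if absent."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical
```

After removing the global canonical tag and republishing, `find_canonical` on the fetched page source should return `None`.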

I would also check the custom code tab for redundant Google verification script lines; having multiple verifications can sometimes prevent Google from reading the domain.

If the issue still persists, let me know so I can help to check further.


Hi @cyberdave ,
I have taken your advice:

  • removed the global canonical tag and republished the site.
  • removed the redundant Google verification script (in fact I did a clean reset/setup for Google Search Console verification, so now there is only one)
    …still no luck fetching the sitemap.xml

I also did a URL inspection on both the homepage and /index.html; I hope that helps.



Hi @KevinWong, thanks for the update. There is a little info button on the indexing result for the page that was submitted; can you click it and show what it says?

Shared with CloudApp

Thanks in advance

Hi @cyberdave ,
It appears to be an unresponsive button :roll_eyes:

On the other hand, I have tried searching the site on Google via a site: query;
here are the results.

There is no result for the current site.
However, there is a cached result from before the site was hosted with Webflow.
Not sure if this information helps.

Really appreciate your help and patience.

Hi @KevinWong, thanks for hanging in there. This can sometimes be tricky, as the indexing takes place at Google and our team at Webflow does not have direct visibility into it.

Can you click View Source on the last page that was cached, so we can see what it looks like, whether it originated in Webflow, and the last published date that was indexed?

Shared with CloudApp

Thanks in advance, I will be looking forward to your response.


Hi @cyberdave, here’s what I found

<!doctype html><html lang="en"><head><meta http-equiv="content-type" content="text/html;charset=utf-8"><meta name="viewport" content="width=device-width,initial-scale=1"><link rel="shortcut icon" href="data:image/x-icon;," type="image/x-icon"><title></title><script src="" type="text/javascript"></script><noscript><style>#content-main{display:none}</style><div>For full functionality of this site it is necessary to enable JavaScript. Here are the <a target="_blank" href="">instructions how to enable JavaScript in your web browser</a>.</div></noscript><script type="application/javascript">window.LANDER_SYSTEM="PW"</script></head><body><div id="contentMain"></div><script>!function(e){function r(r){for(var n,a,i=r[0],l=r[1],p=r[2],c=0,s=[];c<i.length;c++)a=i[c],,a)&&o[a]&&s.push(o[a][0]),o[a]=0;for(n in l),n)&&(e[n]=l[n]);for(f&&f(r);s.length;)s.shift()();return u.push.apply(u,p||[]),t()}function t(){for(var e,r=0;r<u.length;r++){for(var t=u[r],n=!0,i=1;i<t.length;i++){var l=t[i];0!==o[l]&&(n=!1)}n&&(u.splice(r--,1),e=a(a.s=t[0]))}return e}var n={},o={1:0},u=[];function a(r){if(n[r])return n[r].exports;var t=n[r]={i:r,l:!1,exports:{}};return e[r].call(t.exports,t,t.exports,a),t.l=!0,t.exports}a.m=e,a.c=n,a.d=function(e,r,t){a.o(e,r)||Object.defineProperty(e,r,{enumerable:!0,get:t})},a.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},a.t=function(e,r){if(1&r&&(e=a(e)),8&r)return e;if(4&r&&"object"==typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(a.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&r&&"string"!=typeof e)for(var n in e)a.d(t,n,function(r){return e[r]}.bind(null,n));return t},a.n=function(e){var r=e&&e.__esModule?function(){return e.default}:function(){return e};return a.d(r,"a",r),r},a.o=function(e,r){return,r)},a.p="";var 
i=this["webpackJsonpparking-lander"]=this["webpackJsonpparking-lander"]||[],l=i.push.bind(i);i.push=r,i=i.slice();for(var p=0;p<i.length;p++)r(i[p]);var f=l;t()}([])</script><script src=""></script><script src=""></script></body></html>

Hi @KevinWong,

Thanks so much. OK, from that I can see that the last cached page did not originate from Webflow; it appears to be a domain-parking lander.

So next I would backtrack to the domain and look at it a bit more closely. If an old page from previous hosting is still in the index, it might be that a robots.txt entry applied to the site in the past disallowed indexing.

Once such a robots.txt directive has been seen, it can take some time for it to drop out of Google’s records automatically.

My guess (at the moment) is that at one point there was a robots.txt file for that domain which disallowed indexing.

If the domain was moved to Webflow but no new robots.txt was published, Google would still think the site should not be indexed.

For good measure, I would make one more change on your site: add a line to the robots.txt field on the SEO tab of project settings.

There you can paste in the following text:

User-agent: *
Allow: /

After pasting in that code, republish the site for the change to take effect.
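The effect of those rules can be checked offline with Python’s `urllib.robotparser` (example.com is a placeholder domain): the suggested `Allow: /` opens the whole site to crawlers, while a stale `Disallow: /` left over from previous hosting blocks everything, including the sitemap.

```python
from urllib import robotparser

def allows_googlebot(robots_txt, url="https://example.com/sitemap.xml"):
    """Would Googlebot be permitted to fetch `url` under these robots.txt rules?"""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", url)

# The rules suggested above:
open_rules = "User-agent: *\nAllow: /"

# A stale rule that might remain from previous hosting:
blocked_rules = "User-agent: *\nDisallow: /"
```

`allows_googlebot(open_rules)` is `True`, while `allows_googlebot(blocked_rules)` is `False`, which is exactly the situation a leftover directive would create.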

Let me know when done so that I can help to run some additional tests on the public custom domain and check how things look.

Hi @cyberdave ,

What you said makes a lot of sense,
I have done that and have republished :crossed_fingers:

Hi @KevinWong,

Awesome, I see that it has been updated:

Shared with CloudApp

Next is the domain check. Both the root domain and the www domain should be added to the hosting tab.

Any issues detected for the custom domains are shown on the hosting tab of project settings. See how to Connect a custom domain.

The next step would be to check and update the DNS records on the domain(s) so they point at the correct servers.

The root domain (without the www) should have these two A records:

Type: A
Name: @

Type: A
Name: @

After the DNS changes are made, go to the hosting tab of project settings, set the WWW domain as the default domain, and then publish your project.
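That check — both Webflow A records present and nothing left over from the old hosting — can be expressed as a simple set comparison (the IPs below are RFC 5737 documentation placeholders, not Webflow’s actual addresses, which are listed on the hosting tab):

```python
def dns_matches(found, expected):
    """True only when the domain's A records are exactly the expected set,
    i.e. both hosting IPs present and no leftover record from old hosting."""
    return set(found) == set(expected)

# Placeholder values; use the A records shown in your hosting tab.
expected = {"203.0.113.10", "203.0.113.20"}

# To obtain `found` for a live domain:
# import socket
# found = socket.gethostbyname_ex("example.com")[2]
```

Order does not matter, but a single stale record (for example, a parking-page IP still answering for the root domain) makes the check fail.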

I took a look at a public DNS lookup, and while one of the domains is pointed at Webflow, the other is not:

Shared with CloudApp

Next, I would check the DNS records, remove the current A record for that domain, and add the two A records for Webflow.

After changes, let me know and I am happy to check.

Hi @cyberdave,

OK, I have completed the steps:

  • connected the 2 custom domains & published
  • updated the DNS settings

full page screenshot:

Hi @KevinWong,

Great, you did an awesome job. Can you try to do the Google sitemap submission again now?

If it does not work at first, it might be due to DNS propagation; you sometimes just need to wait a little while. From a public DNS lookup, though, the DNS is looking good now.


Hi @cyberdave

Still no luck after removing and resubmitting the sitemap URL.

I also ran the URL Live Test again on both domains.

Really appreciate your time for your help :persevere:

Hi @KevinWong, OK, I see that Google is trying to scan both the root domain and the www domain. I would set the global canonical tag to the default www domain; this can be changed on the SEO tab of project settings:

Shared with CloudApp

After the change, republish, let me know, and try to submit the sitemap again using the www URL.

I am happy to help :slight_smile:


Hi @cyberdave, that’s done.

It still says “Couldn’t fetch”, same error.

Hi @KevinWong, thanks. I took a look, and to the best of my ability all the settings seem correct now; I have also run multiple sitemap checkers and all came back 100% OK.

One more thing to check: can you please try submitting the sitemap with a slash following the domain, like:

Thanks in advance.

Hi @cyberdave

Still “couldn’t fetch”.

Could it be an issue with the domain registrar, GoDaddy?

Hi @KevinWong,

Thanks for the update. It could just be a matter of waiting for propagation to complete; I do see some nameservers that are not yet updated (or may not be) with the most recent domain changes on the root domain:

Shared with CloudApp

I would give it up to 24 hours; if the issue is still not resolved, let me know.
