I just moved a client’s existing website from WordPress to Webflow, and I’m wondering if someone can run me through the basics of how Google site verification works in this scenario. I’ve watched a few of the Webflow University tutorials, but do I need to be logged into Google Search Console with my client’s Google account? Or can I just add a new property in my own account? I’m also trying to figure out whether the site has already been verified, and if so, where I can find it and its verification ID. I doubt my client would know if I asked. Any info/resources/help would be appreciated! Thanks
Site-level verification is more or less the older mechanism now.
The more popular route is a domain property, which is verified at the DNS level, not the site level.
It will depend on how your client has set up GSC.
You don’t need to be logged into their account; they can invite you as a GSC admin instead.
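For reference, DNS-level verification for a domain property comes down to adding a single TXT record at your DNS host. A sketch of what that record looks like in zone-file notation (the token shown is a placeholder; GSC generates the real one for your property):

```
; TXT record for Google domain property verification (token is a placeholder)
example.com.  3600  IN  TXT  "google-site-verification=YOUR-TOKEN-HERE"
```

Once the record propagates, GSC can confirm ownership of the whole domain, including all subdomains and protocols.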
Thanks for getting back to me @memetican. I managed to figure out domain verification at the DNS level, but now looking in GSC at the critical errors in Settings, it looks like my robots.txt hasn’t been fetched (“Not Fetched - Not Found (404)”) since the site was launched a couple days ago. I have auto-generated site map turned on, but typing in the URL of my site with robots.txt at the end leads me to a 404 error page. From what I’ve read, I can manually put a robots.txt into the Indexing section of the SEO page, but it sounds like that’s unnecessary if Webflow automatically generates one and links to it in the sitemap. Also, would that override the indexing of certain pages I’ve established through Webflow’s page settings?
You don’t need a robots.txt and won’t have one by default.
I think you may be thinking of the sitemap.xml, which Webflow does auto-generate.
You can define a robots.txt if you want to but it’s more likely to create harm than benefit. A 404 is better unless you are doing something unusual on your site.
If you mean excluding from the sitemap, that’s automatically picked up as part of Webflow’s automatic sitemap generation feature.
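To make concrete why a robots.txt can do more harm than good, here is a small sketch of how a crawler interprets one, using Python’s standard-library `urllib.robotparser`. The file contents and URLs are hypothetical; note how a single `Disallow` line blocks an entire path prefix:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; one stray Disallow rule can hide whole sections.
robots_txt = """\
User-agent: *
Disallow: /drafts/
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Everything under /drafts/ is blocked for all crawlers...
print(rp.can_fetch("*", "https://example.com/drafts/page"))  # False
# ...while other pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/about"))        # True
```

This is also why a 404 on robots.txt is harmless: with no file, crawlers assume everything is allowed, which is usually exactly what you want.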
To confirm @memetican, I don’t need a robots.txt? I’m obviously new to this, but the “critical errors” callout in GSC made me think it was a serious problem. I know the sitemap has been indexed; is that all I need to worry about?
No, you don’t need one. If one were required, Webflow would create a default one for you.
Easiest Solution: Ask the client’s Google Search Console (GSC) admin to grant you access. This approach provides immediate access to all necessary data and tools without needing to go through the verification process again.
Alternative: Verify the site yourself in GSC using a method like adding a TXT record to the domain’s DNS, uploading an HTML file to the site, or using a meta tag.
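If you go the meta-tag route (a URL-prefix property rather than a domain property), the tag goes in the `<head>` of the homepage; in Webflow you can paste it into the site’s custom head code. The token below is a placeholder that GSC generates for your property:

```html
<!-- Google site verification meta tag; content token is a placeholder -->
<meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
```

GSC then fetches the homepage and checks for the tag, so it must remain published for verification to stick.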