Every time I publish a new page, Google considers that there are two versions (one with and one without www) and this causes duplicate content errors.
However, I have set a default version in the global settings for the site (the one without www). Have you ever had this problem? How did you solve it? Thank you.
To prevent search engines from treating the www and non-www versions of your site as duplicate content, you'll need to set up a global canonical tag. This tells search engines which version of your site should be considered the primary version.
Here's how to set it up:
Navigate to Site settings > SEO > Global canonical tag URL
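For reference, a global canonical URL setting like this typically renders a link tag in each page's `<head>`. The domain and path below are placeholders, not your actual values:

```html
<!-- Sketch of the canonical tag a global canonical URL setting emits.
     "example.com" and "/blog/my-post/" are placeholder values;
     the tag points crawlers at the preferred (non-www) version. -->
<link rel="canonical" href="https://example.com/blog/my-post/" />
```

You can verify the tag is present by viewing the page source of a published post and searching for `rel="canonical"`.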
After implementation, you can use Google Search Console to request a recrawl of your site, which will help search engines recognize these changes more quickly.
Hopefully this helps! If you still need assistance, please reply here so somebody from the community can help.
If I set the global canonical tag URL, Google uses the site's general title plus a meta description it generated itself for all my blog articles (even though I have set the title and meta description correctly for each one). So I deleted the global canonical tag.
I put it back, but why do I have this problem with the preview of my articles in the SERP?
No, when I have no global canonical tag, my blog posts appear with the title and meta description I chose. That's why I didn't add it in the first place.
It's still the case that those pieces are only suggestions. Google uses them if and how it wants to. If you're watching what's happening, this trend is extending to the actual viewing of your content too: Google modifies the YouTube videos you upload, and AI assistants modify and rehash your HTML content to suit the user's query. The web is changing.
As far as why Google behaves differently when you don't have a canonical, you'd have to ask Google. It could be drawing from different cached page content in the with-canonical and without-canonical scenarios.