I’m loving the new pagination feature on the collection pages, but all the SEO analysis tools are giving errors because each paginated page of the collection now has identical OG/meta tags etc.
What do you suggest so we don’t see negative SEO consequences due to what Google thinks is duplicate content?
Question: If you had the ability to add unique metadata to a paginated page, what would you put there? A differently worded description for page 1, page 2, page 3, and so on?
Paginating a collection is just breaking up a list. If you need unique descriptions for parts of a list, why not break it up into new lists? You could create a new page with a filtered collection list. That will always rank better, because the page is specific and so are its list items. I personally use silos and good pruning to great success. Just my thoughts.
I have come to the forum to research this issue, as it is now my concern.
Here’s what the literature has to say about SEO and duplicate titles.
Duplicate Titles
A title is considered a duplicate if it exactly matches the title of another page. Duplicate titles diminish the quality of a page, because it becomes unclear which page is more relevant to a given topic. They also confuse users navigating your site.
With pagination, the page where the element is found is rendered again and indexed under a new URL with the exact same title. At present there's no way to vary the subsequent pages, because the markup is generated internally.
Is there a way to append the pagination page and ID number to the title of the page it is found on, like you mentioned? Having the internal system do this automatically would be superb.
Right now, my client's site is showing hundreds of duplicate titles, which may negatively affect their SEO, and my services to them, if I can't show the problem resolved.
So while the URLs are different (as they should be, since they now index as three different pages instead of one), the title for EACH of these pages in SEO Settings > Title Tag is exactly the same. Why? Because it's really only one page, and there's really only one place where we can edit the Title Tag. In the above example, the title "About Dr. Scott Hollander" is indexed three times.
I explained that to the best of my ability. Let me know of a good resolution for this. Thanks in advance!
Edit directly following the post: the same applies to duplicate descriptions, so we'll need a workaround or resolution for that as well.
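One possible workaround, not a built-in feature, just a sketch: Webflow's paginator appears to put the page number in the query string, with a parameter name that seems to end in `_page` (something like `?f8b1e2d3_page=2`; verify against your own paginated URLs, as this is an assumption). Given that, a small custom-code embed can rewrite the title and description per page. The helper names below are made up for illustration:

```javascript
// Sketch only: derive a unique title per paginated page.
// ASSUMPTION: the paginated URL carries a query parameter whose name
// ends in "_page" (e.g. ?f8b1e2d3_page=2); check your site's URLs.

function pageNumberFromSearch(search) {
  // Returns the pagination page number, or null for page 1 / unpaginated URLs.
  const match = /(?:^|[?&])[^=&]*_page=(\d+)/.exec(search);
  return match ? parseInt(match[1], 10) : null;
}

function paginatedTitle(baseTitle, search) {
  // Appends "- Page N" for pages 2+, leaves page 1 untouched.
  const page = pageNumberFromSearch(search);
  return page && page > 1 ? `${baseTitle} - Page ${page}` : baseTitle;
}
```

In a page's footer custom-code embed you could then run `document.title = paginatedTitle(document.title, location.search);` and rewrite `document.querySelector('meta[name="description"]').content` the same way. Caveat: this only helps crawlers that execute JavaScript; Googlebot renders pages, but many SEO audit tools don't, so those tools may keep flagging the duplicates.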
The following Moz article on this subject is well thought out and includes actual recommendations from Google about what you can do. While the article is a little old, the recommendations are not, and much of the well-referenced content is also available on the official Google Webmaster Central blog.
So from my understanding, we need to add the rel next/prev markup to EACH paginated page. Where do we have control over each paginated page to do so? And if we did have control, wouldn't we just name/describe each one differently in its meta?
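For the record, here is what that markup series looks like. Note, though, that Google announced in 2019 that it had already stopped using rel=prev/next as an indexing signal, so treat this as legacy markup. The helper below is purely illustrative, and its simple `?page=N` URL scheme is an assumption, not Webflow's actual pagination parameter:

```javascript
// Illustration of the legacy rel="next"/"prev" link series for a
// paginated list. NOTE: Google said in 2019 it no longer uses these
// as an indexing signal. The "?page=N" scheme is a simplification.
function buildPaginationLinks(baseUrl, page, lastPage) {
  const url = (p) => (p === 1 ? baseUrl : `${baseUrl}?page=${p}`);
  const tags = [];
  // Page 1 gets only rel="next"; the last page gets only rel="prev".
  if (page > 1) tags.push(`<link rel="prev" href="${url(page - 1)}">`);
  if (page < lastPage) tags.push(`<link rel="next" href="${url(page + 1)}">`);
  return tags; // these would be injected into <head> on each paginated page
}
```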
The other way would be to hide the specific subpages in robots.txt… which I think is scary lol
I think the best way (without access to the HTML inside paginated subpages) would be rel=canonical markup… BUT we then actually LOSE the content on the now non-indexed subpages.
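A sketch of that canonical approach, for what it's worth: strip the pagination parameter from the current URL and inject the result as a `<link rel="canonical">`. This again assumes the pagination parameter's name ends in `_page` (verify on your own URLs), and it carries exactly the trade-off described above:

```javascript
// Sketch: canonicalise a paginated subpage URL back to its base page.
// ASSUMPTION: the pagination query parameter's name ends in "_page";
// all other query parameters are preserved. Trade-off as noted above:
// pages 2+ consolidate into page 1 instead of indexing on their own.
function canonicalUrl(href) {
  const u = new URL(href);
  // Copy the keys first so we can delete while iterating safely.
  for (const key of [...u.searchParams.keys()]) {
    if (key.endsWith("_page")) u.searchParams.delete(key);
  }
  return u.toString();
}
```

In a custom-code embed you would then create a `link` element with `rel="canonical"` and `href = canonicalUrl(location.href)` and append it to `document.head`.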
In the end, we still need a way to edit the duplicate pages that pagination creates.
You might want to ask the question, "Why am I paginating 10 results?" Also, expecting a page, or a list of pages, to rank on thin content with no real structure (compare Wikipedia, which has plenty of both) is not realistic.
The SEO purpose of a list is to get pages crawled. The benefit of a list to a user is to provide helpful related information. Concentrate on the latter. Google can see and track user behavior (Google Analytics, for example); it is an element of the algorithm.
If this were a big issue, Amazon would not be showing the same title and meta description when paginating. What they do instead is create landing pages that are relevant to the parent link and category, with strong content silos.
Totally understand the uses and how-tos of SEO. Some of the content is thin simply because my client hasn't yet added his commentary, as we can't include large chunks of someone else's work (we're citing an article for discussion). Beyond that, there are other pages that aren't as famished. The paginated element is a Symbol and exists on nearly all pages, so that doesn't help as much.
In any case, it sounds like there's no definite resolution other than to drop pagination, or to ignore these SEO-tool warnings because they don't matter anyway.
We're hitting problems here as well. We are consulting with an SEO company to help drive more visitors to our Webflow blog, and their primary concern for us is that paginated blog content is being seen as duplicate content by Google, because each subsequent page has the same meta as the original.
Right now it doesn't appear there's a scalable solution to this problem. Removing pagination is not an option, because we can't realistically load 100 blog posts on a single page.
I was hoping to find more information in the Webflow University article on pagination, but it only links to an outdated Google blog post and incorrectly suggests using rel="next" and rel="prev" tags.
I use a paginated list on a static page to display all posts.
I followed the comments on this post, but I saw that there is still no clear solution for this.
The client's CMS has more than 300 posts, so I cannot remove the pagination.
However, I have discovered a rather major caveat:
The component is not compatible with srcset images, at least not if one wants it to work properly on Safari.
Just an update to say the issue I reported above has now been resolved to perfection by the @Finsweet team. I highly second the recommendation to use their CMS Library: it resolves the OP's issue and is a powerful tool with many other advantages too. And it's free!