Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Duplicate content through product variants
-
Hi,
Before you shout at me for not searching - I did and there are indeed lots of threads and articles on this problem. I therefore realise that this problem is not exactly new or unique.
The situation: I am dealing with a website that has 1 to N (N being between 1 and 6 so far) variants of a product. There is no dropdown for variants - that is not technically possible short of a complete redesign, which is not on the table right now. The product variants are also not linked to each other, but share about 99% of their content (the obvious problem here). In the "search all" they show up individually. Each product variant is a different page, unconnected in the backend as well as the frontend. The system is quite limited in what can be added and entered - I may have some opportunity to influence smaller things such as enabling canonicals.
In my opinion, the optimal choice would be to retain one page for each product, the base variant, and then add dropdowns to select extras/other variants.
As that is not possible, I feel the best solution is to canonicalise all versions to one (either the base variant or the best-selling one?) and to offer customers a list on each product page giving them a direct path to the other variants of the product.
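For what it's worth, canonicalising a variant just means adding one line to the <head> of each duplicate page pointing at whichever version is picked as the primary one - a minimal sketch with invented URLs:

```html
<!-- In the <head> of every non-primary variant page (URLs illustrative): -->
<link rel="canonical" href="http://www.example.de/category/subcategory/product_name" />
```

If the system only lets you set this per page by hand, the base variant is arguably the safer canonical target, since the best-selling variant can change over time.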
I'd be thankful for opinions, advice or showing completely new approaches I have not even thought of!
Kind Regards,
Nico
-
Hehehe yes we do usually!
-
Thanks for the hint!
Personally, I am a big fan of schema.org, and marking up all the products has been on my to-do list for a while.
-
Hi Martijn,
Thanks for your reply. I'll have to check with the responsible developer - but I fear that this option is not on the table. Then again, I have been given hints that a complete redesign might eventually be. As I said below: nobody doing SEO seems to have been around when the site was created. And we all know what happens in such cases, don't we?
-
Hi Matt,
If it were only that easy... I have since learnt that, way back when the client had that website developed, he specifically asked NOT to have an ecommerce website. (Neither I nor anybody advising on SEO was around back then, AFAIK.)
The products are not connected. They are literally independently created pages using the same template. The URLs are not parameter based but look like:
http://www.example.de/category/subcategory1/subcategory2/product_name-further_description_1
http://www.example.de/category/subcategory1/subcategory2/product_name-further_description_2
So, identical apart from the last bit that is NOT a parameter. And the last bit might be "750-kg" or "Alu" or "with-brakes". Thanks for the advice; I agree that it is generally a good starting point but sadly not possible in this case.
-
Just implemented something similar to this, and used canonicals. Also, if you're able to add more than just canonicals, it's possibly worth looking at microdata? We used schema.org's isVariantOf for colour and size variants. I'm not sure how much this influences Google's understanding or search display, but it's widely recommended and seems unlikely to hurt. Implementation took a little trial and error; Google's structured data testing tool helped.
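For anyone curious what that markup pattern looks like, here is a JSON-LD sketch of the isVariantOf relationship (the product name, URLs, and the exact choice of ProductGroup as the variant parent are illustrative - worth checking against the current schema.org definitions before use):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trailer, 750 kg",
  "url": "http://www.example.de/products/trailer-750-kg",
  "isVariantOf": {
    "@type": "ProductGroup",
    "@id": "http://www.example.de/products/trailer#group",
    "name": "Trailer"
  }
}
</script>
```

The same relationship can be expressed as microdata attributes on the existing HTML; JSON-LD is simply easier to drop into a limited template.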
-
What do the duplicate content URLs look like? In a lot of ecommerce systems you end up with parameter-based URLs such as:
http://www.example.com/products/women/dresses/green.htm
http://www.example.com/products/women?category=dresses&color=green

According to Google: "When Google detects duplicate content, such as the pages in the example above, a Google algorithm groups the duplicate URLs into one cluster and selects what the algorithm thinks is the best URL to represent the cluster (and) tries to consolidate what we know about the URLs in the cluster, such as link popularity, to the one representative URL. However, when Google can't find all the URLs in a cluster or is unable to select the representative URL that you prefer, you can use the URL Parameters tool to give Google information about how to handle URLs containing specific parameters." (see more at Google Support)
If your URLs are parameter based, I would suggest handling them at that level in Search Console or (as a last resort) robots.txt as well. However, I'd start with canonicals and parameter handling if possible.
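As a rough illustration of the robots.txt route (last resort, and only for parameter-based duplicates - a blocked URL can no longer pass its signals through a canonical, which is why canonicals come first):

```text
# Illustrative robots.txt rule: stop crawling of any URL carrying a "color" parameter.
# Wildcard (*) matching is honoured by Google but not guaranteed for every crawler.
User-agent: *
Disallow: /*color=
```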
-
Hi Nico,
As you said, it's far from perfect, but I would indeed go with a canonical on the pages that have duplicate variants. And if you're doing that already, it might not be much more effort to also link them together on the back-end of your site, so you can do more advanced things.
Related Questions
-
Duplicate content, although page has "noindex"
Hello, I had an issue with some pages being listed as duplicate content in my weekly Moz report. I've since discussed it with my web dev team and we decided to stop the pages from being indexed. The web dev team added this code to the pages: <meta name='robots' content='max-image-preview:large, noindex dofollow' />, but the Moz report is still reporting the pages as duplicate content. Note from the developer: "So as far as I can see we've added robots to prevent the issue, but maybe there is some subtle change that's needed here. You could check in Google Search Console to see how it's seeing this content, or you could ask Moz why they are still reporting this and see if we've missed something?" Any help much appreciated!
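One thing worth flagging in that snippet: "dofollow" is not a standard robots directive - the recognised token is "follow", and following links is the default anyway, so the conventional form would be:

```html
<meta name="robots" content="noindex, follow" />
```

Also note that noindex stops indexing, not crawling, so a third-party crawler may still fetch and report the pages.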
Technical SEO | rj_dale
-
Duplicate Content on a Page Due to Responsive Version
What are the implications if a web designer codes the content of the site twice into the page in order to make the site responsive? I can't share the URL, I'm afraid, but the H1 and the content appear twice in the code in order to produce both a responsive version and a desktop version. This is a WordPress site. Is Google clever enough to distinguish between the two versions and treat them individually? Or will Google really think that the content has been repeated on the same page?
Technical SEO | Wagada
-
Handling of Duplicate Content
I just recently signed up for the Moz system. The initial report for our web site shows we have lots of duplicate content. The web site is real-estate based and we load IDX listings from other brokerages into our site. Even though these listings look alike, they are not: each has its own photos, description and address. So why do they appear as duplicates? I would assume that they are all too closely related. Lots for sale, primarily - and it looks like lazy agents with 4 or 5 lots input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content - you are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site, 40 show as duplicates.
Technical SEO | TIM_DOTCOM
-
Duplicate Page Content and Titles from Weebly Blog
Is anyone familiar with Weebly who can offer some suggestions? I ran crawl diagnostics on my site and have some high-priority issues that appear to stem from Weebly blog posts. There are several of them, and it appears that each post is counted as "page content" on the main blog feed and then again when it is tagged to a category. I hope this makes sense - I am new to SEO and this is really confusing. Thanks!
Technical SEO | CRMI
-
How to deal with duplicated content on product pages?
Hi, I have a webshop with products in different sizes and colours. Each item has a different URL with almost the same content (title tag, product descriptions, etc.). To prevent duplicate content, I'm wondering what the best way to solve this is, keeping in mind: it is impossible to create one page/URL per product with filters for colour and size, and impossible to rewrite the product descriptions to make them unique. I'm considering canonicalising the rest of the colour/size variations, but the disadvantage is that if the canonical product is not in stock, it disappears from the website. Looking forward to your opinions and solutions. Jeroen
Technical SEO | Digital-DMG
-
Duplicate content problem from an index.php file
Hi, one of my sites is flagging a duplicate content problem which is affecting its search rankings. The duplicate is caused by http://www.mydomain.com/index.php, which has a page rank of 26. How can I sort out the duplicate content problem? The main page should just be http://www.mydomain.com, which has a page rank of 42 and is the stronger page, with stronger links etc. Many Thanks
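A common fix for this specific pattern - sketched here on the assumption that the site runs on Apache with mod_rewrite available - is a 301 redirect from /index.php to the root, which consolidates the two URLs and their link equity:

```apache
# Illustrative .htaccess sketch (Apache + mod_rewrite assumed):
RewriteEngine On
# Only match direct requests for /index.php, not internal rewrites:
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[\s?] [NC]
RewriteRule ^index\.php$ http://www.mydomain.com/ [R=301,L]
```

A canonical tag on index.php pointing at the root would achieve a similar consolidation if redirects are not an option.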
Technical SEO | ocelot
-
Are recipes excluded from duplicate content?
Does anyone know how recipes are treated by search engines? For example, I know press releases are expected to have lots of duplicates out there, so they aren't penalised. Does anyone know if recipes are treated the same way? For example, if you Google "three cheese beef pasta shells", the first two results have identical content.
Technical SEO | RiseSEO
-
Duplicate Content issue
I have been asked to review an old website to identify opportunities for increasing search engine traffic. While reviewing the site I came across a strange loop. On each page there is a link to a printer-friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes That page also has a link to a printer-friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes&printfriendly=yes and so on, and so on... Some of these pages are being included in Google's index. I appreciate that this can't be a good thing; however, I am not 100% sure of the extent to which it is a bad thing, or the priority that should be given to getting it sorted. Just wondering what views people have on the issues this may cause?
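Assuming the printer-friendly pages need to stay reachable for users, one containment option is a canonical from every print version back to the standard page, so the loop can grow without creating indexable duplicates (URL illustrative):

```html
<!-- In the <head> of each printer-friendly version: -->
<link rel="canonical" href="http://www.websitename.co.uk/index.php?pageid=7" />
```

A robots.txt rule such as Disallow: /*printfriendly= would stop the crawl loop entirely, at the cost of hiding those URLs from search engines altogether.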
Technical SEO | CPLDistribution