Artist Bios on Multiple Pages: Duplicate Content or not?
-
I am currently working on an eComm site for a company that sells art prints. On each print's page, there is a bio about the artist followed by a couple of paragraphs about the print.
Some artists have hundreds of prints on this site, and the bio is reprinted on every page. That makes sense from a usability standpoint, but I am concerned that it will trigger a duplicate content penalty from Google.
Some people are trying to convince me that Google won't penalize for this content, since the intent is not to game the SERPs. However, I'm not confident that this isn't being penalized already, or that it won't be in the near future.
Because only a section of the text is duplicated, while the rest of the text on each page is original, I can't use the rel=canonical tag. I've thought about putting each artist bio into a graphic, but that would be a huge undertaking and not the most elegant solution.
Could I put the bio on a separate page with only the artist's info, place that data on each print page using an iframe, and then put a noindex,nofollow in the robots.txt file?
Is there a better solution? Is this effort even necessary?
Thoughts?
-
Hi Darin,
Let me add my 2 cents:
If it makes sense from a usability standpoint to have the artist bio on the page, then by all means leave it there.
From a search engine's point of view, what matters most is the unique content on the page.
This means placing the paragraphs about the print front and center. Since Panda, Google seems to treat page content using more of a Reasonable Surfer model, in a similar manner to how it handles links: the higher up and more prominent the content, the more heavily it weighs into Google's calculation of what the page is "about."
Matt Cutts has previously said it only takes 2-3 sentences to make a page unique, but personally I think closer to a couple hundred words is a safer number.
Hope this helps! Best of luck with your SEO.
-
The iframe makes the most sense for this company's requirements. Do I need to do anything regarding noindex or nofollow if we create a dedicated page for each artist's bio and then pull the bio into the iframe on each print's page? Or does simply pulling that data via the iframe from the original "source" (that being the proposed artist bio page) eliminate the duplicate content concern?
-
Well, according to this post from a Google employee on a Google forum, Google ignores a noindex or nofollow placed on an iframe:
http://productforums.google.com/forum/#!topic/webmasters/tSHq764AA0A
He also references this link on the robots.txt file:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
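For reference, the robots.txt approach that support article covers would look something like this minimal sketch (the /artist-bios/ path is a hypothetical example, not from this thread):

# Hypothetical robots.txt entry: stop crawlers from fetching
# the dedicated bio pages that the iframes pull from.
User-agent: *
Disallow: /artist-bios/

One caveat: if robots.txt blocks a URL, Google never fetches it, so it will also never see a noindex tag on that page. The two mechanisms shouldn't be combined on the same URL.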
-
Chad, while posting a link instead of the duplicated content makes sense logically, it dramatically reduces the amount of content on the page. For the visitor's usability (as well as the site owner's directive), the bios need to remain on each print's page.
-
If the artist bio is not the main content on the page and the rest of the content is unique, there is less chance that Google will take it into play, but you never know with Google... so it's better to play it safe.
If you want to play it safe, you have two choices. The first is to have a dedicated page for each artist, and on each print's page put only a clickable image or link that takes people to the artist's bio page (not really helpful from a conversion point of view).
The other is to use an iframe to show the bio on each page; that way Google will count the bio as content belonging to a different page. A sketch of this follows below.
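To illustrate, the embed on each print's page might look like this (the bio URL is a hypothetical example):

<!-- Hypothetical markup for a print page: the shared bio lives at
     its own URL and is pulled in rather than repeated in the HTML. -->
<iframe src="/artist-bios/john-doe.html" title="About the artist"
        width="100%" height="300"></iframe>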
-
Why can't you just have a link to an artist bio page?
For example:
Click to read: John Doe's bio
This seems to solve the issue of usability as well as the issue with duplicate content. Just a suggestion. Learning more myself.
-
I was actually going to suggest putting the artist's info into a graphic before I finished reading your post. If that is going to be too much of an undertaking, then yes, an iframe would be a reasonable solution. Instead of using robots.txt, I'd suggest putting the noindex tag into the head of the iframed content.
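Concretely, that would mean a robots meta tag in the head of the dedicated bio page, something like this sketch (the page URL is again a hypothetical example):

<!-- Head of the hypothetical /artist-bios/john-doe.html page. The
     noindex tag keeps this standalone bio page out of the index while
     still allowing it to be fetched and rendered inside the iframe. -->
<meta name="robots" content="noindex">

Per the earlier caveat, the bio page must not also be blocked in robots.txt, or Google will never fetch it and see this tag.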