.com and .co.uk duplicate content
-
Hi Mozzers,
I have a client that has just released a .com version of their .co.uk website. They have basically re-skinned the .co.uk version with some US amends, so all the content and title tags are the same. What do you recommend? A canonical tag pointing to the .co.uk version? Rewriting the titles?
-
Just a quick question: the client in question, in their wisdom, decided to put the US website live without telling me, and our UK rankings have dropped significantly. Do you think the tag will start to fix this?
-
It is unlikely, because Google normally gives preference to the original for a fairly long period of time. There are no certainties with Google, but they do get this right in almost all of the cases I have seen.
The only users you should see decline on your site are non-UK visitors, as you are telling them via x-default that they should be sent to the .com.
Many huge companies have adopted this approach, along with thousands of smaller sites, and I think Google has ironed out most of the issues over the last two years. You are more likely to see a slower uptake on the new domain than on the original, rather than the other way around.
Hope that helps
-
Hi Gary,
Thanks for the help. As a UK website we primarily want to rank in the UK, but we obviously want to rank in the US as well. By making the .com website (which is brand new), is this likely to affect our UK rankings, or should they be unaffected?
Thanks again,
Karl
-
The actual page you want to look at is https://support.google.com/webmasters/answer/189077
hreflang is the tag you should implement.
I have had long chats with John Mueller at Google about this.
Your setup should be something like this on all pages on both sites.
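A minimal sketch of that markup (example.co.uk and example.com are placeholder domains, and the exact hreflang values depend on the language/region pairs you actually target), placed in the <head> of every page on both sites:

<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/page/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/" />

Each page lists its own .co.uk and .com equivalents, and the x-default entry tells Google which version to show searchers who don't match either region.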
Within about 7 days, depending on the size of your website, the .com should start appearing instead of the .co.uk in your US-based results. For me it happened within an hour!
Setting your .com as the default will be better than setting your .co.uk. The .co.uk is already a region-specific TLD and will generally not rank well in searches outside the UK, even if the hreflang is set to say otherwise.
This will let Google decide where to send traffic to, based on their algorithm and data.
If you use a canonical tag you will be suggesting/pushing US users to the original content instead of the US site.
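For contrast, a cross-domain canonical would look roughly like this (placeholder URL again); putting it on the .com pages is what's being advised against here, because it asks Google to consolidate everything back onto the .co.uk version:

<link rel="canonical" href="https://www.example.co.uk/page/" />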
-
OK, thanks for the help. I'll have a look into it and see what it says. The .com website is up now and they are hell-bent on it staying! I did recommend having a /US structure, but they preferred the .com!
Anyway thanks for the advice!
-
Hiya,
The alternate tag is a good start, but you may want to do some more reading; I'll put some links below. It's easier to try to make unique content or to use a structure like www.example.com/us, which may be an easier short-term option until you've got enough content for a .com site.
http://a-moz.groupbuyseo.org/community/q/duplicate-content-on-multinational-sites
https://support.google.com/webmasters/answer/182192#3
I always find it nicer to formulate your own answers and learn a bit along the way, so I hope the above helps you do that.
-
Thanks Chris,
So would you implement the rel="alternate" hreflang="x" tag then?
-
A similar question was posted not so long ago and there are some great points in it worth a look - http://a-moz.groupbuyseo.org/community/q/international-web-site-duplicate-content
Florin Birgu brings up some fantastic points and I'll bet they answer your question. If you're still stuck, let us know and I'm sure we can help you.
Related Questions
-
Duplicate Content and Subdirectories
Hi there and thank you in advance for your help! I'm seeking guidance on how to structure a resources directory (white papers, webinars, etc.) while avoiding duplicate content penalties. If you go to /resources on our site, there is a filter function. If you filter for webinars, the URL becomes /resources/?type=webinar. We didn't want that dynamic URL to be the primary URL for webinars, so we created a new page with the URL /resources/webinar that lists all of our webinars and includes a featured webinar up top. However, the same webinar titles now appear on both the /resources page and the /resources/webinar page. Will that cause duplicate content issues? P.S. Not sure if it matters, but we also changed the URLs for the individual resource pages to include the resource type. For example, one of our webinar URLs is /resources/webinar/forecasting-your-revenue. Thank you!
Technical SEO | SAIM_Marketing
-
Getting a high-priority issue for our xxx.com and xxx.com/home as duplicate pages and duplicate page titles; can't seem to find anything that needs to be corrected. What might I be missing?
I am getting a high-priority issue reporting both duplicate pages and duplicate page titles for our xxx.com and xxx.com/home in the crawl results. I can't seem to find anything that needs to be corrected, so what might I be missing? Has anyone else had a similar issue, and how was it corrected?
Technical SEO | tgwebmaster
-
Query Strings causing Duplicate Content
I am working with a client that has multiple locations across the nation, and they recently merged all of the location sites into one site. To allow the lead capture forms to pre-populate the locations, they are using the query string /?location=cityname on every page. EXAMPLE - www.example.com/product www.example.com/product/?location=nashville www.example.com/product/?location=chicago There are thirty locations across the nation, so every page x 30 is being flagged as duplicate content... at least in the crawl through Moz. Does using that query string actually cause a duplicate content problem?
Technical SEO | Rooted
-
How does Google view duplicate photo content?
Now that we can search by image on Google and see every site that is using the same photo, I assume that Google is going to use this as a signal for ranking as well. Is that already happening? I ask because I have sold many photos over the years with first-use only rights, where I retain the copyright. So I have photos on my site that I own the copyright for that are on other sites (and were there first). I am not sure if I should make an effort to remove these photos from my site or if I can wait another couple years.
Technical SEO | Lina500
-
Handling of Duplicate Content
I just recently signed up and joined the a-moz.groupbuyseo.org system. The initial report for our web site shows we have lots of duplicate content. The web site is real estate based and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not. Each has its own photos, description and address. So why do they appear as duplicates? I would assume that they are all too closely related. It is primarily lots for sale, and it looks like lazy agents have 4 or 5 lots and input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site, it shows 40 of them are duplicates.
Technical SEO | TIM_DOTCOM
-
Duplicate Page Content and Titles from Weebly Blog
Anyone familiar with Weebly that can offer some suggestions? I ran crawl diagnostics on my site and have some high-priority issues that appear to stem from Weebly blog posts. There are several of them, and it appears that each post is being counted as "page content" on the main blog feed and then again when it is tagged to a category. I hope this makes sense; I am new to SEO and this is really confusing. Thanks!
Technical SEO | CRMI
-
Are recipes excluded from duplicate content?
Does anyone know how recipes are treated by search engines? For example, I know press releases are expected to have lots of duplicates out there, so they aren't penalized. Are recipes treated the same way? For example, if you Google "three cheese beef pasta shells", the first two results have identical content.
Technical SEO | RiseSEO
-
Duplicate Content issue
I have been asked to review an old website to identify opportunities for increasing search engine traffic. Whilst reviewing the site I came across a strange loop. On each page there is a link to a printer-friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes That page also has a link to a printer-friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes&printfriendly=yes and so on and so on... Some of these pages are being included in Google's index. I appreciate that this can't be a good thing; however, I am not 100% sure of the extent to which it is a bad thing or the priority that should be given to getting it sorted. Just wondering what views people have on the issues this may cause?
Technical SEO | CPLDistribution