Using the same content on different TLDs
-
Hi everyone,
We have clients we're going to be working with in different countries, sometimes sharing the same language.
For example, we might have a client in a competitive niche operating in Germany, Austria and Switzerland (Swiss German), i.e. we're potentially going to rewrite our website three times in German.
We're thinking of using hreflang tags and keeping pretty much the same content across the sites. Is this a safe option? Has anyone actually tried this, successfully or otherwise?
All answers appreciated.
Cheers,
Mel.
-
Short answer: Using the same content on different country-targeted TLDs is generally not a problem.
The explanation:
1. Matt Cutts, the head of Google's web-spam team, says in this video that what you describe is generally not a problem (because you're not being a spammer who is trying to game the system). You can have the same content on different international domains under the same company / brand.
2. I'd also review the international SEO best practices Google describes here, just to make sure you're in the clear. Google says you shouldn't worry too much about the duplication either, but I'd follow all of those guidelines (geo-targeting settings for each domain in Webmaster Tools, for example) to "tell" Google that your different TLDs target different countries.
So, having sites with similar content at multiple international domains should be fine.
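To make the hreflang piece concrete, here's a minimal sketch (all domains and paths are hypothetical) of generating the alternate-link cluster that each of the three country versions would carry in its head, including a self-referencing entry:

```python
# Minimal sketch: hreflang link tags for the same German-language page
# served from three country-targeted ccTLDs. Domains and paths are hypothetical.

ALTERNATES = {
    "de-de": "https://www.example.de/seite/",      # Germany
    "de-at": "https://www.example.at/seite/",      # Austria
    "de-ch": "https://www.example.ch/seite/",      # Switzerland (Swiss German)
    "x-default": "https://www.example.de/seite/",  # fallback for everyone else
}

def hreflang_tags(alternates):
    """Build the <link rel="alternate"> elements each version should include."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_tags(ALTERNATES))
```

The same cluster, self-reference included, would sit on all three versions, so Google treats them as deliberate regional alternates rather than accidental duplicates.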
Good luck! I hope everything's clear.

-
Coincidentally, I just touched on this today here: http://a-moz.groupbuyseo.org/community/q/duplicate-title-tags-how-to-solve-that
I would go the route of subfolders on a single domain rather than separate ccTLDs (or subdomains). There is a lot of info out there, but the crux of it comes down to all traffic improving domain rank for a single TLD. If you go the route of ccTLDs instead of subfolders, then you're splitting that rank among those domains. What circumstances would prevent you from concentrating all of your link juice on one domain? Do that, and the duplicate content issue you're fearing becomes a non-issue.
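For what it's worth, here's a rough sketch of the two structures being weighed here, with hypothetical URLs:

```python
# Rough comparison of the two structures discussed above (hypothetical URLs).

CCTLDS = {
    "Germany":     "https://www.example.de/",
    "Austria":     "https://www.example.at/",
    "Switzerland": "https://www.example.ch/",
}

SUBFOLDERS = {
    "Germany":     "https://www.example.com/de-de/",
    "Austria":     "https://www.example.com/de-at/",
    "Switzerland": "https://www.example.com/de-ch/",
}

# With ccTLDs, links earned in each market strengthen a separate domain;
# with subfolders, every link strengthens the single example.com domain,
# and hreflang still tells Google which folder to serve in which country.
for market in CCTLDS:
    print(f"{market}: {CCTLDS[market]}  vs  {SUBFOLDERS[market]}")
```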
Related Questions
-
Different content on the same URL depending on the IP address of the visitor
Hi! Does anybody have any experience with the SEO impact of changing the content of a page depending on the IP address of the visitor? It would be text content as well as meta information, all on the same URL. Many thanks.
Intermediate & Advanced SEO | | Schoellerallibert0 -
Ranking 1st for a keyword - but when 's' is added to the end we are ranking on the second page
Hi everyone - hope you are well. I can't get my head around why we are ranking 1st for a specific keyword, but then when 's' is added to the end of the keyword - we are ranking on the second page. What could be the cause of this? I thought that Google would class both of the keywords the same, in this case, let's say the keyword was 'button'. We would be ranking 1st for 'button', but 'buttons' we are ranking on the second page. Any ideas? - I appreciate every comment.
Intermediate & Advanced SEO | | Brett-S0 -
Partial Match or RegEx in Search Console's URL Parameters Tool?
So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them. Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789= All the indexed URLs follow that same kinda format, but I only want to index the URLs that have a par1 of ABC (but that could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, using regex maybe? Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?
Intermediate & Advanced SEO | | Ria_0 -
Using both dofollow & nofollow links within the same blog site (but in different posts).
Hi all, I have been actively pursuing bloggers for my site in order to build page rank. My website sells women's undergarments that are more on the exotic end. I noticed a large number of prospective bloggers demand product samples. As already confirmed, bloggers who are given "free" samples should use a rel=nofollow attribute on those links. Unfortunately this does not build my page rank or transfer link juice. My question is this: is it advisable for them to also write additional posts and include dofollow links? The idea is for the blogger to use nofollow when posting about the sample and a regular link in a secondary post at a later time. What are your thoughts on this matter?
Intermediate & Advanced SEO | | 90miLLA0 -
What's the deal with significantLinks?
http://schema.org/significantLink Schema.org has a definition for "non-navigation links that are clicked on the most." Presumably this means something like the big green buttons on Moz's homepage. But does anyone know how they affect anything? In http://a-moz.groupbuyseo.org/blog/schemaorg-a-new-approach-to-structured-data-for-seo#comment-142936, Jeremy Nelson says " It's quite possible that significant links will pass anchor text as well if a previous link to the page was set in navigation, effictively making obselete the first-link-counts rule, and I am interested in putting that to test." This is a pretty obscure comment but it's one of the only results I could find on the subject. Is this BS? I can't even make out what all of it is saying. So what's the deal with significantLinks and how can we use them to SEO?
Intermediate & Advanced SEO | | NerdsOnCall0 -
Does Google crawl the pages which are generated via the site's search box queries?
For example, if I search for an 'x' item in a site's search box and the site displays a list of results based on the query, would that page be crawled? I am asking because this would be a URL that is non-existent on the site, and hence I am confused as to whether Google bots would be able to find it.
Intermediate & Advanced SEO | | pulseseo0 -
There's a website I'm working with that has a .php extension. All the pages do. What's the best practice to remove the .php extension across all pages?
Client wishes to drop the .php extension on all their pages (they've got around 2k pages). I assured them that wasn't necessary. However, in the event that I do end up doing this, what's the best-practice (and easiest) way to do it? This is also a WordPress site. Thanks.
Intermediate & Advanced SEO | | digisavvy0 -
URL Length or Exact Breadcrumb Navigation URL? What's More Important?
Basically my question is as follows, what's better: www.romancingdiamonds.com/gemstone-rings/amethyst-rings/purple-amethyst-ring-14k-white-gold (this would fully match the breadcrumbs) or www.romancingdiamonds.com/amethyst-rings/purple-amethyst-ring-14k-white-gold (cutting out the first-level folder to keep the URL shorter, so the important keywords are closer to the root domain)? In this question http://www.seomoz.org/qa/discuss/37982/url-length-vs-url-keywords I was advised to drop a folder in my URL because it may be too long. That's why I'm hesitant to keep the breadcrumb structure the same. To the best of your knowledge, do you think it's best to drop a folder in the URL to keep it shorter and sweeter, or to have a longer URL that matches the breadcrumb structure? Please advise, Shawn
Intermediate & Advanced SEO | | Romancing0