Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Should we add a .eu domain or remain solely on .com?
-
Hello,
Our company is international and we are looking to gain more traffic specifically from Europe. While I am aware that translating content into local languages, targeting local keywords, and gaining more European links will improve rankings, I am curious if it is worthwhile to have a company.eu domain in addition to our company.com domain.
Assuming the website's content will be exactly the same and the TLD (.eu vs. .com) is the only change - will this benefit us, or will it hurt us by creating duplicate content, even if we create a separate GSC property for it with localized targeting and hreflang tags? Also, if we have multiple languages on our .eu website, can different paths have different hreflang values?
For example: company.eu/blog/german-content with a German hreflang and company.eu/blog/italian-content with an Italian hreflang.
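To make the question concrete, here is roughly what I am picturing in the head of each page - all URLs are hypothetical, including the English equivalents on the .com:

    <!-- On company.eu/blog/german-content (hypothetical German post) -->
    <link rel="alternate" hreflang="de" href="https://company.eu/blog/german-content" />
    <link rel="alternate" hreflang="en-us" href="https://company.com/blog/english-equivalent" />
    <link rel="alternate" hreflang="x-default" href="https://company.com/blog/english-equivalent" />

    <!-- On company.eu/blog/italian-content (hypothetical Italian post, its own cluster) -->
    <link rel="alternate" hreflang="it" href="https://company.eu/blog/italian-content" />
    <link rel="alternate" hreflang="en-us" href="https://company.com/blog/another-english-equivalent" />
    <link rel="alternate" hreflang="x-default" href="https://company.com/blog/another-english-equivalent" />

My understanding is that each URL in a cluster has to list all of its alternates, including itself, and the listed pages have to link back - so the annotations are set per path rather than per site.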
I should note that we do not currently have hreflang attributes set on our website, as content has always been correctly served to US-based, English-speaking users - we do have the United States targeted in Google Search Console, though.
It would be ideal to target countries by subfolder instead, if that approach is just as effective. Otherwise, we would essentially be maintaining two sites.
Thanks!
-
My company currently uses an EU subdomain (a similar approach to what you're describing, except varying the subdomain rather than the TLD). One of the issues we have found is that the search engines have no geo-targeting options for Europe as a whole. They let you target a single country, or a single language, or a country-language combination, but there is no option to target "Europe". So, with hreflang tags, and also with GSC settings, you will run into some challenges because of this. We haven't resolved all of our issues, which is why I'm not offering a solution, but we are far enough in to know it is problematic.
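To illustrate with a rough, hypothetical sketch (eu.example.com stands in for a Europe-focused subdomain): hreflang values are a language code, optionally paired with a single country code, so the closest you can get to "Europe" is enumerating individual markets and pointing them at the same URL:

    <!-- No continent-level value exists, so "Europe" has to be spelled out market by market -->
    <link rel="alternate" hreflang="en-gb" href="https://eu.example.com/some-page" />
    <link rel="alternate" hreflang="en-de" href="https://eu.example.com/some-page" />
    <link rel="alternate" hreflang="en-fr" href="https://eu.example.com/some-page" />
    <!-- Catch-all for English speakers everywhere else -->
    <link rel="alternate" hreflang="en" href="https://www.example.com/some-page" />

That gets unwieldy quickly, which is essentially the problem we keep bumping into.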
Related Questions
-
How to add titles to Pardot landing pages
I have 5 URLs that are "missing titles"; however, all 5 are landing pages that were created in Pardot. How would I go about adding the missing titles? Would I need to add them on our website platform or in Pardot?
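For context, the tag itself is simple - my question is really about which system renders the page's head. A minimal sketch, with placeholder title text:

    <head>
      <!-- One descriptive, unique title per landing page -->
      <title>Request a Demo | Example Company</title>
    </head>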
Technical SEO | cbriggs0 -
Forwarding a .org domain to a .com domain: any negative impact to consider?
Hello! I have a question I've been unable to find a clear answer to. My client's primary domain is a .com with a satisfactorily high DA. My client owns the .org version of its domain (which has a very low DA, I suppose due to inactivity) but has never forwarded it on. For branding/visibility/traffic reasons, I'd like to recommend they set up the .org domain to forward to the .com domain, but I wanted to ask a few questions first:
1. Does forwarding low-DA domains to high-DA domains have any negative authority/SEO impact?
2. If the .org domain was to be forwarded, am I correct that an SSL cert is not necessary for it if the .com domain has an SSL cert?
Thanks in advance!
Technical SEO | mollykathariner_ms1 -
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at the moment. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging when the sitemap file it is working on becomes too large, so it looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but that is not needed, since that functionality comes with both of the above-mentioned tools. I also know about DeepCrawl.com, but that one is paid, and it would be very expensive at this number of pages and websites (5 million URLs is $1,750 per month; I could get a better deal for multiple websites, but paying obviously does not make sense for me - it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites. So, in general, what is the best, reasonably time-efficient way to work on something like this? Are there any other options? Thanks.
Technical SEO | blrs120 -
Is it better to use XXX.com or XXX.com/index.html as canonical page
Is it better to use 301 redirects or a canonical tag? I suspect the canonical is easier. The question is, which is the best canonical page: YYY.com or YYY.com/index.html? I assume YYY.com, since there will be many other pages such as YYY.com/info.html, YYY.com/services.html, etc.
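For what it's worth, a minimal sketch of the canonical approach I have in mind, with YYY.com as my placeholder domain - pointing the index.html variant at the bare root:

    <!-- In the <head> of YYY.com/index.html (placeholder domain) -->
    <link rel="canonical" href="https://YYY.com/" />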
Technical SEO | Nanook10 -
Ranking on google.com.au but not google.com
Hi there, we (www.refundfx.com.au) rank on google.com.au for some keywords that we target, but we do not rank at all on google.com. Is that because we only use a .com.au domain and not a .com domain? We are an Australian company, but our customers come from all over the world, so we don't want to miss out on google.com searches. Any help in this regard is appreciated. Thanks.
Technical SEO | RefundFX0 -
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:

    User-agent: *
    Disallow: /

in fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9 -
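For reference, the meta robots alternative mentioned in the question above boils down to a tag like this on each page of staging.domain.com (the hypothetical staging host); the same directive can usually also be sent as an X-Robots-Tag HTTP header at the server level, which is less per-page work:

    <!-- In the <head> of every page on staging.domain.com -->
    <meta name="robots" content="noindex, nofollow" />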
Redirecting blog.<mydomain>.com to www.<mydomain>.com/blog
This is more of a technical question than pure SEO per se, but I am guessing that some folks here may have covered this, so I would appreciate any input. I am moving from a WordPress.com-hosted blog to a WordPress installation on my own server (as suggested by folks in another thread here). As part of this I want to move from the format blog.mydomain.com to www.mydomain.com/blog. I have installed WordPress on my server and have imported posts from the hosted site to my own server. How should I manage the transition from the first format to the second? I have a bunch of links on Facebook, etc. that refer to URLs in the blog.mydomain.com format, so it's important that I redirect. I am running DotNetNuke/WordPress on my own IIS/ASP.NET servers. Thanks. Mark
Technical SEO | MarkWill0 -
How do I add meta descriptions to Archives in WordPress?
My most recent crawl returned a number of 'missing meta description' errors, and when I checked individual URLs, it turned out they were WordPress archive pages - for individual months and days (e.g. http:// .../2011/01). What's the best way to go about adding descriptions to these pages, if at all? Or should I have these pages not be indexed? I am using the All in One SEO plugin, so maybe there is an easy fix through this plugin, or it may be the cause of these errors? Any help is appreciated, thanks in advance! EDIT: After looking it up further, I have decided to use noindex for Archives, which should solve my problem, right? Or is there a benefit to having those archive pages indexed?
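For reference, my understanding is that the noindex route (whether via the All in One SEO archive setting or added by hand) amounts to a robots meta tag along these lines on each archive page:

    <!-- On archive pages such as /2011/01/ - keeps them out of the index while still
         letting crawlers follow the links through to the individual posts -->
    <meta name="robots" content="noindex, follow" />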
Technical SEO | NetPicks0