Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
What's the point of an EU site?
-
Buongiorno from 18 degrees C Wetherby, UK
On this site http://www.milwaukeetool.eu/ the client wants to hold on to the EU site despite there being multiple standalone country sites, e.g. http://www.milwaukeetool.fr & http://www.milwaukeetool.co.uk
Why would you ever need an EU site? I mean, who ever searches for an EU site? If the client holds on to the .eu site despite my position that it's a waste of time from a search perspective, is the following the best appeasement?
When a user enters the .eu URL, it redirects to the detected country site, e.g. if I'm in Paris and enter www.milwaukeetool.eu, it redirects to http://www.milwaukeetool.fr. My feeling is this would be the most pragmatic thing to do?
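Something like this minimal sketch is roughly what I have in mind (purely illustrative: it assumes a small Flask app answering for the .eu domain behind a CDN or GeoIP lookup that passes the visitor's country code in a request header - the header name, framework, and country mapping are placeholder assumptions, not anything the client actually runs):

```python
# Illustrative sketch only: geo-redirect the .eu domain to the matching
# country site. Assumes Flask and a proxy/CDN that sets a country header
# (here a hypothetical "CF-IPCountry"-style header).
from flask import Flask, redirect, request

app = Flask(__name__)

COUNTRY_SITES = {
    "FR": "https://www.milwaukeetool.fr",
    "GB": "https://www.milwaukeetool.co.uk",
}

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def geo_redirect(path):
    # Country code supplied upstream; stay on the .eu site if the visitor's
    # country has no dedicated domain.
    country = request.headers.get("CF-IPCountry", "").upper()
    target = COUNTRY_SITES.get(country)
    if target:
        # 302 (temporary) because the destination depends on who is asking,
        # so a permanent 301 would mislead crawlers.
        return redirect(f"{target}/{path}", code=302)
    return "Milwaukee Tool EU", 200
```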
Any ideas please,
Ciao,
David -
The .eu domain extension is a generic one, hence it is not bound to any single country for geo-targeting in Google Webmaster Tools. In that sense, it is an alternative to the .com extension, if the .com is not available.
It was created by the European Union as a way to "communicate" that the business owning the domain has a European nature, that it is based in a nation of the E(uropean) U(nion), and that its primary market is the EU.
From an SEO point of view, it doesn't offer any real advantage over any other generic domain name:
- you can't geo-target multiple countries with a single domain name
- you can't geo-target political regions (or continents).
Hence, it is good to have it for defending your brand, and to use it if the .com (or .net) has already been taken. But if you have a .com, then it is better to redirect the .eu to it.
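As a rough sketch of that redirect (again just illustrative - it assumes a small Flask app answering for the .eu domain, and the target .com is a placeholder, not a recommendation of any particular stack):

```python
# Illustrative sketch only: permanently redirect every .eu URL to the
# equivalent path on the main .com domain.
from flask import Flask, redirect

app = Flask(__name__)

MAIN_SITE = "https://www.example.com"  # placeholder for the brand's main .com

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def to_main_site(path):
    # 301 (permanent): the .eu is held purely defensively, so traffic and
    # link equity are consolidated on the main domain, path preserved.
    return redirect(f"{MAIN_SITE}/{path}", code=301)
```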
-
I had that argument with a client once, and I managed to persuade them not to go ahead with it as they already had .com and .co.uk.
.com is international, so I think it made sense not to use .eu, but in your case, if you don't have the .com then you probably need to look at which countries you want to target. If your client is UK based and they target clients from all around the EU, then it might make sense to use .eu, as you have very little chance of targeting someone in Italy with .co.uk or .com.fr domains. If you are targeting only the UK and FR, then you don't need .eu. It will just duplicate your work.
-
Hello
I had the same situation with customers recently. The best approach I've found was to tell them:
"You're definitely right, we cannot lose the .eu. So we're going to redirect it to BESTDOMAINE.com as the main website. That way people will still find you, you'll get more customers, and you'll keep the .eu."
Then you can provide some technical arguments explaining that if they stick to this position they will lose business.
In my case it worked out.
From 50° in Dubai