Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
US domain pages showing up in Google UK SERP
-
Hi,
Our website, which was predominantly for the UK market, was set up with a .com extension, and only two years ago were other domains added - US (.us), IE (.ie), EU (.eu) & AU (.com.au).
Last July we noticed that a few .us domain URLs were showing up in UK SERPs, and we realised the sitemap for the .us site was incorrectly referring to the UK (.com) site. We corrected that and the .us URLs stopped appearing in the SERPs. I'm not sure whether this actually fixed the issue or was just a coincidence.
However, in the last couple of weeks more than three .us domain URLs have been showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the PA for the US pages; they are far below the UK ones.
Has anyone noticed similar behaviour &/or could anyone please help me troubleshoot this issue?
Thanks in advance,
R
-
As your agency said, I also believe that once hreflang is implemented, this kind of issue should stop.
Regarding the sitemap error: that was certainly something that could confuse Google about which site to target.
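A quick way to catch that kind of sitemap mix-up is to scan every <loc> in the file for a host that doesn't match the domain the sitemap belongs to. Here's a minimal Python sketch - the sitemap contents and domain names are made up for illustration:

```python
# Sketch: flag sitemap URLs that point at the wrong domain.
# Assumes a standard <urlset> sitemap; domains here are hypothetical.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def foreign_urls(sitemap_xml: str, expected_host: str) -> list:
    """Return every <loc> whose host is not the expected one."""
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text.strip() for loc in root.iterfind(".//sm:loc", NS)]
    return [u for u in locs if urlparse(u).netloc != expected_host]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.domain.us/</loc></url>
  <url><loc>http://www.domain.com/about</loc></url>
</urlset>"""

# The .com URL doesn't belong in the .us sitemap, so it gets flagged.
print(foreign_urls(sitemap, "www.domain.us"))
```

Running something like this against each regional sitemap after every deployment would catch the cross-domain error you described before Google recrawls it.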
However, I see that you also have an .eu domain name...
I imagine that domain is meant to target the European market, and I suspect it is in English.
If it is so, remember:
- In countries like Spain, France, Germany, Italy... people don't search the internet in English, but in Spanish, French, German, Italian... Therefore, that .eu domain is not going to deliver the results you may be hoping for;
- The .eu extension is a generic one and cannot be geotargeted via Google Search Console. This means that, by default, it targets the whole world; hence you will probably see visits from English-speaking users in countries like South Africa, the UK, Ireland, Australia, New Zealand or India, where English is the main language or one of the official ones;
- When it comes to domains like .eu, it is always hard to decide how to implement hreflang. In your specific case, as you are targeting the UK, US, AU and IE with specific domain names, the ideal would be to implement these hreflang annotations for the .eu site (the example is only for the home page):
<link rel="alternate" href="http://www.domain.eu" hreflang="x-default" />
<link rel="alternate" href="http://www.domain.eu" hreflang="en" />
<link rel="alternate" href="http://www.domain.com" hreflang="en-GB" />
<link rel="alternate" href="http://www.domain.us" hreflang="en-US" />
<link rel="alternate" href="http://www.domain.com.au" hreflang="en-AU" />
With those annotations, you are telling Google to show the .com to users in Great Britain, the .us to users in the United States, the .com.au to Australian users, and the .eu to everyone else searching in English in any other country.
That means your .eu site will also target users in other European countries, both those searching in English (hreflang="en") and those searching in other languages (hreflang="x-default").
Two notes about hreflang="x-default":
- People living in the UK and searching in Spanish will see the .eu domain name, because it is the default domain for searches in every language except English in GB, IE, AU and the US;
- Even if you intend the .eu domain to target only European countries, that is impossible, because the .eu extension has no geotargeting power (and regions like Europe or Asia cannot be geotargeted via GSC). So it is normal to see visits from countries on other continents too.
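To make the fallback behaviour concrete, here is a rough sketch of the matching logic described above - which annotated URL a searcher would be shown, given their language and region. This is a simplification of what Google actually does, and the domains are the example ones from the snippet:

```python
# Simplified model of hreflang resolution: exact language-region match
# first, then bare language, then x-default. Domains are illustrative.
HREFLANG_MAP = {
    "x-default": "http://www.domain.eu",
    "en": "http://www.domain.eu",
    "en-gb": "http://www.domain.com",
    "en-us": "http://www.domain.us",
    "en-au": "http://www.domain.com.au",
}

def pick_url(user_locale: str) -> str:
    """Return the URL the annotations would surface for this locale."""
    loc = user_locale.lower()
    if loc in HREFLANG_MAP:
        return HREFLANG_MAP[loc]          # exact language-region match
    lang = loc.split("-")[0]
    return HREFLANG_MAP.get(lang, HREFLANG_MAP["x-default"])

print(pick_url("en-GB"))  # British English searcher -> the .com site
print(pick_url("en-IE"))  # no en-IE annotation, falls back to "en" -> .eu
print(pick_url("es-GB"))  # Spanish searcher in the UK -> x-default -> .eu
```

Note how the en-IE case falls through to the generic "en" entry - which is why, if you want Irish users on the .ie site, it needs its own annotation in the set.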
-
You're very welcome. Either way I'd be interested to see how this one progresses.
-
Hi Chris,
Thanks for your quick response and for detailing this out so well.
I have looked back and noticed that this occurs almost every six months: the US domain URLs pop up in the UK SERPs for about two weeks and then disappear. We are yet to implement the hreflang tags on site, and our SEO agency confirms that this should fix the issue.
Will keep this thread updated on the outcome.
Cheers,
RG
-
Whether or not this is an issue kind of depends on what your product or service is. If you provide a local-only service like a restaurant then your US site ranking in the UK would be unusual.
On the other hand, if you sell a physical product this may not be so unusual. For example, here in Australia we're quite limited when it comes to finding men's online clothing stores, most of it comes from the US or the UK so it's not uncommon to see something like the US Jackthreads show up in the SERPs here.
Since you do have separate domains for each location, this might be an indication that search engines aren't really understanding the different jurisdictions of each site; maybe they're not geo-targeted strongly enough for the algorithm to grasp that each of the sites serves a unique area.
Some of the elements that can help define this, in no particular order:
- Server location
- HTML language attribute (e.g. lang="en-US")
- Regional language differences (e.g. US spelling vs UK)
- Location markup - on your location pages at the very least
- Location mentions throughout your content
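A couple of the signals in that list are easy to spot-check programmatically. The following is a quick sketch (not a real tool - the word lists and sample page are made up) that checks a page's lang attribute and looks for mixed US/UK spellings, which would send conflicting regional signals:

```python
# Rough audit sketch for two of the geo-signals listed above:
# the lang attribute and US-vs-UK spelling. Word lists are illustrative.
import re

US_SPELLINGS = {"color", "favorite", "organize", "center"}
UK_SPELLINGS = {"colour", "favourite", "organise", "centre"}

def audit(html: str) -> dict:
    """Report the declared language and any region-specific spellings."""
    m = re.search(r'<html[^>]*\blang="([^"]+)"', html, re.IGNORECASE)
    words = set(re.findall(r"[a-z]+", html.lower()))
    return {
        "lang": m.group(1) if m else None,
        "us_spellings": sorted(words & US_SPELLINGS),
        "uk_spellings": sorted(words & UK_SPELLINGS),
    }

page = '<html lang="en-US"><body>Our favorite colour palette.</body></html>'
# Mixed signals: lang says en-US but the copy uses a UK spelling.
print(audit(page))
```

Run over a sample of pages from each regional site, a check like this can quickly show whether the sites are sending consistent signals for their target region.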
While not specifically on-topic, Rand's Whiteboard Friday about scaling geo-targeting offers plenty of great advice that can be applied here as well.