Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in the Japan Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu to allow users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US, and not detecting it due to our URL structure?
Below are examples of two of our URLs for reference - one from Canada, the other from the US:
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to nofollow address the problem?
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
Here are all the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or HTTP headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out hreflang tags to clue Google in that they're the same page but geotargeted to users in different locations - there's a rough sketch after this list. I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here.
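To make the hreflang idea concrete, here's a minimal sketch of generating reciprocal annotations for the two example URLs from the question. The domain, the helper function, and the choice of en-us as x-default are illustrative assumptions rather than a confirmed setup - in practice you'd emit these tags (or the equivalent sitemap entries) from your templating layer on every variant:

```python
# Hypothetical sketch: generate reciprocal hreflang <link> tags for one product page
# that exists in several regional folders. Every variant must list every other
# variant (plus itself) for the annotations to be valid.

REGIONAL_VARIANTS = {
    "en-us": "https://www.example.com/us/en/prod4130078/2500058/catalog20038/",
    "en-ca": "https://www.example.com/ca/en/prod4130078/2500058/catalog50008/",
}

def hreflang_tags(variants, x_default="en-us"):
    """Return the <link rel="alternate"> tags to place in the <head> of each variant."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(variants.items())
    ]
    # x-default tells Google which URL to show searchers who match none of the listed locales.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{variants[x_default]}" />'
    )
    return "\n".join(tags)

if __name__ == "__main__":
    print(hreflang_tags(REGIONAL_VARIANTS))
```

The key constraint is reciprocity: every regional variant has to carry the full set of annotations, including its own, or Google may ignore them.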
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
-
It's absolutely possible that's what's happening. You can't rely on keeping Googlebot away from anything on your site, no matter how well you code it - even marking a link nofollow won't stop the bot from crawling the URL.
Another factor is that all your content is in English (as your URL structure suggests). Google does a terrible job of separating international content when it's all in the same language on the same root domain.
Proper separation, in a way Google can't confuse, is vital. Since I expect you don't intend to change the language across sites, your best option would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google "this domain is for this country"; to go a step further, host that content on a server located in that country.
Related Questions
-
How to interlink 16 different language versions of site?
I remember that Matt Cutts recommended against interlinking many language versions of a site. Considering that Google now also crawls JavaScript links, what is the best way to implement interlinking? I still see otherwise extremely well-optimized large sites interlinking more than 10 different language versions, e.g. zalando.de, but also booking.com (even though there it's on the same domain). Currently we have an expandable CSS dropdown in the footer interlinking 16 different language versions on different TLDs. Would you be concerned? What would you suggest for interlinking the domains (for users the links would be useful)?
International SEO | lcourse
-
Auto-Redirecting Homepage on Multilingual Site
The website has an auto-redirecting homepage on a multilingual site. Here is some background: a user visits the site for the first time > sent to a JavaScript age-verification page with a country-of-origin selector. If they select "France", they are served the French page (.com/fr-fr/). If they select any other country, they are served the English page (.com/en-int/). A cookie is set, and the next time the user visits the site, they are automatically served the appropriate language URL. 1st Question: .com/ essentially does not exist. It is being redirected to .com/en-int/ as this is the default page. Should this be a 301 redirect since I want this to serve as the new homepage? 2nd Question: In the multilingual sitemap, should I still set .com/ as the hreflang="x-default" even though the user is automatically redirected to a language directory? According to Google, as just released here: http://googlewebmastercentral.blogspot.com/2014/05/creating-right-homepage-for-your.html "automatically serve the appropriate HTML content to your users depending on their location and language settings. You will either do that by using server-side 302 redirects or by dynamically serving the right HTML content. Remember to use x-default rel-alternate-hreflang annotation on the homepage / generic page even if the latter is a redirect page that is not accessible directly for users." So, this is where I am not clear. If I use a 302 redirect of .com/ to either .com/en-int/ or .com/fr-fr/, won't I then lose the inbound link value and DA/PA of .com/ if I just use a 302? Note: there is no .com/ at this moment. Any advice is appreciated. Thanks, Alex
International SEO | Alex.Weintraub
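For the auto-redirect question above, here's a minimal sketch of the server-side 302 approach Google's guidance describes, assuming a Flask app. The language directories are taken from the question, while the cookie name, route handling, and fallback logic are made-up assumptions for illustration:

```python
# Hypothetical Flask sketch: the bare "/" homepage 302-redirects to a language
# directory based on a previously set cookie, falling back to the English default.
# The cookie name and paths are illustrative only.
from flask import Flask, redirect, request

app = Flask(__name__)

LANGUAGE_DIRS = {"fr": "/fr-fr/", "en": "/en-int/"}

@app.route("/")
def homepage():
    lang = request.cookies.get("site_lang", "en")
    target = LANGUAGE_DIRS.get(lang, LANGUAGE_DIRS["en"])
    # 302 (not 301), per Google's guidance for auto-redirecting generic homepages,
    # so the bare domain keeps its role as the generic/x-default page.
    return redirect(target, code=302)
```

Under this kind of setup the bare .com/ would still be listed as hreflang="x-default" in the multilingual sitemap, even though visitors never land on it directly.
-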
Blocking domestic Googles in robots.txt
Hey, I want to block Google.co.uk from crawling a site but want Google.de to crawl it. I know how to configure robots.txt to block Google and other engines - is there a fix to block certain domestic crawlers? Any ideas? Thanks, B
International SEO | Bush_JSM
-
Massive jump in pages indexed (and I do mean massive)
Hello mozzers, I have been working in SEO for a number of years but have never seen anything like a jump in pages indexed of this proportion (image is from the Index Status report in Google Webmaster Tools: http://i.imgur.com/79mW6Jl.png). Has anyone ever seen anything like this? Anyone have an idea about what happened? One thing that sprang to mind is that the same pages might now be getting indexed in several more Google country sites (e.g. google.ca, google.co.uk, google.es, google.com.mx), but I don't know if the Index Status report in WMT works like that. A few notes to explain the context:
- It's an eCommerce website with service pages and around 9 different pages listing products.
- The site is small - only around 100 pages across three languages.
- 1.5 months ago we migrated from three language subdomains to a single subdomain with language directories. Before and after the migration I used hreflang tags across the board. We saw about a 50% uplift in traffic from unbranded organic terms after the migration (although on day one it was more like +300%), especially from more language diversity.
- I had an issue where the 'sort' links on the product tables were giving rise to thousands of pages of duplicate content, although I had used URL parameter handling to tell Google that these were not significantly different and to index only the representative URL. About 2 weeks ago I blocked them using robots.txt (Disallow: *?sort). I never felt these were doing us much harm in reality, although many of them are indexed and can be found with a site:xxx.com search.
- At the same time as adding *?sort to robots.txt, I made an hreflang sitemap for each language, linked to them from an index sitemap, and added these to WMT. I added some country-specific alternate URLs as well as language-only ones, just to see if I started getting more traffic from those countries (e.g. xxx.com/es/ for Spanish, xxx.com/es/ for Spain, xxx.com/es/ for Mexico, etc.). I didn't seem to get any benefit from this.
- The Webmaster Tools profile is for a URL that is the root domain xxx.com. We have a lot of other subdomains, including a blog that is far bigger than our main site, but looking at the Search Queries report, all the pages listed are on the core website, so I don't think it is the blog pages etc.
- I have seen a couple of good days in terms of unbranded organic search referrals - no spike or drop-off, but a couple of good days in keeping with recent improvements in these kinds of referrals.
- We have some software mirror subdomains that are duplicated across two websites: xxx.mirror.xxx.com and xxx.mirror.xxx.ca. Many of these don't even have sections, and Google seemed to be handling the duplication, always preferring to show the .com URL despite no cross-site canonicals being in place.
Very interesting, I'm sure you will agree! THANKS FOR READING!
International SEO | Lina-iWeb
-
Are my translated pages damaging my ranking?
Hi there, I have a site in English but with duplicates in different languages. The first problem is that these translated versions of my site receive no ranking on Google stars (while the English version does) - why is this? The second problem is that SEOmoz counts the errors on my site and then duplicates this error count for all the translated versions of my site - meaning I have a huge number of errors (too many on-page links). Add to this the fact that I use affiliate IDs to track different types of traffic to my site - so all page URLs, in English and other languages, with an affiliate ID on the end of the URL count as an error. This means I have a huge number of on-page errors indicated by SEOmoz, plus no ranking for my translated pages - I think this is really harming my overall ranking and site trust. What are your opinions on this?
International SEO | sparkit
-
Best domain for Spanish-language site targeting ALL Spanish territories?
Hi, we have a strong .com domain and are looking to launch a site for Spanish speakers (i.e. Latin America + Spain). We already have various sites for other foreign-language markets (e.g. ourdomain.co.uk, us.ourdomain.com, ca.ourdomain.com, ourdomainchina.com, ourdomainindia.com, etc.), and we already have a B2B site ourdomain.com-es which will remain the same. I'm thinking best practice would be to launch translated copy for the following: ourdomain.com/es, ourdomain.com/cl, ourdomain.com/mx, ourdomain.com/pt, etc. Firstly, is this the best option? Secondly, I'm really interested to hear whether there is a less time/resource-intensive route that would give us visibility in ALL Spanish-speaking territories. Also - if we go with just one of the above (e.g. ourdomain.com/cl), how likely are we to get traction in other Spanish-speaking territories? Any help much appreciated!
International SEO | KevinDunne
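For the question above, one commonly suggested pattern is to pair the country folders with a generic es annotation that acts as a catch-all for Spanish speakers in countries without a dedicated folder. A rough, hypothetical sketch - the folder paths come from the question, everything else (the mapping, the helper, treating /es/ as both generic Spanish and Spain) is an assumption, not a recommendation:

```python
# Hypothetical sketch: emit the hreflang alternate lines for one <url> entry in an
# hreflang sitemap. The generic "es" code is the fallback for all Spanish speakers,
# while es-es / es-mx / es-cl target specific markets. Paths are illustrative.
SPANISH_CLUSTER = {
    "es":    "https://ourdomain.com/es/",   # generic Spanish fallback
    "es-es": "https://ourdomain.com/es/",   # Spain (could also be its own folder)
    "es-mx": "https://ourdomain.com/mx/",
    "es-cl": "https://ourdomain.com/cl/",
}

def sitemap_alternates(cluster):
    """Yield the xhtml:link alternate lines for one URL entry in an hreflang sitemap."""
    for code, url in sorted(cluster.items()):
        yield f'<xhtml:link rel="alternate" hreflang="{code}" href="{url}" />'

if __name__ == "__main__":
    print("\n".join(sitemap_alternates(SPANISH_CLUSTER)))
```
-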
How can I see what my web site looks like from a different country?
I've tried a few proxy tools to try to see how my site looks from other global locations, but haven't found one that works very well yet - nor a list of reliable proxies around the world. I need to do this to test various geo-targeted ads and other optimizations. Can anyone make a recommendation? Thanks!
International SEO | Dennis-52961
-
What countries does Google crawl from? Is it only US or do they crawl from Europe and Asia, etc.?
Where does Google crawl the web from? Is it in the US only, or do they do it from a European base too? The reason for asking is for GeoIP redirection. For example, if a website is using GeoIP redirection to redirect all US traffic to a .com site and all EU traffic to a .co.uk site, will Google ever see the .co.uk site?
International SEO | Envoke-Marketing