Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in Japan's Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that allows users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing the other regional pages as US pages, failing to detect the region because of our URL structure?
Below are examples of two of our URLs for reference - one from Canada, the other from the US
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem?
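For reference, this is roughly what that change would look like on the region selector links (the wrapper markup and labels are only illustrative; the paths are taken from the example URLs above):

<!-- Hypothetical sketch of the region drop-down with nofollow added;
     only the URL paths come from the real examples above. -->
<ul class="region-selector">
  <li><a href="/us/en/prod4130078/2500058/catalog20038/" rel="nofollow">United States (English)</a></li>
  <li><a href="/ca/en/prod4130078/2500058/catalog50008/" rel="nofollow">Canada (English)</a></li>
</ul>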
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
Here are all the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or http headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out hreflang tags to clue Google in that they're the same page, geotargeted to users in different locations; a rough markup sketch follows this list. I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here.
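As a rough sketch only (example.com stands in for your real domain, the paths are taken from your example URLs, and you would adjust the language-region codes to match each locale), the head of the Canadian English page could carry something along these lines:

<!-- Language/country hint that Bing reads; an HTTP "Content-Language: en-CA"
     response header is an alternative way to send the same signal. -->
<meta http-equiv="content-language" content="en-ca">

<!-- hreflang annotations telling Google these URLs are alternates of the same page
     aimed at different locales; each page should list every alternate, including itself. -->
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/en/prod4130078/2500058/catalog50008/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/en/prod4130078/2500058/catalog20038/" />

The US page would carry the same pair of link elements (with its own content-language value), so the annotations stay reciprocal.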
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
-
It's absolutely possible that's what's happening. You cannot rely on keeping Googlebot from crawling anything on your site, no matter how well you code it. Even if you marked those links nofollow, it would not stop the bot.
Another factor is that all your content appears to be in English (as your URL structure suggests). Google does a terrible job of separating international content when it is all in the same language on the same root domain.
Proper separation, in a way Google can't confuse, is vital. Since I expect you don't intend to change the language across sites, your best option would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google that "this domain is for this country"; to be even better off, you would also host that content on a server in that country.
Related Questions
-
International SEO - Targeting US and UK markets
Hi folks, I have a client who is based in Italy, and they have set up a site that sells travel experiences in the south of Italy (the site currently sits on a server in Italy). The site has been set up as a gTLD: www.example.com. They only want to target the US and UK markets to promote their travel experiences, and the site has only an English version (it does not currently offer an Italian version). If they decide to go with the gTLD and not change to ccTLDs (which would be ideal from my point of view), what are the steps to set this up correctly in GSC? They currently have only one property registered in GSC, www.example.com, so I guess the next steps are:
- Add a new property, www.example.com/uk, and set up geotargeting for the UK
- For the existing property, www.example.com/, set up geotargeting for the US
If the client does not have the budget to optimise the content for American and British English, would it still make sense to have two separate properties in GSC (example.com for the US market and example.com/uk for the UK market)? A further consideration: should we add canonical tags to avoid duplicate content across the two versions of the site (in the event there is no budget to differentiate the content for the US and UK markets)? Thank you all in advance for looking into this. David
-
Which Google does a .ie website get shown in?
Hi all, I am working on a .ie website and I was under the impression that if you have a regional domain like .ie, you will be limited to being shown in a search engine like google.ie. When I go to International Targeting in Google Search Console, it says the site is associated with: Ireland. Am I limiting my ability to rank well in worldwide Google searches with this domain, and if so, how can I counter this? Many thanks.
-
Issues with Baidu indexing
I have a few issues with one of my sites being indexed in Baidu and am not too sure how to resolve them:
1. Two subdomains were redirected to the root domain, but both (www. and another) subdomains are still indexed after ~4 months.
2. A development subdomain is indexed, despite no longer working (it was taken down a few months back).
3. There's conflicting information on the best approach to get HTTPS pages indexed in Baidu, and we can't find a good solution.
4. There are hundreds of variations of the home page (and a few other pages) on the main site, where Baidu has indexed lots of parameters. There doesn't appear to be anywhere in their webmaster tools to stop that happening, unlike with Google.
I'm not the one who deals directly with this site, but I believe that Baidu's equivalent of Webmaster Tools has been used where possible to correctly index the site. Has anyone else had similar issues and, if so, were you able to resolve them? Thanks
-
Massive jump in pages indexed (and I do mean massive)
Hello mozzers, I have been working in SEO for a number of years but have never seen anything like a jump in pages indexed of this proportion (the image is from the Index Status report in Google Webmaster Tools: http://i.imgur.com/79mW6Jl.png). Has anyone ever seen anything like this?
Anyone have an idea about what happened? One thing that sprang to mind is that the same pages might now be getting indexed in several more Google country sites (e.g. google.ca, google.co.uk, google.es, google.com.mx), but I don't know if the Index Status report in WMT works like that. A few notes to explain the context:
- It's an eCommerce website with service pages and around 9 different pages listing products. The site is small - only around 100 pages across three languages.
- 1.5 months ago we migrated from three language subdomains to a single subdomain with language directories. Before and after the migration I used hreflang tags across the board. We saw about a 50% uplift in traffic from unbranded organic terms after the migration (although on day one it was more like +300%), especially from more language diversity.
- I had an issue where the 'sort' links on the product tables were giving rise to thousands of pages of duplicate content, although I had used URL parameter handling to communicate to Google that these were not significantly different and to index only the representative URL. About 2 weeks ago I blocked them using the robots.txt (Disallow: *?sort). I never felt these were doing us too much harm in reality, although many of them are indexed and can be found with a site:xxx.com search.
- At the same time as adding *?sort to the robots.txt, I made an hreflang sitemap for each language, linked to them from an index sitemap, and added these to WMT. I added some country-specific alternate URLs as well as language, just to see if I started getting more traffic from those countries (e.g. xxx.com/es/ for Spanish, xxx.com/es/ for Spain, xxx.com/es/ for Mexico etc). I didn't seem to get any benefit from this.
- The Webmaster Tools profile is for the root domain xxx.com. We have a lot of other subdomains, including a blog that is far bigger than our main site. But looking at the Search Queries report, all the pages listed are on the core website, so I don't think it is the blog pages etc.
- I have seen a couple of good days in terms of unbranded organic search referrals - no spike or drop-off, but a couple of good days in keeping with recent improvements in these kinds of referrals.
- We have some software mirror subdomains that are duplicated across two websites: xxx.mirror.xxx.com and xxx.mirror.xxx.ca. Many of these don't even have sections, and Google seemed to be handling the duplication, always preferring to show the .com URL despite no cross-site canonicals in place.
Very interesting, I'm sure you will agree! THANKS FOR READING!
-
Poor Google.co.uk ranking for a UK based .net, but great Google.com
I run an extremely popular news & community website at http://www.onedirection.net, but we're having a few ranking issues in Google.co.uk. The site gets most of its traffic from the USA which isnt a bad thing - but for our key term "one direction", we currently don't rank at all on Google.co.uk. The site is located on a server based in Manchester, UK, and we used to rank very well earlier this year - fluttering about in position 5-7 most of the time. However earlier this year, around July, we started to fall down to page 2 or 3, and at the start of this month we don't rank at all for "one direction" on Google.co.uk. On Google.com however we're very strong, always on page one. We're definitely indexed on .co.uk, just not for main search term - which I find a bit frustrating. All the content on our site is unique, and we write 2-4 stories every day. We have an active forum too, so a lot of our content is user-generated. We've never had any "unnatural link building" messages in Webmaster Tools, and our link profile looks fine to me. Do we just need more .co.uk links, or are we being penalised for something? (I can't imagine what though). It certainly seems that way though. Another site, "www.onedirection.co.uk" which is never updated and has a blatant ad for something completely unrelated on its homepage, ranks above us at the moment- which I find quite frankly appalling as our site is pretty much regarded as the worlds most popular One Direction news and fan site. We've spent the last few months improving the page-load times of our site, and we've reduced any unneccesary internal linking on the site. Approx 2 months ago we launched a new forum on the site, 301'ing all the old forum links to the new one, so that could have had an impact on rankings - but we'd expect to see an impact on Google.com as well if this was an issue. We definitely feel that we should be ranking higher on Google.co.uk. Does anyone have any ideas what the iproblems could be? Cheers, Chris.
-
Is .in domain affecting international traffic inflow to my site?
My holiday website http://seekandhide.in/ was completed and went live in Feb 2012. Last month I got 83% of traffic from India and 3-5% each from the USA and UK. The rest is a mixed bag from other countries. This has largely been the trend for the last 3-4 months. I want to attract more organic traffic from the UK and the rest of Europe. My SEO consultant says that with a .in domain that will be difficult. My website currently features unique holiday properties in India that typically attract European tourists, so I don't think it is a product issue. But both website visits and sales enquiries remain primarily Indian, even though the total number of visitors has increased gradually over the last 6 months. My queries are:
1. Is it only the .in domain that's affecting the inflow of international traffic?
2. Is there anything that I can do to offset it?
3. I own seekandhide.co.uk too. Is there something I can do with that site without building a whole different website there? If I shift completely to .co.uk, I will have the same issue of being geographically limited and end up losing Indian traffic.
4. Is there something else that is not OK on the site that I am missing?
5. Advice that I get from a lot of consultants is to buy seekandhideindia.com, but I plan to add international properties in a couple of years, so that name would limit my appeal.
Thanks in advance! Sudha
-
Country name displayed after domain name in google SERP
Our online shop targets clients in the US and worldwide (same URL - no subdirectories - currency changes based on IP). When searching in google.ie or google.no for our site, Google displays "US" or "United States" after the URL in the SERPs, but for most of our US competitors it does not show the country. I deleted our Google Places listing 2 weeks ago, since I suspected it may be related, but no change so far. In Google Webmaster Tools we have targeted the shop domain to the United States, which may be another factor. Unfortunately we cannot undo this setting, since without it our US Google ranking for the most relevant competitive keyword drops from position 8 to position 100+. The server location is in Germany, which, despite lots of US links, US contact info and USD currency, apparently makes Google think that the site is not targeting the US. Does anybody know what triggers the country name in the SERPs (Google Places, Webmaster Tools or something else) and can give advice on whether there is any way to get rid of it?
-
IP Redirection vs. cloaking: no clear directives from Google
Hi there, here is our situation: we need to force an IP redirection to www.domain.com for our US users, and at the same time we have different country-specific subfolders in their own language, such as www.domain.com/fr. Our fear is that by forcing an IP redirection for US IPs, we will prevent Googlebot (which has a US IP) from crawling our country-specific subfolders. I didn't find any clear directives from Google representatives on that matter. In this video Matt Cutts says it's always better to show Googlebot the same content as your users: http://www.youtube.com/watch?v=GFf1gwr6HJw&noredirect=1, but on the other hand in another video he says "Google basically crawls from one IP address range worldwide because (they) have one index worldwide. (They) don't build different indices, one for each country". This seems like a contradiction to me... Thank you for your help!! Matteo