How to get global search results on Google? Also, is it possible to get results based on some other geographic location?
-
I don't want results based on my geographic location. When I am in India, I don't want local search results. In fact, I want results which are not dependent on my current location.
Also, can I change my current location to some other city, and will it affect the results? For example: while I am in London, can my search results be modified as if I were sitting in New York?
-
Hello,
1. Choose the Google domain: .com, .co.uk, .com.au and so on. If you get redirected to your local version, type /ncr at the end of the address (e.g. google.com/ncr).
2. Search for something, then look at "Show search tools" on the left of the screen.
3. Under "All results" you will have an option called "Custom location".
4. Enter the city and search.

P.S. It's better if you delete your history beforehand and log out of your account.
The end.
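The steps above can also be approximated with an explicit search URL. Below is a minimal sketch in Python, assuming the commonly documented gl (country), hl (language) and pws=0 (non-personalised results) parameters - Google may still localise based on your IP, which is why the /ncr step and logging out are still worth doing.

```python
from urllib.parse import urlencode

def build_search_url(query, country="us", language="en"):
    """Build a Google search URL that asks for results tied to an explicit
    country and language rather than whatever Google infers from your
    location. The gl/hl/pws parameters are an assumption based on how they
    are commonly documented; Google may still localise by IP."""
    params = {
        "q": query,
        "gl": country,    # country to bias results toward, e.g. "us", "gb", "in"
        "hl": language,   # interface/result language
        "pws": 0,         # ask for non-personalised results
    }
    # Visiting google.com/ncr first ("no country redirect") keeps you on the
    # .com domain rather than your local country version.
    return "https://www.google.com/search?" + urlencode(params)

print(build_search_url("coworking space", country="us"))
print(build_search_url("coworking space", country="in"))
```

Changing the gl value is roughly the country-level equivalent of the "Custom location" option in step 3.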
-
I do two things:
- If you use Google AdWords, use the 'Ad Preview and Diagnosis' tool. You can define the domain, language, device and even the location.
- Add /ncr at the end of the Google URL you want to use. For example: Google.com (www.google.com/ncr), Google.co.uk (www.google.co.uk/ncr), Google.de (www.google.de/ncr).
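If you test several country versions regularly, a tiny helper that opens each /ncr URL saves some typing - a trivial sketch, with a purely illustrative domain list:

```python
import webbrowser

# Country-specific Google domains to test; this list is only illustrative.
domains = ["google.com", "google.co.uk", "google.de", "google.com.au"]

for domain in domains:
    # /ncr ("no country redirect") stops Google bouncing you back to the
    # local country version of the domain.
    webbrowser.open(f"https://www.{domain}/ncr")
```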
-
What you can also do is change the search domain. Go to google.co.uk and make sure you aren't still on Google.com or Google.in - I do this a lot when testing and it works really well. For example, I have one client right now in Ireland, so if I want to see what they do, I change to google.ie.
Andy
-
Related Questions
-
How long does Google take to crawl a single site?
Lately I have been thinking: when a crawler visits an already visited or indexed site, how long does its scan take?
Algorithm Updates | Sam09schulz0
-
Does Google ignore the page title suffix?
Hi all, it's common practice to add the "brand name" or "brand name & primary keyword" as a suffix on EVERY page title. That means we are using the "primary keyword" across all pages while expecting the homepage to rank best for that "primary keyword". Does Google still rank the pages accordingly, and how does it handle this? Will the default suffix with the primary keyword across all pages be ignored or devalued by Google when ranking certain pages? Or does the ranking of the website for the "primary keyword" improve just because it has been added to all page titles?
Algorithm Updates | vtmoz0
-
Using Google to find a discontinued product.
Hi Guys. I mostly use this forum for business questions, but now it's a personal one! I'm trying to find a supplier that might still have a discontinued product: the Behritone C5A speaker monitor. All my searches bring up a plethora of pages that appear to sell the product... but they have no stock. (Wouldn't removing these pages make for a better internet?) No second-hand ones on eBay 😞 Do you have any suggestions about how I can get more relevant results, i.e. find a supplier that might still have stock? Any tips or tricks I may be able to use to help me with this? Many thanks in advance to an awesome community 🙂 Isaac.
Algorithm Updates | isaac6631
-
Remove spam URL errors from Search Console
My site was hacked some time ago. I've since redesigned it and obviously removed all the injection spam. Now I see in Search Console that I'm getting hundreds of URL errors (from the spam links that no longer work). How do I remove them from Search Console? The only option I see is "mark as fixed", but obviously they are not "fixed", rather removed. I've already uploaded a new sitemap and fetched the site, as well as submitted a reconsideration request that has been approved.
Algorithm Updates | rubennunez0
-
Your search - site:domain.com - did not match any documents.
I've recently started work on a new client's website and done some preliminary on-page optimisation, and there is still plenty of work to be done and issues to resolve. They are ranking OK on Bing, but they are not getting any ranking on Google at all (except paid) - I tried the site:domain.com search and it comes up with no results... so this confirms that something is going on with their Google rankings! Can anyone shed light on what can cause this or why this would happen? My next step is to look at their Webmaster Tools (I haven't had access yet), but if anyone has any tips on resolving this or where to look, that would be great! Thanks!
Algorithm Updates | ElevateCreativeAU0
-
Google is forcing a 301 by truncating our URLs
Just recently we noticed that Google has indexed truncated URLs for many of our pages that get 301'd to the correct page. For example, we have http://www.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html as the URL linked everywhere, and that's the only version of that page that we use. Google somehow figured out that it would still go to the right place via 301 if they removed the HTML filename from the end, so they indexed just http://www.eventective.com/USA/Massachusetts/Bedford/107/
The 301 is not new. It used to 404, but (probably 5 years ago) we saw a few links come in with the HTML file missing on similar URLs, so we decided to 301 them instead, thinking it would be helpful. We've preferred the longer version because it has the name in it, and users who pay attention to the URL can feel more confident they are going to the right place. We've always used the full (longer) URL and Google used to index them all that way, but just recently we noticed about half of our URLs have been converted to the shorter version in the SERPs. These shortened URLs take the user to the right page via 301, so it isn't a case of the user landing in the wrong place, but over 100,000 301s may not be so good.
You can look at site:www.eventective.com/usa/massachusetts/bedford/ and you'll notice all of the URLs to businesses at the top of the listings go to the truncated version, but toward the bottom they have the full URL. Can you explain to me why Google would index a page that is 301'd to the right page and has been for years? I have a lot of thoughts on why they would do this, and even more ideas on how we could build our URLs better, but I'd really like to hear from some people who aren't quite as close to it as I am.
One small detail that shouldn't affect this, but I'll mention it anyway, is that we have a mobile site with the same URL pattern: http://m.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html. We did not have the proper 301 in place on the m. site until the end of last week. I'm pretty sure it will be asked, so I'll also mention we have the rel=alternate/canonical set up between the www and m sites. I'm also interested in any thoughts on how this may affect rankings, since we seem to have been hit by something toward the end of last week. Don't hesitate to mention anything else you see that may have triggered whatever may have hit us.
Thank you,
Michael
Algorithm Updates | mmac
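To make the redirect described above concrete, here is a minimal sketch, assuming a Flask app (the poster never says what their actual stack is, and the listing lookup here is hypothetical): the directory-style URL returns a permanent 301 to the full URL that ends in the .html filename.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical lookup from listing id to its canonical filename slug.
SLUGS = {"107": "Doubletree-Hotel-Boston-Bedford-Glen"}

@app.route("/<country>/<state>/<city>/<listing_id>/")
def truncated_url(country, state, city, listing_id):
    # A request for the shortened URL (no .html filename) is permanently
    # redirected (301) to the full, canonical URL described above.
    slug = SLUGS.get(listing_id)
    if slug is None:
        return "Not found", 404
    return redirect(f"/{country}/{state}/{city}/{listing_id}/{slug}.html", code=301)

@app.route("/<country>/<state>/<city>/<listing_id>/<slug>.html")
def full_url(country, state, city, listing_id, slug):
    # The canonical page that the shortened URL redirects to.
    return f"Listing page for {slug}"
```

A crawler following links to the shortened form will see the 301 and land on the full page, which matches the behaviour described in the question.
-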
Troubleshooting Decline of Branded Keyword Searches
Hi, over the past year I have seen a huge change in the distribution of our organic keyword traffic. I'm trying to research why our branded keywords have gone down. Google Analytics only shows me impressions for the past three months. Does anyone have ideas on how to explain this change in traffic? Please see the attached chart. Thanks! branded-v-nonbranded-organic-search.jpg
Algorithm Updates | netdiva_amy0
-
Stop Google indexing CDN pages
Just when I thought I'd seen it all, Google hits me with another nasty surprise! I have a CDN to deliver images, JS and CSS to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have - perhaps a scraper site? Google has decided that the static pages they were able to access through the CDN have more value than my real pages, and they seem to be slowly replacing my pages in the index with the static pages.
Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of to have a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that only contains the JS and CSS? Have you seen this problem and beaten it? (Of course, the next thing is Roger might look at Google results and start crawling them too, LOL)
P.S. The reason I am not asking this question in the Google forums is that others have asked it many times and nobody at Google has bothered to answer over the past 5 months, and nobody who did try gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet, because you guys are always willing to try.
Algorithm Updates | loopyal0