Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Best and easiest Google Depersonalization method
-
Hello,
Moz hasn't written anything about depersonalization for years. This article has methods, but I don't know if they are still valid.
What's an easy, effective way to depersonalize Google search these days? I would just log out of Google, but that shows different ranking results than Moz's rank tracker for one of our main keywords, so I don't know if that method is correct.
Thanks
-
Thanks Rand, really appreciate it!
-
Hi Rand,
Thanks for jumping in and helping us all out. Your response is much appreciated.
Regards,
Vijay
-
I'm surprised at how well this still works, but it does:
1. Use an incognito browser window to remove account personalization
2. Use a query string like this: https://google.co.nz/search?q=your+keyword+terms&gl=us
With step 2 above, you're removing the geographic bias of any particular region/IP address by searching on Google New Zealand and then re-geo-locating the search to the US via the gl parameter. This gives you non-geo-biased results.
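The query-string trick above is easy to script. Here's a minimal Python sketch using only the standard library; `q` and `gl` are the same parameters shown in the URL above, though how Google actually interprets them for ranking is, of course, up to Google:

```python
from urllib.parse import urlencode

def depersonalized_search_url(keyword, country="us", google_domain="google.co.nz"):
    """Build a search URL on an out-of-region Google domain,
    then re-geo-locate it with the gl parameter."""
    params = {"q": keyword, "gl": country}
    return f"https://{google_domain}/search?{urlencode(params)}"

url = depersonalized_search_url("your keyword terms")
print(url)  # → https://google.co.nz/search?q=your+keyword+terms&gl=us
```

You can swap in any out-of-region domain and target country; the point is simply that the domain and the gl value disagree, which strips the default geo bias.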
If you want to see how specific results look from a particular region, there are two semi-decent options:
A) Use Google's Ad Preview Tool: https://adwords.google.com/apt/anon/AdPreview?__u=1000000000&__c=1000000000
B) Use the &near parameter, e.g. https://google.co.nz/search?q=your+keyword+terms&gl=us&near=seattle+wa
-
Yes, this is one of many factors in depersonalization, and there may be more hidden factors we have yet to discover.
I've done a lot of research on this. I use a dedicated PC with a VPN for checking keyword SERP ranks for my clients; since they come from many different countries and target different audiences, we try to replicate the results for each scenario.
I hope this helps.
-
So am I correct that logging out and adding &pws=0 is not enough?
-
Hi There,
In addition to the methods suggested for depersonalizing results, there are a few more things to consider. You may also like to read a blog post I wrote on my website, Impact of Personalized Search Results.
An incognito window doesn't mean the history of your previous browsing is deleted; you still need to clear your browsing history and cookies.
Use a VPN or proxy to get results from different locations and countries. This gives you the best idea of your SERP status in each country.
I hope this helps; please feel free to ask more questions by replying.
Regards,
Vijay