Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
What is the best way to execute a geo redirect?
Based on what I've read, it seems like everyone agrees an IP-based, server-side redirect is fine for SEO if you have content that is "geo" in nature. What I don't understand is how to actually do this. After a bit of research, it seems like there are 3 options:

- You can do a 301, which most sites seem to do, but that basically means if Google crawls you from different US areas (which it may or may not) it essentially thinks you have multiple homepages. Does Google only crawl from SF-based IPs?
- A 302 passes no juice, so you probably don't want to do that.
- Yelp does a 303 redirect, which it seems like nobody else does, but Yelp is obviously very SEO-savvy. Is this perhaps a better way that solves the above issues?

Thoughts on the best approach here?
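For reference, here is a minimal sketch of what an IP-based, server-side geo redirect can look like, assuming a Python/Flask application. The lookup_country() function is a placeholder for whatever GeoIP database or service you actually use, and the /de/ path is purely illustrative:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

def lookup_country(ip):
    """Placeholder for a real GeoIP lookup (local database or external API).
    Should return an ISO country code such as 'US' or 'DE', or None if unknown."""
    return None  # replace with a real lookup

@app.route("/")
def home():
    # Respect a proxy/CDN forwarding header if present, otherwise use the socket address.
    ip = request.headers.get("X-Forwarded-For", request.remote_addr)
    ip = ip.split(",")[0].strip()  # first hop if the header is a comma-separated chain

    country = lookup_country(ip)
    if country == "DE":
        # A temporary redirect ties the redirect to this visitor's location
        # rather than telling crawlers the homepage has permanently moved.
        return redirect("/de/", code=302)

    return "English homepage"

if __name__ == "__main__":
    app.run()
```

The status code passed to redirect() is the only thing that changes between the 301, 302, and 303 variants discussed in this thread; the geolocation logic is the same either way.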
You are welcome. Hmmm, I don't know about Yelp; I've seen others using 303 too, but 302 still seems to be the way to go.
Thanks Federico. Any insight as to why Yelp, which is very SEO-savvy, uses a 303?
Well, personally I would go with a 302. The reasons are:

301: the browser "remembers" that 301, so the next time the user requests that page, their browser will automatically redirect just as it did the last time it accessed the page. A 302, however, being a temporary redirect, lets the browser know that it should re-request the page.

Say your website www.example.com holds an English version in the root and a German version at www.example.com/de. If a German user accesses the site for the first time, you do the geolocation check and redirect to the German version, saving the chosen version in a session/cookie. Then, if the user chooses to switch to the English version, you update that cookie/session to store the version they chose and make a 302 redirect. The next time the user visits, the cookie will automatically show/redirect them to the appropriate language. Using the same example, if you did a 301, then even if the user changed the language, the browser would already have the permanent 301 cached and they would keep being redirected to the "first version served".

SEO-wise, if we take a quick look at Google, they use a 302 to redirect users to the "appropriate" version, so I guess that should be OK as long as you use rel="alternate" to point to the other versions of your site: https://support.google.com/webmasters/answer/189077?hl=en

EDIT: link juice flows to the page that the link is pointing to. Say a link points to www.example.com; then the juice goes to www.example.com, even if that page has a redirect to the German version (when accessed from Germany). Anyway, it is said that 302s also pass some PageRank.

Hope that helped.
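Here is a sketch of the cookie-plus-302 pattern described in the answer above, again assuming a Python/Flask app. lookup_country(), the "lang" cookie name, and the /de/ path are illustrative placeholders, not anything prescribed in the thread:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

def lookup_country(ip):
    """Placeholder for a real GeoIP lookup; returns e.g. 'DE' or None."""
    return None

@app.route("/")
def english_home():
    # 1. An explicit choice stored in a cookie wins over geolocation.
    preferred = request.cookies.get("lang")
    if preferred == "de":
        return redirect("/de/", code=302)
    if preferred == "en":
        return "English homepage"

    # 2. First visit: geolocate and 302 to the German version if appropriate,
    #    remembering the decision so later visits skip the lookup.
    if lookup_country(request.remote_addr) == "DE":
        response = redirect("/de/", code=302)
        response.set_cookie("lang", "de")
        return response

    return "English homepage"

@app.route("/switch/<lang>")
def switch_language(lang):
    # The language switcher updates the cookie and 302s to the chosen version,
    # so the browser never caches the redirect the way it would a 301.
    target = "/de/" if lang == "de" else "/"
    response = redirect(target, code=302)
    response.set_cookie("lang", lang)
    return response

if __name__ == "__main__":
    app.run()
```

The rel="alternate" hreflang annotations mentioned in the answer would live in each language version's HTML head (or in the sitemap), independently of this redirect logic.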
Related Questions
- How to remove subdomains in a clean way?
Hello, I have a main domain, example.com, where I have my main content, and I created 3 subdomains: one.example.com, two.example.com and three.example.com. I think the low ranking of my subdomains is affecting the ranking of my main domain, the one I care the most about, so I decided to get rid of the subdomains. The thing is that only for one.example.com could I transfer the content to my main domain and create 301 redirects. For the other two subdomains I cannot integrate the content into my main domain, as it doesn't make sense. What's the cleanest way to make them disappear? Just put a redirect to my main domain even if the content is not the same, or change the robots to "noindex" and put a 404 page in the index of each subdomain? I want to use the approach that will harm my performance in Google the least. Regards! On-Page Optimization | Gaolga0
- How long should I leave an existing web page up after a 301 redirect?
I've been reading through a few blog posts here on Moz and can't seem to find the answer to these two questions. How long should I leave an existing page up after a 301 redirect? The old page is no longer needed but has pretty high page authority. If I take the old page down (the one I'm redirecting from) immediately after I set up the 301 redirect, will link juice still be passed to the new page? My second question: right now, on my index.html page I have both a 301 redirect and a rel canonical tag in the head. They were put in place to redirect and to pass link equity, respectively. I did this a couple of years back after someone recommended that I do both just to be safe, but from what I've gathered reading the articles here on Moz, you're supposed to pick one or the other depending on whether or not it's permanent. Should I remove the rel canonical tag, or would it be better to just leave it be? On-Page Optimization | ScottMcPherson0
- Best practice for Portfolio Links
I have a client with a really large project portfolio (over 500 project images), which causes their portfolio page to have well over the 100 links that are recommended. How can I reduce this without reducing the number of photos they can upload? On-Page Optimization | HochKaren0
- Is there a way to prevent Google Alerts from picking up old press releases?
I have a client that wants a lot of old press releases (PDFs) added to their news page, but they don't want these to show up in Google Alerts. Is there a way for me to prevent this? On-Page Optimization | IdeaGarden0
- What's the best practice for handling duplicate content of product descriptions with a drop-shipper?
We write our own product descriptions for merchandise we sell on our website. However, we also work with drop-shippers, and some of them simply take our content and post it on their site (same photos, exact ad copy, etc.). I'm concerned that we'll lose the value of our content because Google will consider it duplicated. We don't want the value of our content undermined. What's the best practice for avoiding any problems with Google? Thanks, Adam On-Page Optimization | Adam-Perlman0
- What is the best setup for canonical links?
Should I have the canonical link state:
1. www.autoinsurancefremontca.com
2. www.autoinsurancefremontca.com/index.html
3. autoinsurancefremontca.com
Also, do you need a canonical link on each page if you have more than one page on your site? On-Page Optimization | Greenpeak0
- 301 redirects from several sub-pages to one sub-page
Hi! I have 14 sub-pages I deleted earlier today, but of course Google can still find them, and anyone who tries them gets a 404 error. I have come to the understanding that this will hurt the rest of my site, at least as long as Google has them indexed. These sub-pages live in 3 different folders, and I want to redirect them to a sub-page in folder number 4. I already have an htaccess file, but I simply can't get it to work! It is the same file I use for redirecting traffic from mydomain.no to www.mydomain.no, and I have tried every variation I can think of with the sub-pages. Has anyone had the same problem before, or for any other reason has the solution, and can help me with how to compose the htaccess file? 🙂 You have to excuse me if I'm using the wrong terms, missing something I should have seen underwater while wearing a blindfold, or misspelling anything. I am neither very experienced with SEO or anything else to do with the internet, nor am I from an English-speaking country. Hope someone here can light up my path 🙂 That's at least something you can say in Norwegian... On-Page Optimization | MarieA1
- Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc.)
Cooking Method (fry, bake, boil, steam, etc.)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search-results URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as: setting a canonical tag; adding these URL variables to Google Webmasters to tell Google to ignore them; or changing the title tag in the head dynamically based on what URL variables are present. However, I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here, as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards. On-Page Optimization | smaavie
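Of the options listed in that last question, the canonical-tag plus dynamic-title combination can be sketched as plain logic, independent of the site's PHP setup. This is a hypothetical illustration in Python; the parameter names come from the example URLs above (including the "preperation-time" spelling used there), while the helper names and title wording are made up:

```python
from urllib.parse import urlencode

# Filters that define the "content" of a results page. Pagination ("start") is
# deliberately excluded so every page of the same search shares one canonical URL.
CANONICAL_PARAMS = ["course", "cooking-method", "preperation-time"]

def canonical_url(query_params):
    """Build the canonical URL for a search-results page from its query parameters."""
    kept = {k: query_params[k] for k in CANONICAL_PARAMS if k in query_params}
    if not kept:
        return "/find-a-recipe.php"
    # Sorting keeps the same filters in the same order regardless of how the
    # visitor's URL happened to arrange them.
    return "/find-a-recipe.php?" + urlencode(sorted(kept.items()))

def page_title(query_params):
    """Build a short, distinct <title> from whichever filters are present."""
    parts = [query_params[k] for k in CANONICAL_PARAMS if k in query_params]
    if not parts:
        return "Find a recipe"
    return "Recipes: " + ", ".join(parts)

# Example: both pages of the same salad search share one canonical and one title.
print(canonical_url({"course": "salad", "start": "30"}))  # /find-a-recipe.php?course=salad
print(page_title({"course": "salad", "start": "30"}))     # Recipes: salad
```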