Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content, and many posts will remain viewable, we have locked both new posts and new replies.
What are the effects of having multiple redirects for pages under the same domain?
-
Dear Mozers,
First of all, let me wish you all a Very Happy, Prosperous, Healthy, Joyous & Successful New Year!
I'm trying to analyze one of the websites, Web Hosting UK Com Ltd, and during this process I've had this question running through my mind. The project has been live since 2003, and since then there have been changes made to the website (obviously). New pages have been added, and some existing pages have been overwritten, with changes to the URL structures too.
Now, coming back to the question: if I had a particular URL structure when the site debuted, and to date that structure has been changed three times (for example), with a 301 redirect applied to every outdated structure, would it impact the site's SEO performance? And let's say there are hundreds of such redirects under the same domain; don't you think that after a period of time we should remove the past pages/URLs from the server? That would certainly increase the 404 (page not found) errors, but those can be taken care of.
How sensible is it to keep redirecting bots from one URL to another when they only visit a site for a short, limited time?
To make it simple, let me explain with a real-life scenario. Say I was staying at place A, then switched to a different location in another county, say B, then to C and so on, and finally settled at place G. Each time I move, I leave a note of my next destination so that any courier or mail can be forwarded to my current whereabouts. In that case there's less chance the courier would travel through all the destinations to deliver the package. Similarly, when a bot visits a domain and finds multiple redirects, don't you think it would lose efficiency in crawling the site?
Of course, in my opinion the redirects are important, BUT they should only be in place (in .htaccess) for a period of, say, 3-6 months. Once the search engine bots know about the latest pages, the past pages/redirects should be removed.
What are your opinions about this?
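As a quick illustration of the "how many hops" question, here is a minimal sketch (the URLs are hypothetical, and it assumes the Python requests library is installed) that reports how many redirect hops each legacy URL takes before it reaches a page that answers 200:

```python
# Minimal sketch: count redirect hops for a few legacy URLs.
# Assumptions: hypothetical example.com paths; the `requests` library is installed.
import requests

legacy_urls = [
    "https://www.example.com/old-structure-2003/hosting.html",
    "https://www.example.com/old-structure-2008/hosting/",
    "https://www.example.com/old-structure-2012/web-hosting/",
]

for url in legacy_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = len(resp.history)                      # one entry per intermediate redirect
    chain = [r.url for r in resp.history] + [resp.url]
    print(f"{hops} hop(s): " + " -> ".join(chain))
    if hops > 1:
        print("   -> consider pointing the first URL straight at the final one")
```

Anything reporting more than one hop is a chain that could be flattened.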
-
Both answers so far get to one of the points I was going to make, which is to always update redirects so that there is no chain, but I wanted to add something else. You only need redirects as long as someone is linking to those pages. You should take the time to fix any internal references to changed URLs, and to contact websites that link to the old URLs and ask them to update their links. That should be part of any site URL change.
If you have only revised your URLs once, you only need redirects for 3-6 months while the search engines reindex everything. In that time, you should have changed all links to the old URLs.
In your case, I'd drop all old redirects except for the last one and see what 404s you get. Find the referring sites and contact them to change their links to your site. Once that is all done, you can work on this latest revision and change those links.
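For the "see what 404s you get" step, a rough sketch like the one below (assuming an Apache/Nginx combined-format access log at a hypothetical path) groups 404 hits by the referring page, so you know which sites to contact:

```python
# Rough sketch: list the top referrers still sending visitors to 404 pages.
# Assumptions: combined log format; the log path below is hypothetical.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"

# ... "GET /path HTTP/1.1" 404 1234 "http://referrer/" "User-Agent" ...
line_re = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+)[^"]*" (?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"'
)

referrers = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = line_re.search(line)
        if m and m.group("status") == "404" and m.group("referer") not in ("", "-"):
            referrers[m.group("referer")] += 1

for ref, hits in referrers.most_common(20):
    print(f"{hits:5d}  {ref}")
```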
Hope that helps!
-
It is always best to do a one-to-one redirect instead of a chain. As Federico said, there is some PageRank loss when doing a redirect (though the exact amount is debatable and may be negligible), and redirecting A to B to C compounds the problem. On top of that, too many redirects in a chain will lead Googlebot to stop crawling the chain. One or two is fine; three or more is not.
In this older video http://youtu.be/r1lVPrYoBkA Matt Cutts starts talking about redirect chains at around 2:48 and mentions that one, two, and maybe three in a chain is fine. This Whiteboard Interview from 2010 with Matt Cutts http://a-moz.groupbuyseo.org/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more also mentions the one or two 301s in a chain. So if you're redirecting A -> B -> C -> D -> E -> F... you're possibly hurting yourself. Where possible, you should change the redirects so it's A to F, B to F, C to F, D to F and E to F.
As for removing the redirects after a certain number of months, I'd check how many people are still linking in with the older URL. You'd want to ask sites linking in to update to the newest URL before you 404 it and lose those links. And if you're still getting tons of direct traffic coming in on an old 301, you might want to do some digging and research before you cut off that traffic. The odds are that after a few months you wouldn't be getting as much traffic coming through on the older URL, but there is always the possibility.
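If you already have the old-to-new mappings recorded somewhere, flattening them is mechanical. Here is a small sketch (the paths are made up for illustration) that follows each chain to its final destination and prints one direct, mod_alias-style Redirect line per old URL, which could then go into .htaccess:

```python
# Sketch: collapse redirect chains so every old path points straight at its
# final destination. The paths below are made up for illustration.

redirects = {          # old path -> where it currently redirects
    "/a": "/b",
    "/b": "/c",
    "/c": "/d",
    "/d": "/f",
    "/e": "/f",
}

def final_target(path, redirect_map):
    """Follow the chain from `path` until we reach a path that no longer redirects."""
    seen = set()
    while path in redirect_map and path not in seen:
        seen.add(path)
        path = redirect_map[path]
    return path

for old in sorted(redirects):
    print(f"Redirect 301 {old} {final_target(old, redirects)}")
# Output: every old path (/a through /e) redirects directly to /f, no chains.
```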
-
Every time you make a 301 redirect, some of the PageRank is diluted. So following your example, when going from A to C you should redirect both A and B to C, not A -> B -> C, as that doubles the loss.
Redirects are just fine, and in my opinion they should stay for as long as the pages being redirected still get organic traffic (backlinks, search, etc.). The moment you see no more traffic, and the links pointing to the redirected page have been fixed (they point to the new page), you can safely remove the redirection. As for the number of redirects, it won't be a problem if you have lots of them, unless you do multiple redirects from A to G, hopping from one page to the next until reaching the final, working version.
If that's not your scenario and A redirects directly to G, then you are fine. Monitor traffic on A and see if at some point you can remove the redirection; otherwise just leave it there. (I personally have redirects that have been in place for over 3 years, as the pages are still getting organic traffic, mainly from links.)
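One way to do that monitoring, if you have access to the raw access logs, is a quick tally of requests to the old paths. A sketch (with a hypothetical log location and made-up legacy paths) might look like this; once a path has gone quiet for a few months, its redirect is a candidate for retirement:

```python
# Sketch: count how often old, redirected URLs are still being requested.
# Assumptions: hypothetical log location and legacy paths.
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"
legacy_paths = {"/old-structure-2003/hosting.html", "/old-structure-2008/hosting/"}

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        parts = line.split('"')
        if len(parts) > 1:
            request = parts[1].split()        # e.g. ['GET', '/path', 'HTTP/1.1']
            if len(request) > 1 and request[1] in legacy_paths:
                hits[request[1]] += 1

for path, count in hits.most_common():
    print(f"{count:5d}  {path}")
```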
Hope that helps! And a happy new year to you too!
Related Questions
-
Is an iframe redirect on the same domain bad for SEO?
Good morning. We have a vendor that has created a landing page with content that we want to use. Because of the way we built the site, the only way to use the content is to create an iframe. The iframe is redirecting on the same domain. Would we benefit from the SEO content?
Intermediate & Advanced SEO | | jdenbo_edf0 -
Should I redirect a domain we control but which has been labeled 'toxic' or just shut it down?
Hi Mozzers: We recently launched a site for a client, which involved bringing in and redirecting content that had formerly been hosted on different domains. One of these domains still exists, and we have yet to bring over the content from it. It has also been flagged as a suspicious/toxic backlink source to our new domain. Would I be wise to redirect this old domain, or should I just shut it down? None of the pages seem to have particular equity as link sources. Part of me is asking myself, 'Why would we redirect a domain deemed toxic? Why not just shut it down?' Thanks in advance, dave
Intermediate & Advanced SEO | | Daaveey0 -
What are the best page titles for sub-domain pages?
Hi Moz community, Let's say a website has multiple sub-domains with hundreds or thousands of pages. Generally we mention the "primary keyword" & "brand name" on every page of the website. Can we do the same on all pages of the sub-domains to increase the website's authority for this primary keyword in Google? Or will it have a negative impact if Google considers it duplicate content, with the same keyword and brand name mentioned on every page of both the main website and all the sub-domain pages? Thanks
Intermediate & Advanced SEO | | vtmoz0 -
Consolidating Multiple Domains into A Single Domain
I have a client whose website is an amalgamation of multiple domains. jacksonhole.net is the main domain, but the site passes traffic back and forth between the following domains/sites. My question is, would it be better for SEO to consolidate all of these domains under the single high-authority domain and 301 redirect the rest, or is that a really bad idea? Thanks for your help. jacksonhole.net (Domain Authority 31) jackson-hole-rental-condos.com (Domain Authority 22) jackson-hole-rental-homes.com (Domain Authority 21) jacksonholehotelguide.com (Domain Authority 19)
Intermediate & Advanced SEO | | dbaxa-2613381 -
Multiple Landing Pages and Backlinks
I have a client that does website contract work for about 50 governmental county websites. The client has the ability to add a link back in the footer of each of these websites. I want my client to get backlink juice for a different key phrase from each of the 50 agencies (basically just my key phrase with the different county name in it). I also want a different landing page to rank for each term. The 50 different landing pages would be a bit like location pages for local search. Each one targets a different county. However, I do not have a lot of unique content for each page. Basically, each page would follow the same format (but reference a different county name, plus 10 different links from each county website). Is this a good SEO backlink strategy? Do I need more unique content for each landing page in order to prevent duplicate content flags?
Intermediate & Advanced SEO | | shauna70840 -
301 redirection pointing to noindexed pages
I have a rather unusual situation where a recently launched affiliate site does not have any unique content, as it's all syndicated content. For that reason we are currently using the noindex,nofollow meta tags to keep the pages out of the search engines' index until we create unique content for the pages. The problem is that, due to a very tight timeframe with rebranding, we are looking at 301 redirecting (on a page-to-page basis) another high-authority legacy domain to this new site before we have had a chance to add unique content to it and remove the noindex,nofollow tags. I would assume that any link authority normally passed through the 301 would be lost in this scenario, but I'm uncertain what the broader impact might be. Has anyone dealt with a similar scenario? I know this scenario is not ideal, and I would rather wait until the unique content is up and the noindex tags are removed before launching the 301 redirect of the legacy domain, but there are a number of competing priorities at play outside of SEO.
Intermediate & Advanced SEO | | LosNomads0 -
Any way to find which domains are 301 redirected to competitors' websites?
Looking at the work of an SEO colleague, it became clear that his weak link-building graph is probably not the cause of his good rankings for a pretty competitive keyword (also, no social mentions were found). I was wondering what it could be; site structure and other on-page optimization factors seem to be OK, and I don't think there will be exceptionally good or bad user behavior... Finally I looked at the competitors and found that they have more links, better content and better design, so I got a little stuck. The only reason I can think of is that he is using 301 redirects (or rel=canonical tags). Is there a way to trace these redirects back to the source in order to include this important variable in your competitor research? Thanks
Intermediate & Advanced SEO | | djingel10 -
Cookies and redirects - what are the negative effects?
I am advising a client who wants to streamline their online customers' experience through the use of cookies. The first time someone visits mysite.com, they will see the normal index page, and on that page they will be asked to identify themselves as a Personal or Business customer and be taken through to the relevant page. This will result in a cookie being set. The next time they come back to mysite.com, the cookie will automatically direct them from the index page to mysite.com/personal/ or mysite.com/business/. My question is, what are the SEO implications of this, especially given that the index page is their primary landing page for almost all organic traffic? Bots: I realise that Googlebot etc. do not store cookies, so this should result in no change from the bot's perspective (i.e. no redirect), but is it that simple? In effect we'll be showing the bot one thing and returning visitors something else. Is this not effectively cloaking? All advice gratefully received!
Intermediate & Advanced SEO | | seomasters0