How long should I keep 301 redirects?
-
I have modified the URL structure of a whole section of a website and used a mod_rewrite 301 redirect to map the old URLs to the new structure. That was around three months ago, and I was wondering: how long should I keep this redirect in place?
As it is a new website, I am quite sure there are no links around that use the old URL structure, but I can still see Googlebot trying to access the old URLs from time to time. Shouldn't Googlebot learn from the 301 redirect and stop requesting the old URLs?
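For reference, the kind of mod_rewrite rule being described would look something like this; the section names below are placeholders rather than the site's actual paths:

```apache
# .htaccess at the site root -- the section names are placeholders
RewriteEngine On

# Permanently redirect the old section structure to the new one
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```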
-
Thanks to both of you for your answers. I will leave my rewrite rules in place, just in case. I guess they are not slowing down access to the website, as they are really simple and quite standard. I suppose Google just takes longer to "learn" the new URLs through the redirects.
-
I would agree with Marcus' answer on this one.
Another question, as an offshoot from this one, would be: what is the performance cost of checking each request against your .htaccess rules?
If the answer is none, or very little, then perhaps keeping the 301s forever isn't a bad idea at all. You never know how long people will keep linking to those 301'd pages. I personally have some pages just like this that have been redirecting visitors for as long as five years, because it's better to keep the link juice than to end up with 404s.
-
The simple answer is... forever, unless you have some pressing reason to remove them. Certainly, I would keep them in place until I knew the old URLs were no longer indexed and had received no requests for, say, a 12-month period.
If you have external links pointing to any of these old pages, then again, that is another argument for keeping them in place.
Just leave them there, comment them as legacy in your redirect rules (see the example below), forget about them, and focus on something else.
Hope this helps.
Marcus
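A "commented as legacy" block of the sort Marcus describes might look something like this in .htaccess (hypothetical paths, assuming Apache's mod_alias is available):

```apache
# --- Legacy redirects: old URL structure, retired a few months ago ---
# Left in place indefinitely; external links may still point at these URLs.
Redirect 301 /old-section/pricing.html /new-section/pricing/
Redirect 301 /old-section/features.html /new-section/features/
```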
Related Questions
-
1000 Pages on old website. What to do with the 301 redirects for this domain?
Hi Moz Community, I have a 301 redirect question... I just acquired an old domain:
- Totally in my niche
- Domain is 14 years old
- Website consists of 1,000 pages
- Great amount of backlinks
- Website has been offline for about 2 weeks
- Will place a new website online ASAP with a new URL structure

For the 50 best-scoring pages I wrote a new but fully comparable/related article, and I will put a 301 redirect from each of those old pages to the new ones. My question: what should I do with the 950 other URLs? Should I put a 301 redirect to the homepage? Should I forward those pages to the 404 page? Should I divide the 950 URLs across 301 redirects to the 50 new pages? Another solution, maybe? Any idea what would be the best solution so we can save as much Google juice as possible? Thanks in advance!
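Whichever option is chosen, the .htaccess mechanics would be a set of one-to-one rules for the 50 rewritten pages, optionally followed by a catch-all for the rest; a rough sketch with invented paths:

```apache
RewriteEngine On

# One-to-one 301s for the ~50 rewritten pages (all paths here are made up)
RewriteRule ^articles/widget-guide\.html$ /guides/widgets/ [R=301,L]
RewriteRule ^articles/widget-faq\.html$ /guides/widget-faq/ [R=301,L]

# Catch-all for the remaining old URLs, assuming they all lived under /articles/;
# point it at the homepage or a relevant hub page, depending on the route chosen
RewriteRule ^articles/ / [R=301,L]
```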
Technical SEO | snorkel0 -
301 Redirects Relating to Your XML Sitemap
Let's say you've got a website and it had quite a few pages that, for lack of a better term, were like an infomercial: 6-8 pages on slightly different topics all essentially saying the same thing. You could all but call it spam.
www.site.com/page-1
www.site.com/page-2
www.site.com/page-3
www.site.com/page-4
www.site.com/page-5
www.site.com/page-6
Now you've decided to consolidate all of that information into one well-written page, and while the previous pages may have been a bit spammy, they did indeed have SOME juice to pass through. Your new page is: www.site.com/not-spammy-page
You then 301 redirect the previous 'spammy' pages to the new page. Now the question: do I immediately re-submit an updated XML sitemap to Google, which would NOT contain the old URLs, thus making me assume Google would miss the 301 redirects/SEO juice? Or do I wait a week or two, allow Google to re-crawl the site and see the existing 301s, and submit an updated sitemap once they've taken notice of the changes? Probably a stupid question, I understand, but I want to ensure I'm following best practices given the situation. Thanks, guys and girls!
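Mechanically, the consolidation described here is just a handful of permanent redirects; a minimal .htaccess sketch using the example URLs from the question:

```apache
# Consolidate the old near-duplicate pages into the single new page
Redirect 301 /page-1 /not-spammy-page
Redirect 301 /page-2 /not-spammy-page
Redirect 301 /page-3 /not-spammy-page
Redirect 301 /page-4 /not-spammy-page
Redirect 301 /page-5 /not-spammy-page
Redirect 301 /page-6 /not-spammy-page
```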
Technical SEO | Emory_Peterson0 -
Increase 404 errors or 301 redirects?
Hi all, I'm working on an e-commerce site that sells products that may only be available for a certain period of time, e.g. a product may only be sold for one year and then be permanently out of stock. When a product goes out of stock, the page is removed from the site regardless of any links it may have gained over time. I am trying to figure out the best way to handle these permanently out-of-stock pages. At the moment, the site is set up to return a 404 page for each of these products, and there are currently 600 (and increasing) instances of this appearing in Google Webmaster Tools. I have read that too many 404 errors may have a negative impact on your site, and so thought I might 301 redirect these URLs to a more appropriate page. However, I've also read that too many 301 redirects may have a negative impact on your site. I foresee this becoming an issue several years down the road, when the site has thousands of expired products resulting in thousands of 404 errors or 301 redirects, depending on which route I take. Which would be the better route? Is there a better solution?
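For illustration only, if the 301 route were chosen, each retired product URL would be pointed at the most relevant page that still exists; a sketch with made-up paths:

```apache
# Retired product URLs pointed at the closest relevant remaining page (paths are made up)
Redirect 301 /products/2012-summer-widget /category/widgets/
Redirect 301 /products/limited-edition-gadget /category/gadgets/
```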
Technical SEO | Oxfordcomma0 -
Can I remove 301 redirects after some time?
Hello, We have a very large number of 301 redirects on our site and would like to find a way to remove some of them. Is there a time frame after which Google no longer needs a 301? For example, if A is 301 redirected to B, does Google know after a while not to serve A any more and to replace any requests for A with B? How about any links that go to A? Or is the only option to have all links that pointed to A point to B instead, after which the 301 can be removed? Thank you for your help!
Technical SEO | Veva0 -
Do search engines treat 307 redirects differently from 302 redirects?
We will need to send our users to an alternate version of our homepage for a few hours for a certain event. The SEO task at hand is to minimize the chance of the special homepage getting crawled and cached in the search engines in place of our normal homepage. (This has happened in the past, so the concern is not imaginary.) Among other options, 302 and 307 redirects are being discussed, i.e. redirecting www.domain.com to www.domain.com/specialpage. Having used 302s and 301s in the past, I am well aware of how search engines treat them. A 302 effectively says "Hey, Google! Please get rid of the old content on www.domain.com and replace it with the content on /specialpage!" Which is exactly what we don't want. My question is: do the search engines handle 307s any differently? I am hearing that the 307 does NOT result in the content of the second page being cached with the first URL. But I don't see that in the definition below (from w3.org). Then again, why differentiate it from the 302?

307 Temporary Redirect
The requested resource resides temporarily under a different URI. Since the redirection MAY be altered on occasion, the client SHOULD continue to use the Request-URI for future requests. This response is only cacheable if indicated by a Cache-Control or Expires header field. The temporary URI SHOULD be given by the Location field in the response. Unless the request method was HEAD, the entity of the response SHOULD contain a short hypertext note with a hyperlink to the new URI(s), since many pre-HTTP/1.1 user agents do not understand the 307 status. Therefore, the note SHOULD contain the information necessary for a user to repeat the original request on the new URI. If the 307 status code is received in response to a request other than GET or HEAD, the user agent MUST NOT automatically redirect the request unless it can be confirmed by the user, since this might change the conditions under which the request was issued.
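On the server side, the only mechanical difference between the two options is the status code sent with the redirect; a mod_rewrite sketch for the homepage scenario described above (www.domain.com standing in for the real host, assuming Apache):

```apache
RewriteEngine On

# Temporarily send the homepage to the event page -- the only difference is the status code
RewriteRule ^$ /specialpage [R=302,L]

# ...or, to answer with a 307 instead:
# RewriteRule ^$ /specialpage [R=307,L]
```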
Technical SEO | CarsProduction0 -
Any way around buying hosting for an old domain to 301 redirect to a new domain?
Howdy. I have just read this Q&A thread, so I think I have my answer, but I'm going to ask anyway! Basically, DomainA.com is being retired and DomainB.com is going to be launched, and we're going to have to redirect numerous URLs from DomainA.com to DomainB.com. I think the way to go about this is to continue paying for hosting for DomainA.com, serve a .htaccess from that hosting account, and host DomainB.com separately. Does anybody know of a way to avoid paying for hosting just to serve a .htaccess file on DomainA.com? Thanks!
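For reference, the .htaccess served from DomainA.com's hosting described above would only need a few lines (DomainA/DomainB are the question's own placeholders):

```apache
RewriteEngine On

# Send every request on the old domain to the same path on the new domain
RewriteCond %{HTTP_HOST} ^(www\.)?domaina\.com$ [NC]
RewriteRule ^(.*)$ http://www.domainb.com/$1 [R=301,L]
```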
Technical SEO | SamTurri0 -
Trailing Slashes in URL: Use Canonical URL or 301 Redirect?
I was thinking of using 301 redirects to send URLs with trailing slashes to the versions without them, e.g. www.url.com/page1/ 301 redirects to www.url.com/page1. I've already got a redirect in place for non-www to www. Just wondering, in my case, would it be best to use .htaccess for the trailing slash redirect as well, or just go with canonical URLs?
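If the .htaccess route is kept, a common pattern for stripping the trailing slash looks like this; the directory check stops real folders from being redirected:

```apache
RewriteEngine On

# 301 URLs with a trailing slash to the slash-less version, leaving real directories alone
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```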
Technical SEO | upick-1623910 -
How to safely reduce the number of 301 redirects / should we be adding so many?
Hi All, We lost a lot of good rankings over the weekend with no obvious cause; our top keyword went from p3 to p12, for example. Site speed is pretty bad (slower than 92% of sites!) but it has always been pretty bad. I'm on to the dev team to try and crunch this (beyond image optimisation), but I know that something I can affect myself is the number of 301 redirects we have in place. We have hundreds of 301s because we've been, perhaps incorrectly, adding one every time we find a new crawl error in GWT, when it isn't caused by a broken link on our own site, or when it comes from an external site where we can't track down the webmaster to fix the link. Is this bad practice, and should we just ignore 404s caused by external broken URLs? If we wanted to reduce these numbers, should we start by removing the ones that are only in place because of external broken URLs? Any other tips for safely reducing the number of 301s? Thanks, all! Chris
Technical SEO | BaseKit0