Multiple 301 redirects for an HTTPS URL. Good or bad?
-
I'm working on an ecommerce website that has a few snags and issues with its coding.
They're using HTTPS, and when you access the website through domain.com, there's a 301 redirect to http://www.domain.com, which in turn redirects to https://www.domain.com.
Would this have a detrimental effect, or is that considered the best way to do it - have the website redirect to HTTP, and then redirect all HTTP access to the HTTPS URL?
Thanks
-
My personal rule of thumb - as few redirect jumps as possible. Three main reasons:
1. User journey + browsers - when too many redirects take place, some browsers struggle to follow the chain through and simply won't load the page. And even with only 2-3 redirects the page may load, but users on slower connections may find the wait for content tiresome.
2. As ThompsonPaul highlights, you COULD lose some link value due to dilution through 301 redirects.
3. Multiple 301 redirects are often used by spammers, and I foresee these causing a lot of ranking headaches in the near future. The older the site, the longer the chain can end up - for example, imagine you had a product at:
https://domain.com/product1
Links to that page exist at domain.com/product1. The journey would be: domain.com/product1 > http://domain.com/product1 > https://domain.com/product1
Now imagine a year down the line, product 1 is discontinued and you decide to redirect https://domain.com/product1 to domain.com/product2.
Imagine your journey now:
domain.com/product1 > http://domain.com/product1 > https://domain.com/product1 > domain.com/product2 > http://domain.com/product2 > https://domain.com/product2
This could carry on indefinitely in the lifetime of the site...
Best solution: decide which version of the site you want to use and stick to a single redirect, not a chain. Periodically check for chained redirects and resolve them as you go along - I try to do this biannually; a quick way to automate the check is sketched below.
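A minimal sketch of such a periodic chain check, using Python's requests library. The URLs listed are placeholders - substitute the pages you actually want to audit:

```python
import requests

# Placeholder URLs - swap in your own key pages
urls_to_check = [
    "http://domain.com/product1",
    "http://domain.com/product2",
]

for url in urls_to_check:
    # requests follows redirects by default and records each hop in .history
    response = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.url for r in response.history] + [response.url]
    if len(response.history) > 1:
        print(f"CHAIN ({len(response.history)} redirects): " + " > ".join(hops))
    elif len(response.history) == 1:
        print(f"OK (single redirect): {hops[0]} > {hops[1]}")
    else:
        print(f"No redirect: {url}")
```

Anything flagged as a chain is a candidate for pointing the origin URL straight at its final destination.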
-
To answer your specific question, Jason, yes, there's an issue with those URLs going through two consecutive redirects.
Each redirect, like any link, costs a little bit of "link juice". So running through two consecutive redirects wastes roughly twice as much link juice as having the origin URL redirect immediately to the final URL without the intermediate step. It's not a massive difference, but on an e-commerce site especially, there's no point in wasting any. (Some folks reckon the loss could be as high as 15% per link/redirect.) Plus, I've occasionally seen problems with referrer data being maintained across multiple redirects, though that's anecdotal.
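To put rough numbers on that, here's how a hypothetical flat 15% loss per hop would compound - purely illustrative, since the real damping factor isn't public:

```python
# Assumes a flat 15% value loss per redirect hop - a commonly repeated
# guess, not a confirmed figure from any search engine.
loss_per_hop = 0.15

for hops in range(1, 4):
    retained = (1 - loss_per_hop) ** hops
    print(f"{hops} redirect(s): {retained:.1%} of link value retained")

# Output:
# 1 redirect(s): 85.0% of link value retained
# 2 redirect(s): 72.2% of link value retained
# 3 redirect(s): 61.4% of link value retained
```

Under that assumption, collapsing the two-hop chain (http non-www > http www > https www) into a single redirect would preserve around 85% of the value instead of roughly 72%.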
Hope that answers your specific question?
Paul
-
I agree with Jane. Unless there are reasons why the whole site needs to be secure, it makes more sense for just the areas where sensitive information is being submitted to be SSL encrypted.
http: requests are processed more quickly than https: ones due to the SSL handshake required to produce the cryptographic parameters for the user's session - so your site would be a little quicker if you weren't using SSL.
However, if you do decide to use http: rather than https: for the product and category pages as Jane has suggested, you'd need to ensure that the https: versions of those pages redirect to http: - again, to avoid duplicate content.
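For what it's worth, a crude way to feel out that handshake overhead - a rough sketch using example.com as a placeholder; real numbers vary hugely with network conditions, keep-alive, and TLS session reuse:

```python
import time
import requests

# Crude, unscientific timing comparison - example.com is a placeholder.
# Each requests.get() call opens a fresh connection, so the HTTPS request
# pays the full TLS handshake cost every time.
for scheme in ("http", "https"):
    url = f"{scheme}://example.com/"
    start = time.perf_counter()
    requests.get(url, timeout=10)
    elapsed = time.perf_counter() - start
    print(f"{scheme}: {elapsed * 1000:.0f} ms")
```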
-
Hi Jason,
To add to what Yusuf has said: is there a specific reason why the whole site has to use SSL, rather than just the parts of the website where sensitive information is passed? If not, I would be tempted to recommend that the e-commerce pages (products, categories, etc.) remain on HTTP URLs.
Cheers,
Jane
-
Hi Jason,
It's fine to 301 redirect from http: to https: and it's quite common for sites that use SSL. It's exactly the same principle as redirecting from a non-www to www (e.g. http://example.com to http://www.example.com) - which is considered to be good practice. But there should only be a single redirect. So you should ensure that http://example.com redirects to https://www.example.com without first redirecting to http://www.example.com.
I would also make sure that all pages (not just the homepage) redirect from http: to https:, to ensure there are no duplicate content issues on the rest of the site. A quick way to spot-check the hop count is sketched below.
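A minimal sketch of such a spot-check, using Python's requests library with example.com as a placeholder for your canonical host. It inspects the first hop directly rather than following it, so a chained redirect shows up as a non-canonical Location header:

```python
import requests

# Placeholder canonical URL - replace with your own https://www host
CANONICAL = "https://www.example.com/"

for url in ("http://example.com/",
            "http://www.example.com/",
            "https://example.com/"):
    # Don't follow the redirect - examine the first hop's target directly
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location")
    single_hop = response.status_code == 301 and location == CANONICAL
    print(f"{'OK ' if single_hop else 'FIX'}: {url} -> "
          f"{response.status_code} {location}")
```

If any variant returns a Location other than the canonical URL, that origin is the start of a chain and its rule should point straight at the final destination.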