Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Although we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Do 404 Pages from Broken Links Still Pass Link Equity?
-
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this.
When inbound links point to a page that no longer exists, thus producing a 404 Error Page, is link equity/domain authority lost?
We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the individual low-traffic pages; I'm concerned about the overall domain authority of the site, since that certainly plays a role in how the site ranks overall in Google, especially for pages with no links pointing to them. Amazon is a perfect example: thousands of pages with no external links that rank #1 in Google for their product name.
Anyone have a clear answer? Thanks!
-
First off, thanks everyone for your replies

I'm well versed in the best practices of 301 redirects, sitemaps, and so on; in other words, I fully know the optimal way to handle this. But this is one of those situations where so many redirects are involved (thousands) for a large site that I want to make sure what we are doing is worth the development time.
We are migrating a large website that was already migrated to a different CMS several years ago. There are thousands of legacy 301 redirects already in place for the current site, and many of the pages being REDIRECTED TO (from the old URL versions) receive very little, if any, traffic. We need to decide whether the work of redirecting them is worth it.
I'm not as worried about broken links for pages that don't get any traffic (although we ideally want 0 broken links). What I am most worried about, however, is losing domain authority and the whole site potentially ranking a little bit lower overall as a result.
Nakul's and Frederico's responses are closest to what I am asking, but everyone is suggesting the same thing: that we will lose domain authority (example measurement: SEOmoz's Open Site Explorer Domain Authority score) if we don't keep those redirects in place (while, of course, avoiding double redirects).
So, thanks again to everyone on this thread
If anyone has a differing opinion, I'd love to hear it, but this is pretty much what I expected: everyone's best educated assessment is that you will lose domain authority when 301 redirects are lifted and broken links are the result.
-
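As a practical aside on the double-redirect point raised above: before deciding which legacy redirects to keep, it helps to flatten any chains so each old URL points directly at its final destination. A minimal sketch in Python; the URLs and the `flatten_redirects` helper are invented for illustration, not part of any particular CMS:

```python
# Hypothetical sketch: given a redirect map (old URL -> new URL), collapse
# chains ("double redirects") so every entry points at its final target.
# The example URLs below are made up.

def flatten_redirects(redirect_map):
    """Return a copy of redirect_map with every chain collapsed."""
    flattened = {}
    for src in redirect_map:
        seen = {src}
        dest = redirect_map[src]
        # Follow the chain until we reach a URL that is not itself redirected.
        while dest in redirect_map:
            if dest in seen:  # guard against redirect loops
                raise ValueError(f"redirect loop involving {dest}")
            seen.add(dest)
            dest = redirect_map[dest]
        flattened[src] = dest
    return flattened

redirects = {
    "/old-category": "/interim-category",  # legacy redirect, first migration
    "/interim-category": "/new-category",  # added in the second migration
    "/old-product": "/new-product",
}

print(flatten_redirects(redirects))
# -> {'/old-category': '/new-category', '/interim-category': '/new-category',
#     '/old-product': '/new-product'}
```

This way a user (or crawler) hitting `/old-category` gets one 301 hop instead of two.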
Great question, Dan. @Jesse, you are on the right track; I think the question was misunderstood.
The question is: if seomoz.org links to Amazon.com/nakulgoyal and that page does not exist, is there link juice flow? Think of it like a citation. If seomoz.org mentions amazon.com/nakulgoyal but does not actually have the hyperlink, is there citation flow?
So my question to the folks is: is there citation flow? In my opinion, the answer is yes. Some DA will get passed along. Eventually, the site owner should identify the 404 and set up a 301 redirect from Amazon.com/nakulgoyal to whichever page makes the most sense for the user, at which point there will be proper link juice flow.
So to clarify what I said:
-
Scenario 1:
SiteA.com links to SiteB.com/urldoesnotexist: there is some (maybe close to negligible) domain authority flow from SiteA.com to SiteB.com, sort of like a link citation. There may not be proper link juice flow, because the link is broken.
-
Scenario 2:
SiteA.com links to SiteB.com/urldoesnotexist, and this URL is 301 redirected to SiteB.com/urlexists: in this case, there is both an authority flow and a link juice flow from SiteA.com to SiteB.com/urlexists.
That's my opinion. Consider that the 301 redirect from /urldoesnotexist to /urlexists might get added a year from now and might be mistakenly removed at some point; there will be an effect in both cases. So in my opinion, the crux is: watch your 404s and redirect them when it makes sense for the user. That way you have a good user experience and the link juice can flow where it should.
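To make the "watch your 404s and redirect them" advice concrete, here is a small decision helper. The policy simply restates the recommendations in this thread; the `redirect_advice` function and its wording are my own sketch, not any official guidance:

```python
# Hypothetical helper: given the HTTP status a legacy URL returns, suggest
# an action. The policy mirrors the advice in this thread and is a sketch;
# the exact wording and thresholds are assumptions.

def redirect_advice(status_code, has_inbound_links=False):
    if status_code == 200:
        return "OK: page resolves, nothing to do"
    if status_code == 301:
        return "OK: permanent redirect in place, check it is not chained"
    if status_code in (302, 307):
        return "Change to 301: temporary redirects are not treated as permanent moves"
    if status_code in (404, 410):
        if has_inbound_links:
            return "Add a 301 to the most relevant live page to preserve equity"
        return "Leave as 404/410, or redirect if it helps users"
    return f"Investigate: unexpected status {status_code}"

print(redirect_advice(404, has_inbound_links=True))
# -> Add a 301 to the most relevant live page to preserve equity
```

You could feed this from a crawl export (status code per legacy URL, inbound-link flag from your link index) to triage thousands of URLs quickly.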
-
-
Ideally you want to keep the number of 404 pages low, because a 404 tells the search engine that the page is a dead end. Ask any SEO: it's best to keep the number of 404s as low as possible.
Link equity gives Google reasons to rank a page and to give the root domain more authority. However, Google does not want users to end up on dead pages, so a pile of 404s will not help the site; rather, it will hurt it. My recommendation is to create a sitemap containing the pages you want the spiders to index and submit it to Google WMT.
Limit the 404s as much as possible, and try to 301 them to a relevant page (from a user's perspective) where possible.
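For anyone wondering what "301 them to a relevant page" looks like in practice: on an Apache server this can be done per URL with mod_alias, or pattern-based with mod_rewrite. The paths below are placeholders, not from the site under discussion:

```apache
# 301 each retired URL to the closest relevant live page (example paths)
Redirect 301 /old-widgets /widgets
Redirect 301 /old-widgets/blue-widget /widgets/blue-widget

# Or pattern-based, with mod_rewrite enabled:
RewriteEngine On
RewriteRule ^old-widgets/(.*)$ /widgets/$1 [R=301,L]
```

For thousands of redirects, a RewriteMap or a redirect table in the CMS usually scales better than a long list of individual `Redirect` lines.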
-
I think, and correct me if I'm wrong Dan, you guys are misunderstanding the question.
He means: if you do actually create a 404 page for all your broken links to land on, will the juice pass from there to your domain (housing the 404 page) and on to whatever internal links you've built into said 404 page?
The answer, I think, is no. The reason is that 404 is a status code returned before the 404 page itself is served; link juice can pass through live links (200) or redirects (301), not through a 404 response.
Again... I THINK.
Was this more what you were asking?
-
Equity is passed toward a 404 page, but since that page does not exist, the equity is lost.
-
Thanks, Bryan. This doesn't really answer the exact question, though: is link equity still passed (and domain authority preserved) by broken links producing 404 Error Pages?
-
No, they don't. Search engine spiders follow the link just as a user would. If the pages no longer exist and you cannot forward the user to a better page, then create a good 404 page that keeps users engaged.
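One detail worth underlining in the "good 404 page" advice: the helpful page and the 404 status code are independent, and you want both. A minimal sketch using Python's stdlib WSGI interface; the HTML and app name are illustrative only:

```python
# Minimal sketch of a "good" custom 404: the status code stays 404 (so
# crawlers see a dead end), while the body is a helpful page for humans.
from wsgiref.simple_server import make_server

def not_found_app(environ, start_response):
    body = (b"<h1>Page not found</h1>"
            b"<p>Try our <a href='/'>home page</a> or "
            b"<a href='/search'>search</a>.</p>")
    # Crucially, we still return 404. A "soft 404" that returns 200 with
    # a friendly body would mislead crawlers into indexing a dead end.
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [body]

# To serve it locally for testing, something like:
# make_server("localhost", 8000, not_found_app).serve_forever()
```

The same principle applies whatever the stack: serve helpful navigation on the error page, but never silently swap the 404 status for a 200.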