Do 404 Pages from Broken Links Still Pass Link Equity?
-
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this.
When inbound links point to a page that no longer exists, so that the server returns a 404 error page, is link equity/domain authority lost?
We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of those low-traffic pages themselves; I'm concerned about the overall domain authority of the site, since that certainly plays a role in how the site ranks overall in Google, especially for pages with no links pointing to them. Amazon is a perfect example: thousands of pages with no external links that rank #1 in Google for their product name.
Anyone have a clear answer? Thanks!
-
First off, thanks everyone for your replies

I'm well versed in the best practices for 301 redirects, sitemaps, and so on. In other words, I fully know the optimal way to handle this. But this is one of those situations where there are so many redirects involved (thousands) for a large site that I want to make sure what we are doing is fully worth the development time.
We are migrating a large website that was already migrated to a different CMS several years ago. There are thousands of legacy 301 redirects already in place for the current site, and many of the pages being redirected to (from the old URL versions) receive very little, if any, traffic. We need to decide if the work of carrying those redirects forward is worth it.
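For what it's worth, a quick audit can take some of the guesswork out of that decision. Below is a minimal sketch (not anything prescribed in this thread) assuming you can export the legacy URLs to a plain-text file named legacy_urls.txt and that the third-party Python requests library is installed; it reports each URL's final status code and how many redirect hops it takes to get there, so you can see which URLs would dead-end in a 404 and which ones chain through double redirects.

```python
# Rough sketch: audit a list of legacy URLs for 404s and redirect chains.
# Assumes legacy_urls.txt (hypothetical name) holds one absolute URL per line
# and that the "requests" library is installed (pip install requests).
import requests

def audit(urls):
    for url in urls:
        try:
            # Follow redirects so we can see where each legacy URL ends up.
            resp = requests.get(url, allow_redirects=True, timeout=10)
        except requests.RequestException as exc:
            print(f"ERROR  {url}  ({exc})")
            continue
        hops = len(resp.history)      # how many redirects were followed
        status = resp.status_code     # status of the final response
        flag = "<- broken" if status == 404 else ("<- redirect chain" if hops > 1 else "")
        print(f"{status}  hops={hops}  {url} -> {resp.url}  {flag}")

if __name__ == "__main__":
    with open("legacy_urls.txt") as fh:
        audit(line.strip() for line in fh if line.strip())
```

URLs that come back 404 are the ones where any remaining inbound link equity is at stake; URLs that take more than one hop are candidates for flattening into a single 301.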
I'm not as worried about broken links for pages that don't get any traffic (although we ideally want 0 broken links). What I am most worried about, however, is losing domain authority and the whole site potentially ranking a little bit lower overall as a result.
Nakul's response (and Frederico's) is closest to what I am asking, but everyone is suggesting the same thing: that we will lose domain authority (as measured, for example, by SEOmoz's Open Site Explorer Domain Authority score) if we don't keep those redirects in place (while, of course, avoiding double redirects).
So, thanks again to everyone on this thread.
If anyone has a differing opinion, I'd love to hear it, but this is pretty much what I expected: everyone's best educated assessment is that you will lose domain authority when 301 redirects are lifted and broken links are the end result. -
Great question, Dan. @Jesse, you are on the right track. I think the question was misunderstood.
The question is: if seomoz.org links to Amazon.com/nakulgoyal and that page does not exist, is there link juice flow? Think about it. It's like thinking about a citation. If seomoz.org mentions amazon.com/nakulgoyal but does not actually have the hyperlink, is there citation flow?
So my question to the folks is: is there citation flow? In my opinion, the answer is yes. There's some DA that will get passed along. Eventually, the site owner might identify the 404 (which they should) and set up a 301 redirect from Amazon.com/nakulgoyal to whatever page makes the most sense for the user, in which case there will be proper link juice flow.
So to clarify what I said:
-
Scenario 1:
SiteA.com links to SiteB.com/urldoesnotexist - There is some (maybe close to negligible) domain authority flow from SiteA.com to SiteB.com (sort of like a link citation). There may not be proper link juice flow, because the link is broken. -
Scenario 2:
SiteA.com links to SiteB.com/urldoesnotexist and this URL is 301 redirected to SiteB.com/urlexists - In this case, there is both an authority flow and a link juice flow from SiteA.com to SiteB.com/urlexists.
**That's my opinion. Think about it: the 301 redirect from /urldoesnotexist to /urlexists might get added a year from now and might be mistakenly removed at some point temporarily. There's going to be an effect in both cases. So in my opinion, the crux is: watch your 404s and redirect them when it makes sense for the user. That way you have a good user experience and the link juice flows where it should.**
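Building on that advice, once you've hand-picked the destination that makes sense for each broken URL, generating the actual rules is easy to script. This is only a hedged sketch: the redirect_map below is made-up example data, and it emits Apache mod_alias "Redirect 301" directives; swap in whatever format your server or CMS actually uses.

```python
# Hedged sketch: turn a hand-curated mapping of broken paths to their best
# replacement pages into Apache mod_alias "Redirect 301" directives.
# The paths below are illustrative placeholders only.
redirect_map = {
    "/old-category/widget-a": "/widgets/widget-a",
    "/old-category/widget-b": "/widgets/widget-b",
    "/summer-sale-2012": "/sale",
}

def to_apache_rules(mapping):
    # One directive per line, pointing each old path straight at its final
    # destination so no double redirects (chains) are created.
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

if __name__ == "__main__":
    print(to_apache_rules(redirect_map))
```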
-
Ideally you want to keep the number of 404 pages low, because a 404 tells the search engine that the page is a dead end. Ask any SEO: it's best to keep the number of 404s as low as possible.
Link equity tells Google why to rank a page or give the root domain more authority. However, Google does not want users to end up on dead pages, so broken links will not help the site; rather, they'll hurt it. My recommendation is to create a sitemap of the pages you want the spiders to index and submit it to Google Webmaster Tools.
Limit the 404s as much as possible and, where possible, 301 them to a relevant page (from a user's perspective).
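As a rough illustration of that recommendation, here is a minimal sketch of building a sitemap from a list of the pages you do want indexed. It assumes a file named live_urls.txt (a made-up name) with one canonical URL per line and uses only Python's standard library; the output file is what you would then submit in Webmaster Tools.

```python
# Minimal sketch: build sitemap.xml from a plain list of canonical URLs.
# live_urls.txt is a hypothetical input file, one absolute URL per line.
import xml.etree.ElementTree as ET

def build_sitemap(urls, out_path="sitemap.xml"):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    with open("live_urls.txt") as fh:
        build_sitemap(line.strip() for line in fh if line.strip())
```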
-
I think, and correct me if I'm wrong, Dan, you guys are misunderstanding the question.
He means that if you do actually create a 404 page for all your broken links to land on, will the juice pass from there to your domain (housing the 404 page) and on to whatever internal links you've built into said 404 page.
The answer, I think, is no. The reason is that 404 is a status code returned before the 404 page is even produced. Link juice can pass through either working links (200) or redirects (301).
Again... I THINK.
Was this more what you were asking?
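If it helps to see the 200 / 301 / 404 distinction concretely, a quick check like the sketch below shows the raw status code a server returns before any redirect is followed. It assumes the Python requests library; the URLs are placeholders, not real pages.

```python
# Sketch: inspect the raw status code a server returns for each URL,
# without following redirects. The URLs below are placeholders.
import requests

for url in ("https://www.example.com/exists",
            "https://www.example.com/moved",
            "https://www.example.com/gone"):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    # 200 = live page, 301/302 = redirect (Location header shows the target),
    # 404 = broken link / dead end.
    print(resp.status_code, url, resp.headers.get("Location", ""))
```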
-
Equity is passed to a 404 page, which does not exist; therefore, that equity is lost.
-
Thanks, Bryan. This doesn't really answer the exact question, though: is link equity still passed (and domain authority preserved) by broken links producing 404 Error Pages?
-
No, they don't. Search engine spiders follow the link just as a user would. If the page no longer exists and you cannot forward the user to a better page, then create a good 404 page that will keep users intrigued.
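One caveat worth adding: a "good 404 page" should still return an actual 404 status code; a helpful-looking page served with a 200 gets treated as a soft 404. Here is a minimal sketch of that idea, using Flask purely for illustration (the routes and copy are made up).

```python
# Minimal sketch: serve a helpful custom 404 page that still returns
# a real 404 status code. Flask is used purely for illustration.
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def not_found(error):
    body = (
        "<h1>Sorry, that page has moved or no longer exists.</h1>"
        "<p>Try the <a href='/'>home page</a> or browse our "
        "<a href='/categories'>categories</a>.</p>"
    )
    # Returning the tuple (body, 404) keeps the status code honest.
    return body, 404

if __name__ == "__main__":
    app.run()
```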