Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Do backlinks need to be clicked to pass linkjuice?
-
Hi all:
Do backlinks need to be clicked to pass link juice? If so, can someone explain how much traffic is needed from a backlink for it to count as link juice?
Thanks for the help.
Audrey.
-
Backlinks do not have to be clicked in order to pass link juice. Recently my org (missionquest.org) joined Moz, and it has helped our backlinks and improved our SEO.
-
I would be surprised.
Google knows a lot, but not everything. Unless the GA tracking code is installed, Google has no way of knowing about things such as a user click.
If they were passing link juice only for clicked backlinks, they would be ruling out too big a chunk of the web. It doesn't sound logical to me.
It also doesn't sound realistic to analyze every user click in the world when refreshing the Google index; they have a lot of metal, but not that much.
-
So, are you saying that a link having traffic kind of disqualifies it as spammy? Or at least in the eyes of Google?
-
Absolutely not. Spam links still work fantastic for ranking a site (temporarily). Those are links that never get seen or clicked, they pretty much just get crawled. Don't go the spam route, but also don't worry too much about people clicking links. I've gotten a ton of great links that have sent very, very little referral traffic, meaning links on popular posts still don't guarantee getting any/many clicks.
-
I don't think so. I usually fetch and render then submit my pages anytime I add one to my site, or make a significant change, like adding content or changing images. Nothing unnatural about it.
-
Good idea. I wonder if it would seem "un-natural" however?
-
Submitting the page to Google for Indexing doesn't guarantee that the backlinks will be crawled, but it can be a good way to try to force them to be crawled.
-
In that case, wouldn't it be ideal to submit the page to google indexing right after it's published?
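For what it's worth, one lightweight way to prompt a recrawl right after publishing used to be Google's sitemap "ping" endpoint (since deprecated); here's a minimal sketch of building that ping URL, with the sitemap address as a placeholder:

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url):
    """Build the (now-deprecated) Google sitemap ping URL.

    Fetching this URL with an HTTP GET historically nudged Google to
    recrawl the sitemap soon after a page was published or updated.
    """
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("https://example.com/sitemap.xml"))
# → https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml
```

These days the supported routes are a sitemap submitted in Search Console or the URL Inspection tool's "Request Indexing" button.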
-
I think it's about page popularity and user engagement. Popularity in search results means a lot of spiders on the page. And when a user clicks the link, a spider follows them to the new page. So it all comes down to the spider discovering your page, and your link as well (as I understand it).
-
In fact, it's not like that.
I will tell you a very important rule about backlinks, one that's really hard to find: the main point is that the link needs to be discovered by Google. And the page which contains the link must have popularity in Google search results, meaning a lot of people entering the page through search results. This is what we call "the quality of the link".
Keep up with your link building journey.
-
The way that I understand it is that the click helps the link to be found faster than if it had not been clicked. It might have equity and pass link juice prior, but before Google finds it, it might not be counted as a link to your site. Does that make sense? The link needs to be discovered before the link juice is actually counted. At least that is the way that I understand it.
I do know a few professionals who believe that if a link isn't clicked, link juice is never passed. I don't know if that is necessarily true. It makes sense that a link could be discovered but not have any equity because it isn't being used. I wonder if someone has a better idea of whether or not that is true, or if it's another secret Google keeps.
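To make the "discovery" point above concrete: a crawler finds links by parsing the HTML of a page it has fetched, with no knowledge of whether any user ever clicked them. This is a minimal illustration of that parsing step, not Googlebot's actual implementation:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets the way a crawler discovers links:
    by parsing markup, not by observing user clicks."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = '<p>See <a href="https://example.com/a">A</a> and <a href="https://example.com/b">B</a>.</p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)
# → ['https://example.com/a', 'https://example.com/b']
```

Both links end up in the crawl queue the moment the page is parsed, which is why a link can pass equity without ever receiving a click.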

Related Questions
-
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem: A new multilingual site was launched about 2 months ago. It has correct hreflang tags and geotargeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that the pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages from our site seem to have the same problem. Do you think they inherited it due to the redirects? Is it possible that Google will sort things out over some time, given the fact that the new pages have correct hreflangs? Is there stuff we could do to help ranking in the correct country markets?
Intermediate & Advanced SEO | ParisChildress
-
Thought FRED penalty - Now see new spammy image backlinks what to do?
Hi, So starting about March 9 I started seeing huge losses in ranking for a client. These rankings have continued to drop every week since, and we changed nothing on the site. At first I thought it must be the FRED update, so we have started rewriting and adding product descriptions to our pages (which is a good thing regardless). I also checked our backlink profile using OSE on Moz and still saw the few linking root domains we had. Another odd thing is that Webmaster Tools showed many more domains. So today I bought a subscription to Ahrefs and instantly saw that over the same timeline (starting March 1, 2017) until now, we have literally doubled our inbound links from very spammy-type sites. But the incoming links are not to content; people seem to be ripping off our images. So my question is: do spammy inbound image links count against us the same as if someone linked to actual written content or non-image URLs? Is FRED something I should still be looking into? Should I disavow a list of inbound image links? Thanks in advance!
Intermediate & Advanced SEO | plahpoy
-
Lazy Loading of products on an E-Commerce Website - Options Needed
Hi Moz Fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on the page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is with serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-indexed image in our case). It seems complicated but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:

http://www.idea-r.it/...#!blogimage=<image-number>

When the crawler finds this markup, it will change it to:

http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>

Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will be good.

```csharp
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        ...
    }
}
```

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side). To make it perfect we have to give the user a chance to bookmark the current gallery image. 90% comes for free; we only have to parse the fragment on the client side and show the requested image:

```javascript
if (window.location.hash)
{
    // NOTE: remove initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    ...
}
```

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we are removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂
Intermediate & Advanced SEO | JBGlobalSEO
-
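The #! to _escaped_fragment_ rewrite described above can be sketched as a small helper. This is a simplified illustration of the (long-deprecated) AJAX crawling scheme, not a full implementation of its percent-encoding rules:

```python
def escaped_fragment_url(url):
    """Rewrite a #! (hashbang) URL into the "ugly" _escaped_fragment_ form
    that crawlers requested under Google's deprecated AJAX crawling scheme.

    Simplified: the real scheme also percent-encodes special characters
    inside the fragment before appending it.
    """
    if "#!" not in url:
        return url  # no hashbang: nothing to rewrite
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    return f"{base}{separator}_escaped_fragment_={fragment}"

print(escaped_fragment_url("http://www.idea-r.it/gallery#!blogimage=3"))
# → http://www.idea-r.it/gallery?_escaped_fragment_=blogimage=3
```

Note that this whole scheme was deprecated by Google in favor of rendering JavaScript directly, so today lazy-loaded content is usually handled with standard crawlable markup instead.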
301 Redirect Showing Up as Thousands Of Backlinks?
Hi Everyone, I'm currently doing quite a large back link audit on my company's website and there's one thing that's bugging me. Our website used to be split into two domains for separate areas of the business but since we have merged them together into one domain and have 301 redirected the old domain the the main one. But now, both GWT and Majestic are telling me that I've got 12,000 backlinks from that domain? This domain didn't even have 12,000 pages when it was live and I only did specific 301 redirects (ie. for specific URL's and not an overall domain level 301 redirect) for about 50 of the URL's with all the rest being redirected to the homepage. Therefore I'm quite confused about why its showing up as so many backlinks - Old redirects I've done don't usually show as a backlink at all. UPDATE: I've got some more info on the specific back links. But now my question is - is having this many backlinks/redirects from a single domain going to be viewed negatively in Google's eyes? I'm currently doing a reconsideration request and would look to try and fix this issue if having so many backlinks from a single domain would be against Google's guidelines. Does anybody have any ideas? Probably somthing very obvious. Thanks! Sam
Intermediate & Advanced SEO | Sandicliffe
-
Are backlinks the most important factor in SEO?
I have had an agency state that "Backlinks are the most important factor in SEO". That is how they are justifying their strategy of approaching bloggers. I believe there are a lot more factors than that, including target market definition, keyword identification, and building content based on these factors. What's everyone's thoughts?
Intermediate & Advanced SEO | AndySalmons
-
Does a 302 redirect pass penalties?
I'm having problems finding a definitive answer to this question, there is a lot of rumour and gossip out there but nothing I can rely on. I'm working with a site that received an unnatural links notice followed by a massive drop in search traffic. Looking at the link profile it's pretty much jacked beyond repair and I have recommended that we move over to a fresh domain. However, it's an established brand with many more sources of traffic than organic search. There's no way we can burn all their repeat visits, loyal customers, brand recognition that they've built up over the years so I want to redirect from the old domain to the new. This is not to try and make any SEO gain from the previous site, frankly we don't give a crap about that. We just want to maintain the brand. A 302 is a temporary redirect, this will be a permanent move BUT a 301 will pass on the penalty. So can we safely use a 302 redirect in this situation or is there a better alternative (meta refresh?) Thanks for your help! MB.
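Mechanically, the only difference between the two redirects is the HTTP status code the server sends; how each is treated for penalties and equity is Google's interpretation on top of that. A self-contained sketch (the paths and target URL are hypothetical) that serves both codes and reads back the status:

```python
import http.client
import http.server
import threading

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Serve a 301 on /moved-permanently and a 302 on anything else."""

    def do_GET(self):
        self.send_response(301 if self.path == "/moved-permanently" else 302)
        self.send_header("Location", "https://example.com/new-home")
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

def redirect_status(path):
    """Start a throwaway local server, request `path`, return the status code."""
    server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
    conn.request("GET", path)  # http.client does not follow redirects
    status = conn.getresponse().status
    conn.close()
    server.shutdown()
    return status

print(redirect_status("/moved-permanently"))   # → 301
print(redirect_status("/moved-temporarily"))   # → 302
```

A meta refresh, by contrast, is not a status code at all but an instruction in the page body, which is part of why its handling by search engines is even less predictable.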
Intermediate & Advanced SEO | MattBarker
-
Do 404 Pages from Broken Links Still Pass Link Equity?
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this. When inbound links point to a page that no longer exists, thus producing a 404 Error Page, is link equity/domain authority lost? We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the pages with little traffic...I'm concerned about overall domain authority of the site since that certainly plays a role in how the site ranks overall in Google (especially pages with no links pointing to them...perfect example is Amazon...thousands of pages with no external links that rank #1 in Google for their product name). Anyone have a clear answer? Thanks!
Intermediate & Advanced SEO | M_D_Golden_Peak
-
Does 302 pass link juice?
Hi! We have our content under two subdomains, one for the English language and one for Spanish. Depending on the language of the browser, there's a 302 redirecting to one of these subdomains. However, our main domain (which has no content) is receiving a lot of links - people would rather link to mydomain.com than to en.mydomain.com. Does the 302 pass any link juice? If so, to which subdomain? Thank you!
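The browser-language detection described here typically works by parsing the request's Accept-Language header. A minimal sketch (the subdomain names are placeholders modelled on the question, not real configuration):

```python
def pick_subdomain(accept_language, default="en"):
    """Pick a language subdomain from an Accept-Language header.

    Walks the header's comma-separated entries in order and returns the
    first language we host; quality values (;q=) are ignored for brevity.
    """
    supported = {"en": "en.mydomain.com", "es": "es.mydomain.com"}
    for entry in accept_language.split(","):
        lang = entry.split(";")[0].strip().split("-")[0].lower()
        if lang in supported:
            return supported[lang]
    return supported[default]

print(pick_subdomain("es-ES,es;q=0.9,en;q=0.8"))  # → es.mydomain.com
print(pick_subdomain("fr-FR"))                     # → en.mydomain.com (fallback)
```

If the redirect target varies per visitor like this, the response should also send `Vary: Accept-Language` so caches and crawlers know the answer depends on the header.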
Intermediate & Advanced SEO | bodaclick