What's the best way to noindex pages but still keep backlink equity?
-
Hello everyone,
Maybe it is a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep the backlink equity from those noindexed pages?
For example, let's say I have many pages that look similar to a "main" page, which is the only one I want to appear on Google, so I want to noindex all pages except that "main" page... but what if I also want to transfer any possible link equity on the noindexed pages to the main page?
The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or will it wreak havoc in some way?
-
Thank you, Chris, for your in-depth answer; you just confirmed what I suspected.
To clarify, though: what I am trying to save by noindexing those subsequent pages is "indexing budget", not "crawl budget". You know the famous "indexing cap"? I am also trying to tackle possible "duplicate" or "thin" content issues with such "similar but different" pages... The fact is, our website has been hit by Panda several times, and we have recovered several times as well, but we were hit again by the latest quality update last June, and we are trying to find a way to get out of it once and for all. Hence my attempt to reduce the number of similar indexed pages as much as we can.
I have just opened a discussion on this "Panda-non-sense" issue, and I'd like to know your opinion about it:
https://a-moz.groupbuyseo.org/community/q/panda-rankings-and-other-non-sense-issues
Thank you again.
-
Hi Fabrizo,
That's a tricky one given the sheer volume of pages/music on the site. Typically the cleanest way to handle all of this is to offer up a View All page and canonical back to that, but in your case a View All page would scroll on forever!
Canonical is not the answer here. It's made for handling duplicate pages like this:
www.website.com/product1.html
www.website.com/product1.html&sid=12432
In this instance, both pages are 100% identical, so the canonical tag tells Google that any variation of product1.html is actually just that page and should be counted as such. What you've got here is pagination, so while the pages are mostly the same, they're not identical.
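(For reference, in that duplicate case the canonical on the parameterised version would look something like the snippet below; just a sketch using the placeholder URLs above, not anything specific to your site.)
    <!-- In the <head> of www.website.com/product1.html&sid=12432 -->
    <!-- Tells Google to treat any variation as the clean URL and consolidate signals there -->
    <link rel="canonical" href="http://www.website.com/product1.html" />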
Instead, this is exactly what rel=prev/next is for, which you've already looked into. It's very hard to find recent information on this topic, but the traditional advice from Google has been to implement prev/next, and they will infer the most important page (typically page one) from the fact that it's the only page that has a rel=next but no rel=prev (because there is no previous page). Apologies if you already knew all of this; just making sure I didn't skim over anything here. Google also says these pages will essentially be seen as a single unit from that point, so all link equity will be consolidated toward that block of pages.
Canonical and rel=next/prev do act separately, so by all means, if you have search filters or anything else that may alter the URL, a canonical tag can be used as well, but each page here would just point back to itself, not back to page 1.
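As a rough sketch of what that would look like on your page 2 (assuming page 1 is the bare Piano.html URL), the head would carry something like:
    <!-- On http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=2 -->
    <link rel="prev" href="http://www.virtualsheetmusic.com/downloads/Indici/Piano.html" />
    <link rel="next" href="http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=3" />
    <!-- The canonical, if you use one, points at the page itself, not at page 1 -->
    <link rel="canonical" href="http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=2" />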
This clip from Google's Maile Ohye is quite old, but the advice in it clears a few things up and is still very relevant today.
With that said, the other point you raised is very valid: what to do about crawl budget. Google also suggests just leaving them as-is, since you're only linking to the first 5 pages and any links beyond that are buried so deep in the hierarchy that they're seen as a low priority and will barely be looked at.
My understanding (though I'm a little hesitant on this one) is that noindexed pages do retain their link equity. Noindex doesn't say "don't crawl me" (which also means it won't help your crawl budget; that would have to be done through robots.txt), it says "don't include me in your index". By that logic, it would make sense that links pointing to a noindexed page would still be counted.
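To put that distinction in markup terms (again, a rough sketch with a placeholder path): the meta tag below only controls indexing, while keeping Googlebot from crawling at all would instead be a Disallow rule in robots.txt:
    <!-- In the <head>: keeps the page out of the index, but it can still be crawled and its links followed -->
    <meta name="robots" content="noindex, follow" />
    <!-- By contrast, blocking the crawl itself would be a robots.txt rule along these lines (placeholder path): -->
    <!--   User-agent: *        -->
    <!--   Disallow: /example/  -->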
-
You are right, it's hard to give advice without the specific context.
Well, here is the problem I am facing: we have an e-commerce website and each category has several hundred if not thousands of pages... now, I want just the first page of each category to appear in the index, in order not to waste the index cap and to avoid possible duplicate issues. Therefore I want to noindex all subsequent pages and index just the first page (which is also the richest).
Here is an example from our website, our piano sheet music category page:
http://www.virtualsheetmusic.com/downloads/Indici/Piano.html
I want that first page to be in the index, but not the subsequent ones:
http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=2
http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=3
etc...
After playing with canonicals and rel=prev/next, I have realized that Google still keeps those not-so-useful pages in the index, whereas removing them could help with both index cap issues and possible Panda penalties (too many similar and not useful pages). But is there any way to keep any possible link equity of those subsequent pages by noindexing them? Or is the link equity preserved anyway, on those pages and on the overall domain as well? And, better, is there a way to move all that possible link equity to the first page in some way?
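To make that concrete, here is roughly the combination I have in mind for page 2 onward; just a sketch of the idea, and the canonical part is exactly what I am unsure about:
    <!-- In the <head> of http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=2 (and later pages) -->
    <meta name="robots" content="noindex, follow" />
    <!-- Would adding this canonical to the first page pass along any equity, or just send mixed signals? -->
    <link rel="canonical" href="http://www.virtualsheetmusic.com/downloads/Indici/Piano.html" />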
I hope this makes sense. Thank you for your help!
-
Apologies for the indirect answer, but I would have to ask "why?"
If these pages are almost identical and you only want one of them to be indexed, in most situations the users would probably benefit from there only being that one main page. Cutting down on redundant pages is great for UX, crawl budget and general site quality.
Maybe there is a genuine reason for it, but without knowing the context it's hard to give accurate advice on the best way to handle it.
