    The Moz Q&A Forum


    Moz Q&A is closed.

    After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.

    What's the best way to noindex pages but still keep backlinks equity?

    Intermediate & Advanced SEO
    • fablau

      Hello everyone,

      Maybe it is a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep the backlink equity from those noindexed pages?

      For example, let's say I have many pages that look similar to a "main" page, which is the only one I want to appear on Google. I want to noindex all of those pages except that "main" page... but what if I also want to transfer any link equity present on the noindexed pages to the main page?

      The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or will it wreak havoc in some way?
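      For concreteness, the combination being proposed would look like this in the `<head>` of each secondary page (the URLs are placeholders, not from the actual site):

```html
<!-- Head of a secondary page, e.g. https://www.example.com/page-variant -->
<!-- Ask search engines to drop this page from the index,
     while still following the links on it -->
<meta name="robots" content="noindex, follow">
<!-- ...while simultaneously claiming the "main" page is the canonical version -->
<link rel="canonical" href="https://www.example.com/main-page">
```

      Worth keeping in mind: these two tags send somewhat conflicting signals (noindex says "drop this page entirely", canonical says "this is a duplicate of a page you should index"), so search engines may honor one and ignore the other.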

      • fablau (in reply to ChrisAshton)

        Thank you Chris for your in-depth answer, you just confirmed what I suspected.

        To clarify, though: what I am trying to save here by noindexing those subsequent pages is "indexing budget", not "crawl budget" - you know, the famous "indexing cap"? I am also tackling possible "duplicate" or "thin" content issues with such "similar but different" pages. The fact is, our website has been hit by Panda several times, and we recovered several times as well, but we have been hit again by the latest quality update last June, and we are trying to find a way to get out of it once and for all. Hence my attempt to reduce the number of similar indexed pages as much as we can.

        I have just opened a discussion on this "Panda-non-sense" issue, and I'd like to know your opinion about it:

        https://a-moz.groupbuyseo.org/community/q/panda-rankings-and-other-non-sense-issues

        Thank you again.

        • ChrisAshton (in reply to fablau)

          Hi Fabrizio,

          That's a tricky one, given the sheer volume of pages/music on the site. Typically the cleanest way to handle all of this is to offer a View All page and canonical back to that, but in your case a View All page would scroll on forever!

          Canonical is not the answer here. It's made for handling duplicate pages like this:

          www.website.com/product1.html
          www.website.com/product1.html?sid=12432

          In this instance, both pages are 100% identical so the canonical tag tells Google that any variation of product1.html is actually just that page and should be counted as such. What you've got here is pagination so while the pages are mostly the same, they're not identical.

          Instead, this is exactly what rel=prev/next is for which you've already looked into. It's very hard to find recent information on this topic but the traditional advice from Google has been to implement prev/next and they will infer the most important page (typically page one) from the fact that it's the only page that has a rel=next but no rel=prev (because there is no previous page). Apologies if you already knew all of this; just making sure I didn't skim over anything here. Google also says these pages will essentially be seen as a single unit from that point and so all link equity will be consolidated toward that block of pages.

          Canonical and rel=next/prev act separately, so by all means, if you have search filters or anything else that may alter the URL, a canonical tag can be used as well, but each page here would just point back to itself, not back to page 1.
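          As a sketch of the markup being described, page 2 of a paginated series (hypothetical URLs) would carry both pagination pointers plus a self-referencing canonical:

```html
<!-- Head of https://www.example.com/category?cp=2 (hypothetical URL) -->
<link rel="prev" href="https://www.example.com/category?cp=1">
<link rel="next" href="https://www.example.com/category?cp=3">
<!-- Self-referencing canonical: points to this page, not back to page 1 -->
<link rel="canonical" href="https://www.example.com/category?cp=2">
```

          Page 1 would carry only a rel=next tag, which is how Google was said to infer the start of the series. (Editor's note: Google stated in 2019 that it no longer uses rel=prev/next as an indexing signal, so treat this as historical advice.)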

          This clip from Google's Maile Ohye is quite old, but the advice in it clears a few things up and is still very relevant today.

          With that said, the other point you raised is very valid - what to do about crawl budget. Google also suggests just leaving them as-is since you're only linking to the first 5 pages and any links beyond that are buried so deep in the hierarchy they're seen as a low priority and will barely be looked at.

          From my understanding (though I'm a little hesitant on this one), noindexed pages do retain their link equity. Noindex doesn't say "don't crawl me" - so it won't help your crawl budget; that would have to be done through robots.txt - it says "don't include me in your index". By this logic, it would make sense that links pointing to a noindexed page are still counted.
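          For contrast, a crawl block lives in robots.txt rather than on the page. An illustrative rule matching the ?cp= pagination pattern discussed in this thread would be:

```text
# robots.txt (illustrative sketch, not a recommendation):
# stop crawlers from fetching paginated category URLs at all.
# Unlike noindex, this does save crawl budget, but a disallowed URL can
# still appear in the index (URL-only) if external links point to it,
# and the links on those pages can never be crawled or followed.
User-agent: *
Disallow: /*?cp=
```

          In other words: robots.txt controls crawling, the noindex meta tag controls indexing, and only pages that are actually crawled can pass the equity of the links they contain.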

          • fablau (in reply to ChrisAshton)

            You are right, hard to give advice without the specific context.

            Well, here is the problem I am facing: we have an e-commerce website, and each category has several hundred if not thousands of pages. I want just the first page of each category to appear in the index, in order not to waste the index cap and to avoid possible duplicate issues; therefore I want to noindex all subsequent pages and index just the first page (which is also the richest).

            Here is an example from our website, our piano sheet music category page:

            http://www.virtualsheetmusic.com/downloads/Indici/Piano.html

            I want that first page to be in the index, but not the subsequent ones:

            http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=2

            http://www.virtualsheetmusic.com/downloads/Indici/Piano.html?cp=3

            etc...

            After playing with canonicals and rel=next, I have realized that Google still keeps those not-so-useful pages in the index, whereas removing them could help with both index-cap issues and possible Panda penalties (too many similar, not very useful pages). But is there any way to keep the possible link equity of those subsequent pages while noindexing them? Or is that link equity preserved on those pages, and on the overall domain, anyway? And better still, is there a way to move all that possible link equity to the first page?

            I hope this makes sense. Thank you for your help!

            • ChrisAshton

              Apologies for the indirect answer, but I would have to ask "why?"

              If these pages are almost identical and you only want one of them to be indexed, in most situations the users would probably benefit from there only being that one main page. Cutting down on redundant pages is great for UX, crawl budget and general site quality.

              Maybe there is a genuine reason for it but without knowing the context it's hard to give accurate info on the best way to handle it 🙂


