
        The Moz Q&A Forum


        Moz Q&A is closed.

        After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

        Soft 404's from pages blocked by robots.txt -- cause for concern?

        Intermediate & Advanced SEO
        • nicole.healthline

          We're seeing soft 404 errors appear in the Google Webmaster Tools section for pages that are blocked by robots.txt (our search result pages).

          Should we be concerned? Is there anything we can do about this?

          • CleverPhD @CleverPhD

            Me too. It was that video that helped clear things up for me. Then I could see when to use robots.txt vs. the noindex meta tag. It has made a big difference in how I manage sites with large amounts of content that can be sorted in a huge number of ways.
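
            To make that distinction concrete, here is a minimal sketch of the robots.txt side of the choice; the paths and the sort parameter are hypothetical examples, not taken from this thread:

            ```
            # robots.txt - hypothetical sketch: keep crawlers away from the
            # many sort-order variants of a listing page.
            # Note: blocking crawling does NOT guarantee removal from the
            # index; a blocked URL can still be indexed from external links.
            User-agent: *
            Disallow: /search/
            Disallow: /*?sort=
            ```

            Googlebot supports the * wildcard shown here, but it is an extension rather than part of the original robots.txt standard, so other crawlers may ignore it.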

            • Highland @CleverPhD

              Good stuff. I was always under the impression they still crawled them (otherwise, how would you know if the block was removed?).

              • CleverPhD @Highland

                Take a look at http://www.youtube.com/watch?v=KBdEwpRQRD0 to see what I am talking about.

                Robots.txt does prevent crawling according to Matt Cutts.

                • Highland

                  Robots.txt prevents indexation, not crawling. The good news is that Googlebot stops crawling 404s.

                  • CleverPhD

                    Just a couple of under-the-hood things to check:

                    1. Are you sure your robots.txt is set up correctly? Check in GWT to confirm that Google is reading it.

                    2. This may be a timing issue. In my experience, errors take 30-60 days to drop out, so did the pages show soft 404s before you added them to robots.txt? If so, this may be a sequence issue: Google found a soft 404 (or some other error), then came back to spider the page and could not crawl it due to robots.txt. Since it no longer knows the current status of the page, it may just keep the last status it found.

                    3. I tend to see soft 404s on pages with a many-to-one 301 association - in other words, a bunch of pages all 301ing to a single page. You may want to change where some of those 301s point so that they go to a specific, relevant page rather than an index page.

                    4. If you have pages in robots.txt because you do not want them in Google, here is what I would do: return a 200 on each page, but add a noindex, nofollow meta tag.

                    http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710

                    "When we see the noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it"

                    Let Google spider the page so it can see the 200 code - that gets rid of the soft 404 errors. The noindex, nofollow meta tags then have the page removed from the Google index. It sounds backwards that you have to let Google spider a page in order to remove it, but it works if you walk through the logic.
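
                    The 200-plus-noindex approach described above can be sketched as follows (a minimal, hypothetical page; the title and body are placeholders):

                    ```
                    <!-- Served with HTTP 200 so Googlebot can crawl the page and see
                         the tag; once seen, the page is dropped from Google's index. -->
                    <html>
                      <head>
                        <meta name="robots" content="noindex, nofollow">
                        <title>Search results</title>
                      </head>
                      <body>...</body>
                    </html>
                    ```

                    For this to work, the page must not also be blocked in robots.txt; if Googlebot cannot fetch the page, it never sees the meta tag.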

                    Good luck!
