    The Moz Q&A Forum


    Moz Q&A is closed.

    After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.

    Crawled page count in Search Console

    Intermediate & Advanced SEO
    • Bob_van_Biezen

      Hi Guys,

      I'm working on a project (premium-hookahs.nl) where I've stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console.

      History:

      Due to technical difficulties, this webshop didn't always noindex filter pages, resulting in thousands of duplicate pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this (see the sketch after the list):

      1. Noindex the filter pages.
      2. Exclude those filter pages in Search Console and robots.txt.
      3. Canonical the filter pages to the relevant category pages.
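
      For reference, a minimal spot-check sketch (Python, standard library only; the URL below is hypothetical, not one of the real filter pages) of how one could verify steps 1 and 3 on a single filter page:

      # Spot-check sketch (hypothetical URL): report whether a filter page
      # serves a robots "noindex" meta tag and a canonical link.
      from html.parser import HTMLParser
      from urllib.request import urlopen

      class RobotsAndCanonicalParser(HTMLParser):
          """Collects <meta name="robots"> content and <link rel="canonical"> href."""

          def __init__(self):
              super().__init__()
              self.robots_content = None
              self.canonical_href = None

          def handle_starttag(self, tag, attrs):
              attrs = dict(attrs)
              if tag == "meta" and attrs.get("name", "").lower() == "robots":
                  self.robots_content = attrs.get("content", "")
              if tag == "link" and attrs.get("rel", "").lower() == "canonical":
                  self.canonical_href = attrs.get("href", "")

      url = "https://www.example-shop.nl/hookahs?color=blue"  # hypothetical filter URL
      parser = RobotsAndCanonicalParser()
      with urlopen(url) as response:
          parser.feed(response.read().decode("utf-8", errors="replace"))

      print("robots meta:", parser.robots_content)  # expect e.g. "noindex, follow"
      print("canonical:  ", parser.canonical_href)  # expect the parent category page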

      This however didn’t result in Google crawling less pages. Although the implementation wasn’t always sound (technical problems during updates) I’m sure this setup has been the same for the last two weeks. Personally I expected a drop of crawled pages but they are still sky high. Can’t imagine Google visits this site 40 times a day.

      To complicate the situation:

      We’re running an experiment to gain positions on around 250 long term searches. A few filters will be indexed (size, color, number of hoses and flavors) and three of them can be combined. This results in around 250 extra pages. Meta titles, descriptions, h1 and texts are unique as well.

      Questions:

      1. Excluding pages in robots.txt should result in Google not crawling those pages, right?
      2. Is this number of crawled pages normal for a website with around 1,000 unique pages?
      3. What am I missing?

      • donford @Bob_van_Biezen

        Bob,

        I doubt that crawlers are going to access the robots.txt file for each request, but they still have to validate any URL they find against the list of blocked ones.
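
        A minimal Python sketch of that behaviour, using the standard library's robots.txt parser (the domain and URLs are hypothetical):

        # robots.txt is fetched and parsed once; every discovered URL is then
        # validated locally against the cached rules, with no further requests
        # for robots.txt itself.
        from urllib.robotparser import RobotFileParser

        rp = RobotFileParser()
        rp.set_url("https://www.example-shop.nl/robots.txt")  # hypothetical domain
        rp.read()  # one request for robots.txt

        discovered = [
            "https://www.example-shop.nl/hookahs",                  # category page
            "https://www.example-shop.nl/hookahs?color=blue",       # filter page
            "https://www.example-shop.nl/hookahs?size=m&color=red", # combined filters
        ]

        for url in discovered:
            # can_fetch() only consults the parsed rules; no network traffic here.
            allowed = rp.can_fetch("Googlebot", url)
            print(url, "->", "crawl" if allowed else "blocked")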

        Glad to help,

        Don

        • Bob_van_Biezen @donford

          Hi Don,

          Thanks for the clear explanation. I always thought disallow in robots.txt would give Google a sort of map (at the start of a site crawl) of the pages on the site that shouldn't be crawled, so it therefore didn't have to "check the locked cars".

          If I understand you correctly, Google checks the robots.txt with every single page load?

          That could definitely explain the high number of crawled pages per day.

          Thanks a lot!

          • donford @Bob_van_Biezen

            Hi Bob,

            About nofollow vs. blocked: in the end I suppose you get the same results, but in practice it works a little differently. When you nofollow a link, it tells the crawler, as soon as it encounters the link, not to request or follow that link path. When you block it via robots.txt, the crawler still attempts to access the URL, only to find it not accessible.

            Imagine if I said go to the parking lot and collect all the loose change in all the unlocked cars. Now imagine how much easier that task would be if all the locked cars had a sign in the window that said "Locked", you could easily ignore the locked cars and go directly to the unlocked ones. Without the sign you would have to physically go check each car to see if it will open.

            About link juice: if you have a link, juice will be passed regardless of the type of link. (You used to be able to use nofollow to preserve link juice, but no longer.) This is a bit unfortunate for sites that use search filters, because they are such a valuable tool for users.

            Don

            • Bob_van_Biezen @donford

              Hi Don,

              You're right about the sitemap; I've noted it on the to-do list!

              Your point about nofollow is interesting. Doesn't excluding in robots.txt give the same result?

              Before we went ahead with the robots.txt, we didn't implement nofollow because we didn't want any link juice to leak away. Since we added the robots.txt exclusions, I assume this doesn't matter anymore, since Google won't crawl those pages anyway.

              Best regards,

              Bob

              • donford

                Hi Bob,

                You can "suggest" a crawl rate to Google by logging into your webmasters tools on Google and adjusting it there.

                As for indexing pages.. I looked at your robots and site. It really looks like you need to employ some No Follow on some of your internal linking, specifically on the product page filters, that alone could reduce the total number of URLS that the crawlers even attempts to look at.

                Additionally your sitemap http://premium-hookahs.nl/sitemap.xml shows a change frequency of daily, and probably should be broken out between Pages / Images so you end up using two sitemaps one for images and one for pages. You may also want to review what is in there. Using ScreamingFrog (free) the sitemap I made (link) only shows about 100 urls.
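
                A minimal Python sketch (hypothetical URLs) of the pages-only sitemap with more realistic change frequencies; an image sitemap would be generated separately in the same way:

                # Write a small pages-only sitemap; URLs and frequencies are illustrative.
                import xml.etree.ElementTree as ET

                NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
                pages = [
                    ("https://www.example-shop.nl/", "weekly"),
                    ("https://www.example-shop.nl/hookahs", "weekly"),
                    ("https://www.example-shop.nl/hookahs/some-product", "monthly"),
                ]

                urlset = ET.Element("urlset", xmlns=NS)
                for loc, changefreq in pages:
                    url_el = ET.SubElement(urlset, "url")
                    ET.SubElement(url_el, "loc").text = loc
                    ET.SubElement(url_el, "changefreq").text = changefreq

                ET.ElementTree(urlset).write(
                    "sitemap-pages.xml", encoding="utf-8", xml_declaration=True
                )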

                Hope it helps,

                Don

                • Bob_van_Biezen @donford

                  Hi Don,

                  Just wanted to add a quick note: your input made me go through the indexation state of the website again, which was worse than I thought it was. I will take some steps to get this resolved, thanks!

                  Would love to hear your input about the number of crawled pages.

                  Best regards,

                  Bob

                  • Bob_van_Biezen @donford

                    Hello Don,

                    Thanks for your advice. What would your advice be if the main goal were to reduce the number of crawled pages per day? I think we have the right pages in the index and the old duplicates are mostly deindexed. At this point I'm mostly worried about Google spending its crawl budget on the right pages. Somehow it still crawls 40,000 pages per day, while we only have around 1,000 pages that should be crawled. Looking at the current setup (with almost everything excluded through robots.txt), I can't think of pages it does crawl to reach the 40k. And 40 times a day sounds like way too many crawls for a normal webshop.
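
                    For what it's worth, one way to see where those requests actually go is the server access logs; a minimal sketch, assuming a standard combined-format log at a hypothetical path:

                    # Count Googlebot requests per path from an access log to see
                    # which URLs account for the daily crawl volume.
                    import re
                    from collections import Counter
                    from urllib.parse import urlsplit

                    LOG_PATH = "/var/log/nginx/access.log"  # hypothetical location
                    request_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+"')

                    counts = Counter()
                    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
                        for line in log:
                            if "Googlebot" not in line:  # crude user-agent filter
                                continue
                            match = request_re.search(line)
                            if not match:
                                continue
                            target = match.group(1)
                            # Separate filter/query URLs from clean paths.
                            key = (urlsplit(target).path, "?" in target)
                            counts[key] += 1

                    for (path, has_query), n in counts.most_common(20):
                        print(f"{n:6d}  {'?query' if has_query else '      '}  {path}")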

                    Hope to hear from you!

                    • donford

                      Hello Bob,

                      Here is some food for thought. If you disallow a page in robots.txt, Google, for example, will not crawl that page. That does not, however, mean they will remove it from the index if it had previously been crawled. Google simply treats it as inaccessible and moves on. It will take some time, months even, before Google finally says "we have no fresh crawls of page x, it's time to remove it from the index."

                      On the other hand, if you specifically allow Google to crawl those pages and they show a noindex tag, Google now has a new directive it can act upon immediately.

                      So my evaluation of the situation would be to do one of two things:

                      1. Remove the disallow from robots.txt and allow Google to crawl the pages again. However, this time use noindex, nofollow tags.

                      2. Remove the disallow from robots.txt and allow Google to crawl the pages again, but use canonical tags pointing to the main "filter" page to prevent further indexing of the specific filter pages.

                      Which option is best depends on the number of URLs being indexed: with a few thousand, canonical would be my choice; with a few hundred thousand, noindex would make more sense.

                      Whichever option you choose, you will have to ensure Google re-crawls the pages and then allow them time to re-index appropriately. Not a quick fix, but a fix nonetheless.
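
                      While you wait for the re-crawl, a minimal Python sketch (hypothetical domain and URLs) for flagging filter pages that are still disallowed in robots.txt while also serving a noindex meta tag, since a crawler cannot see a directive on a page it is not allowed to fetch:

                      import re
                      from urllib.request import urlopen
                      from urllib.robotparser import RobotFileParser

                      rp = RobotFileParser()
                      rp.set_url("https://www.example-shop.nl/robots.txt")  # hypothetical
                      rp.read()

                      filter_urls = [
                          "https://www.example-shop.nl/hookahs?color=blue",
                          "https://www.example-shop.nl/hookahs?size=m",
                      ]

                      # Matches a <meta name="robots" ... content="...noindex..."> tag.
                      noindex_re = re.compile(
                          r"""<meta[^>]+name=["']robots["'][^>]+content=["'][^"']*noindex""",
                          re.I,
                      )

                      for url in filter_urls:
                          blocked = not rp.can_fetch("Googlebot", url)
                          html = urlopen(url).read().decode("utf-8", errors="replace")
                          noindex = bool(noindex_re.search(html))
                          if blocked and noindex:
                              print(f"CONFLICT  {url}: noindex tag is never seen")
                          else:
                              print(f"ok        {url}: blocked={blocked}, noindex={noindex}")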

                      My thoughts and I hope it makes sense,

                      Don
