        The Moz Q&A Forum


        Moz Q&A is closed.

        After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Soft 404s from pages blocked by robots.txt -- cause for concern?

        Intermediate & Advanced SEO
        • nicole.healthline
          nicole.healthline last edited by

We're seeing soft 404 errors appear in Google Webmaster Tools for pages that are blocked by robots.txt (our search result pages).

          Should we be concerned? Is there anything we can do about this?

          • CleverPhD
            CleverPhD @CleverPhD last edited by

Me too. It was that video that cleared things up for me; after watching it, I could see when to use robots.txt vs. the noindex meta tag. It has made a big difference in how I manage sites with large amounts of content that can be sorted in a huge number of ways.

            • Highland
              Highland @CleverPhD last edited by

Good stuff. I was always under the impression they still crawled them (otherwise, how would you know if the block was removed?).

              • CleverPhD
                CleverPhD @Highland last edited by

                Take a look at

                http://www.youtube.com/watch?v=KBdEwpRQRD0

                to see what I am talking about.

                Robots.txt does prevent crawling according to Matt Cutts.

                • Highland
                  Highland last edited by

                  Robots.txt prevents indexation, not crawling. The good news is that Googlebot stops crawling 404s.

                  • CleverPhD
                    CleverPhD last edited by

A couple of under-the-hood things to check.

1. Is your robots.txt set up correctly? Check in GWT to confirm that Google is reading it as you intend.

2. This may be a timing issue. In my experience, errors take 30-60 days to drop out of the report, so did the pages show as soft 404s before you added them to robots.txt?

If so, this may be a sequencing issue. Google found a soft 404 (or some other error), and when it came back to recrawl, robots.txt blocked it. Since Google can no longer determine the current status of the page, it may simply keep reporting the last status it found.
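Beyond eyeballing the file in GWT, the robots.txt rules can be sanity-checked with Python's standard-library robot parser. This is a minimal sketch; the `Disallow` rule and the example.com URLs are hypothetical placeholders for your own search-result paths:

```python
# Sanity-check that a robots.txt rule actually blocks the URLs you expect.
# The rule and URLs below are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls back to the "*" group when no Googlebot-specific group exists.
print(parser.can_fetch("Googlebot", "https://example.com/search/widgets"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/widgets"))         # True
```

The same check can be pointed at a live file with `parser.set_url(".../robots.txt"); parser.read()`.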

3. I tend to see soft 404s on pages with a many-to-one 301 association, in other words, a bunch of pages all 301ing to a single page. You may want to change where some of those 301s point so that they go to a specific, closely related page rather than an index page.
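As a sketch of that redirect change in Apache (all paths here are hypothetical), the idea is to map each retired URL to its closest equivalent instead of funneling everything to one index page:

```apache
# Hypothetical example: point each retired page at its closest equivalent.
Redirect 301 /old-red-widgets  /widgets/red
Redirect 301 /old-blue-widgets /widgets/blue

# A catch-all like the following sends unrelated content to one page,
# which Google often reports as a soft 404:
# RedirectMatch 301 ^/old-.*$ /
```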

4. If a page is in robots.txt because you do not want it in Google, here is what I would do: serve a 200 on that page, remove the robots.txt block so Google can crawl it, and put a noindex, nofollow in the meta tags instead.

                    http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710

                    "When we see the noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it"

Let Google spider the page so that it sees the 200 status code, which clears the soft 404 errors. The noindex, nofollow meta tags then get the page removed from the Google index. It sounds backwards that you have to let Google spider a page to get it removed, but it works if you walk through the logic.
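The tag in question is `<meta name="robots" content="noindex, nofollow">` in the page's `<head>`. A minimal sketch of checking for it with Python's standard-library HTML parser (the sample HTML is hypothetical):

```python
# Detect whether a page's HTML carries a robots meta tag with "noindex".
# The sample HTML below is a hypothetical search-result page.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives += [
                d.strip().lower() for d in attrs.get("content", "").split(",")
            ]

sample = """<html><head>
<meta name="robots" content="noindex, nofollow">
</head><body>Internal search results page</body></html>"""

parser = RobotsMetaParser()
parser.feed(sample)
print("noindex" in parser.directives)  # True
```

Running a check like this over the affected URLs after the change is a quick way to confirm the tag actually made it into the rendered HTML.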

                    Good luck!



