
        Partial Match or RegEx in Search Console's URL Parameters Tool?

        Intermediate & Advanced SEO
• Ria_

          So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them.

          Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789=

All the indexed URLs follow that same kind of format, but I only want to index the URLs that have a par1 of ABC (which could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, maybe using regex?

          Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?

• Andy.Drinkwater @Ria_

            No problem 🙂

            Hope you get it sorted!

            -Andy

• Ria_ @DirkC

              Thank you! 😄

• Ria_ @Andy.Drinkwater

                Haha, I think the train passed the station on that one. I would have realised eventually... XD

                Thanks for your help!

• DirkC

Don't forget that . & ? have a specific meaning within regex - if you want to use them for pattern matching you will have to escape them. Also be aware that not all bots are capable of interpreting regex-style patterns in robots.txt - you might want to be more explicit about the user agent, only using these patterns for Googlebot.

User-agent: Googlebot
# disallow page.php and any parameters after it
Disallow: /page.php
# but allow anything that starts with par1=ABC
Allow: /page.php?par1=ABC

                  Dirk
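
For anyone who wants to sanity-check rules like these offline, here is a minimal Python sketch of Google's documented robots.txt matching (the longest matching pattern wins, and Allow wins ties; '*' and a trailing '$' are the only wildcards, everything else is literal). This is an illustration under those assumptions, not Googlebot itself, and the test paths are just the thread's examples:

import re

def pattern_to_regex(pattern):
    # Translate a robots.txt path pattern into a regex. '*' matches any run
    # of characters and a trailing '$' anchors the end; everything else,
    # including '.', '?' and '&', is escaped and matched literally.
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.compile("^" + body + ("$" if anchored else ""))

def is_allowed(rules, path):
    # rules is a list of (verdict, pattern) pairs, verdict being "allow" or
    # "disallow". Per Google's documentation, the longest matching pattern
    # wins, and on a tie the less restrictive rule (Allow) wins.
    best_key, allowed = None, True  # no matching rule means the URL is allowed
    for verdict, pattern in rules:
        if pattern_to_regex(pattern).match(path):
            key = (len(pattern), verdict == "allow")
            if best_key is None or key > best_key:
                best_key, allowed = key, verdict == "allow"
    return allowed

rules = [("disallow", "/page.php"),
         ("allow", "/page.php?par1=ABC")]

for path in ["/page.php?par1=ABC123=&par2=DEF456=",
             "/page.php?par1=XYZ789=",
             "/other.html"]:
    print(path, "->", "allowed" if is_allowed(rules, path) else "blocked")

Under these semantics the longer Allow outranks the blanket Disallow, which is exactly what Dirk's snippet is counting on.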

• Andy.Drinkwater @Ria_

                    Ah sorry I missed that bit!

                    -Andy

• Andy.Drinkwater @Ria_

"Disallowing them would be my first priority really, before removing from index."

                      The trouble with this is that if you disallow first, Google won't be able to crawl the page to act on the noindex. If you add a noindex flag, Google won't index them the next time it comes-a-crawling and then you will be good to disallow 🙂

                      I'm not actually sure of the best way for you to get the noindex in to the page header of those pages though.

                      -Andy

• Ria_ @Andy.Drinkwater

                        Yep, have done. (Briefly mentioned in my previous response.) Doesn't pass 😞

• Ria_ @Martijn_Scheijbeler

                          I thought so too, but according to Google the trailing wildcard is completely unnecessary, and only needs to be used mid-URL.

• Ria_ @Andy.Drinkwater

                            Hi Andy,

                            Disallowing them would be my first priority really, before removing from index. Didn't want to remove them before I've blocked Google from crawling them in case they get added back again next time Google comes a-crawling, as has happened before when I've simply removed a URL here and there. Does that make sense or am I getting myself mixed up here?

My other hack of a solution would be to check the URL in page.php, and if the URL doesn't include par1=ABC, insert a noindex meta tag (sketched below). (Not sure if that would work well or not...)
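
A minimal sketch of that page-side check, in Python for illustration (the live site is PHP, so only the logic carries over; the function name and URLs are assumptions):

from urllib.parse import urlparse, parse_qs

def robots_meta_for(url):
    # Emit a noindex robots meta tag for any URL whose par1 value does NOT
    # start with 'ABC'; pages with par1=ABC... stay indexable.
    par1 = parse_qs(urlparse(url).query).get("par1", [""])[0]
    if par1.startswith("ABC"):
        return ""  # keep indexable: emit no tag
    return '<meta name="robots" content="noindex">'

print(robots_meta_for("http://www.example.com/page.php?par1=ABC123=&par2=DEF456="))  # prints nothing
print(robots_meta_for("http://www.example.com/page.php?par1=XYZ789="))  # prints the noindex tag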

• Martijn_Scheijbeler @Ria_

                              My guess would be that this line needs an * at the end.
                              Allow: /page.php?par1=ABC*

• Andy.Drinkwater @Ria_

Sorry Martijn, just to jump in here for a second - Ria, you can test this via the robots.txt testing tool in Search Console before going live, to make sure it works.

                                -Andy

• Ria_ @Martijn_Scheijbeler

                                  Hi Martijn, thanks for your response!

                                  I'm currently looking at something like this...

User-agent: *
# disallowing page.php and any parameters after it
Disallow: /page.php
# but leaving anything that starts with par1=ABC
Allow: /page.php?par1=ABC

                                  I would have thought that you could disallow things broadly like that and give an exception, as you can with files in disallowed folders. But it's not passing Google's robots.txt Tester.

One thing that's probably worth mentioning is that there are only two values of the par1 parameter that I want to allow. For example's sake, ABC123 and ABC456. So it would need to be either a partial match or a "this or that" kind of deal, disallowing everything else.
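
Worth noting: robots.txt patterns support only '*' and a trailing '$', not alternation, so a "this or that" match can't be expressed in a single robots rule; it would have to be two separate Allow lines, or live in a page-side check. A small Python regex sketch of the two-value test, using the example values from this thread:

import re

# Match a par1 value of exactly ABC123 or ABC456, wherever it sits in the
# query string. The optional '=?' tolerates the stray trailing '=' that
# appears in the thread's example URLs.
ALLOWED_PAR1 = re.compile(r"(^|&)par1=ABC(123|456)=?(&|$)")

for query in ["par1=ABC123=&par2=DEF456=", "par1=ABC456", "par1=ABC789", "par2=ABC123"]:
    verdict = "keep indexed" if ALLOWED_PAR1.search(query) else "noindex / disallow"
    print(query, "->", verdict)

In robots.txt itself, the equivalent would simply be two Allow lines, one per permitted value.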

• Andy.Drinkwater

                                    Hi Ria,

                                    I have never tried regular expressions in this way, so I can't tell you if this would work or not.

However, if all 1,000 of these URLs are already indexed, just disallowing access won't then remove them from Google. Ideally you would place a noindex tag on those pages and let Google act on them; then you will be good to disallow. I am pretty sure there is no option to noindex under the URL Parameters tool.

                                    I hope that makes sense?

                                    -Andy

• Martijn_Scheijbeler

                                      Hi Ria,

What you could do (though it also depends on the rest of your structure) is disallow these URLs based on their parameters. In a worst-case scenario you could disallow all of the URLs, and then add an Allow exception to make sure the right URLs can still be indexed.

                                      Martijn.
