        The Moz Q&A Forum

        Moz Q&A is closed.

        After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.

        Using 2 wildcards in the robots.txt file

        Intermediate & Advanced SEO
        • seo123456

          I have a URL string which I don't want to be indexed. It includes the characters _Q1 in the middle of the string.

          So in the robots.txt, can I use 2 wildcards in the string to take out all of the URLs with that in it? So something like /*_Q1*. Will that pick up and block every URL with those characters in the string?

          Also, this is not directly off the root, but in a secondary directory, so .com/.../_Q1. So do I have to format the robots.txt as /*/*_Q1* since it will be in the second folder, or will just using /*_Q1* pick up everything no matter what folder it is in?

          Thanks.
          • lonniea

            I'm not 100% positive; however, it does make sense to use it this way.

            User-agent: *
            Disallow: /*_Q1$
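
            As a rough illustration of how Googlebot-style wildcard matching treats rules like these, here is a small Python sketch (the sample paths and the helper name are made up for the example): a '*' in a rule matches any run of characters, including slashes, and a trailing '$' anchors the rule to the end of the URL. Under that behaviour, /*_Q1$ would only block URLs that end in _Q1, whereas /*_Q1 with no $ blocks any URL containing _Q1 at any folder depth, which is closer to what the question describes.

            import re

            def robots_rule_matches(rule: str, path: str) -> bool:
                # Translate a Googlebot-style robots.txt rule into a regex:
                # '*' matches any run of characters (including '/'),
                # a trailing '$' anchors the rule to the end of the URL,
                # anything else is a literal prefix match from the start of the path.
                anchored = rule.endswith("$")
                body = rule[:-1] if anchored else rule
                pattern = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
                return re.match(pattern + ("$" if anchored else ""), path) is not None

            # Hypothetical paths, purely to show the difference between the two rules.
            paths = ["/category/page_Q1", "/category/page_Q1?color=red", "/page_Q1/detail"]
            for rule in ("/*_Q1$", "/*_Q1"):
                blocked = [p for p in paths if robots_rule_matches(rule, p)]
                print(rule, "blocks", blocked)

            So if the goal is to block every URL with _Q1 anywhere in the path, dropping the trailing $ (and keeping the single leading wildcard) should be enough; a /*_Q1 rule does not need an extra wildcard per folder, because * already crosses directory boundaries.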


