        The Moz Q&A Forum


        Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.

Robots.txt: How to block a specific file type in several subdirectories?

        Technical SEO
• LabeliumUSA

Hello everyone!

  I need help setting up a robots.txt file.

  I'm trying to block all PDF files in particular directories, so I'm starting from this directive. In the example below, the line blocks all .gif files across the entire site:

  Block files of a specific file type (for example, .gif) | Disallow: /*.gif$

Two questions:

  • Can I use this directive to target one particular directory in which I want to block PDF files? Will the following line be recognized by Googlebot?

  Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$

• Then I realized that I would have to write as many lines as there are directories in which I want to block PDF files.

  Let's say I want to block PDF files in all three of these directories:

  /fileadmin/directory1

  /fileadmin/directory1/sub1

  /fileadmin/directory1/sub1/pdf

  Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories, instead of repeating the line above once per subdirectory? For example:

  Disallow: /fileadmin/directory1*/

  Many thanks in advance for any insight you may have.

• Rajesh.Prajapati

  Use this code:

  Disallow: /*f$

  If you want to block only one folder, then use this:

  Disallow: /folder1/*f$

  This rule will block both .pdf and .gif files, since both extensions end in f.

• LabeliumUSA @Rajesh.Prajapati

  Hey, thank you for your answer, really appreciate it.
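For reference, a minimal robots.txt sketch of the pattern-matching approach discussed above, assuming Google's documented wildcard extensions (* matches any sequence of characters, including slashes, and $ anchors the end of the URL); the directory names are the asker's examples:

  User-agent: *
  # Blocks PDFs in /fileadmin/directory1 and every subdirectory beneath it
  Disallow: /fileadmin/directory1/*.pdf$

Because * also matches slashes, this single rule should cover /fileadmin/directory1/sub1 and /fileadmin/directory1/sub1/pdf as well, so one line per top-level directory is enough. Note that * and $ are extensions honored by crawlers such as Googlebot and Bingbot rather than part of the original robots exclusion standard, so other crawlers may ignore them.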


              Related Questions

              • Julisn

                CcTLD + Subdirectory for languages

Hey, a client has a .de domain with subdirectories for different languages, so domain.de/de, domain.de/en, domain.de/fr, etc. hreflang tags are implemented, so each language subdirectory references the other language versions. My question is about the combination of ccTLD + language subdirectory. Do you think this is problematic for Google and should be replaced with .com + language subdirectory? We have lots of high-quality domains (from countries with corresponding languages) linking to .de/de and .de/en, some links on .de/fr & .de/es, and 0 links pointing to .de/cn. Thanks in advance!
                Julian

                Technical SEO | | Julisn
                0
              • btreloar

                Robots.txt Syntax for Dynamic URLs

I want to Disallow certain dynamic pages in robots.txt and am unsure of the proper syntax. The pages I want to disallow all include the string ?Page= . Which is the proper syntax?
                Disallow: ?Page=
                Disallow: ?Page=*
                Disallow: *?Page=*
                Or something else?

                Technical SEO | | btreloar
                0
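A hedged sketch of the syntax usually suggested for this case, assuming Google-style wildcard matching (rules are prefix matches, and * matches any characters):

                User-agent: *
                # Blocks any URL containing ?Page=
                Disallow: /*?Page=

The leading /* lets the rule match ?Page= anywhere in the URL; a trailing * is unnecessary because robots.txt rules already match as prefixes.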
              • kcb8178

                Is there a limit to how many URLs you can put in a robots.txt file?

We have a site that has way too many URLs caused by our crawlable faceted navigation. We are trying to purge 90% of our URLs from the indexes. We put noindex tags on the URL combinations that we don't want indexed anymore, but it is taking Google way too long to find the noindex tags. Meanwhile we are getting hit with excessive-URL warnings and have been hit by Panda. Would it help speed the process of purging URLs if we added the URLs to the robots.txt file? Could this cause any issues for us? Could it have the opposite effect and block the crawler from finding the URLs, but not purge them from the index? The list could be in excess of 100MM URLs.

                Technical SEO | | kcb8178
                0
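A cautionary sketch rather than a definitive answer: robots.txt blocking would likely have the opposite effect described above, since Googlebot stops recrawling blocked URLs and so never sees the noindex tags, and already-indexed URLs can linger in the index. Google also only processes roughly the first 500 kibibytes of a robots.txt file, so a 100MM-URL list could not be read in full. If robots.txt were used at all, it would have to target the facets by pattern; the parameter names below are hypothetical placeholders:

                User-agent: *
                # Hypothetical facet parameters; substitute the site's real ones
                Disallow: /*?color=
                Disallow: /*?size=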
              • catalinmoraru

                Blocked URL parameters can still be crawled and indexed by google?

Hi guys, I have two questions, and one might be a dumb question, but here it goes. I just want to be sure that I understand: if I tell Webmaster Tools to ignore a URL parameter, will Google still index and rank my URL? Is it OK if I don't append the brand filter in the URL structure? Will I still rank for that brand? Thanks. PS: OK, 3 questions :)...

                Technical SEO | | catalinmoraru
                0
              • irvingw

                Allow or Disallow First in Robots.txt

If I want to override a Disallow directive in robots.txt with an Allow command, do I put the Allow command before or after the Disallow command? Example: Allow: /models/ford/*/*/page* Disallow: /models/*/*/*/page*

                Technical SEO | | irvingw
                0
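For what it's worth, a sketch of how this is commonly resolved: for Googlebot, order is documented not to matter; the most specific matching rule (the one with the longest path) wins, and on a tie the least restrictive (Allow) rule applies. So a file like the following allows /models/ford/ while blocking the rest of /models/, whichever line comes first:

                User-agent: Googlebot
                Disallow: /models/
                Allow: /models/ford/

Some other crawlers follow first-match ordering from the older robots.txt draft, so listing Allow before Disallow is the conservative choice.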
              • vforvinnie

                Block Quotes and Citations for duplicate content

I've been reading about the proper use of block quotes and citations lately, and wanted to see if I was interpreting it the right way. This is what I read: http://www.pitstopmedia.com/sem/blockquote-cite-q-tags-seo

                So basically my question is: if I wanted to reference Amazon or another store's product reviews, could I use the blockquote and citation tags around their content so it doesn't look like duplicate content? I think it would be great for my visitors, but also for the source, as I am giving them credit. It would also be a good source to link to on my product pages, as I am not competing with the manufacturer for sales. I could also do this for product information right from the manufacturer.

                I want to do this for a contact lens site. I'd like to use Acuvue's reviews from their website, as well as some of their product descriptions. Of course I have my own user reviews and content for each product on my website, but I think some official copy could do well.

                Would this be the best method? Is this how Rottentomatoes.com does it? On every movie page they have 2-3 sentences from 50 or so reviews, and not much unique content of their own.

                Cheers, Vinnie

                Technical SEO | | vforvinnie
                1
              • ErnieB

                Subdomain Removal in Robots.txt with Conditional Logic??

I would like to see if there is a way to add conditional logic to the robots.txt file, so that when we push from DEV to PRODUCTION and the robots.txt file is pushed, we don't have to remember to NOT push the robots.txt file OR edit it when it goes live. My specific situation is this: I have www.website.com, dev.website.com and new.website.com, and somehow Google has indexed DEV.website.com and NEW.website.com. I'd like these to be removed from Google's index, as they are causing duplicate content. Should I: a) add 2 new GWT entries for DEV.website.com and NEW.website.com and VERIFY ownership? If I do this, then when the files are pushed to LIVE, won't the files contain the VERIFY META CODE for the DEV version even though it's now LIVE? (Hope that makes sense.) b) write a robots.txt file that specifies "DISALLOW: DEV.website.com/"? Is that possible? I have only seen examples of DISALLOW with a "/" at the beginning... Hope this makes sense, can really use the help! I'm on a Windows Server 2008 box running ColdFusion websites.

                Technical SEO | | ErnieB
                0
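A sketch of the usual workaround, given that robots.txt is fetched and applied per hostname and cannot reference other hosts: serve a different robots.txt on the dev and new hostnames (via a rewrite rule or a deploy step, for instance) that blocks everything:

                User-agent: *
                # Served only at dev.website.com/robots.txt and new.website.com/robots.txt
                Disallow: /

The file at www.website.com/robots.txt stays untouched. A directive like "DISALLOW: DEV.website.com/" is not valid, since Disallow values must be URL paths beginning with /.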
              • dreadmichael

                How to find a specific link on my website (currently causing redirects)

                Hi everyone, I've used crawlers like Xenu to find broken links before, and I love these tools. What I can't figure out is how to find specific pieces of code within my site. For example, Webmaster Tools tells me there are still links to old pages somewhere on my website but I just can't find them. Do you know of a crawler that can search for a specific link within the html? Thanks in advance, Josh

                Technical SEO | | dreadmichael
                0
