Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
How do I block an entire category/directory with robots.txt?
-
Does anyone have any idea how to block an entire product category, including all the products in that category, using the robots.txt file? I'm using WooCommerce in WordPress and I'd like to prevent bots from crawling every single one of my product URLs for now.
The confusing part right now is that I have several different URL structures linking to every single one of my products, for example www.mystore.com/all-products, www.mystore.com/product-category, etc.
I'm not really sure how I'd type it into the robots.txt file, or where to place the file.
Any help would be appreciated, thanks!
-
Thanks for the detailed answer, I will give it a try!
-
Hi
This should do it; you place the robots.txt file in the root directory of your site (e.g. www.mystore.com/robots.txt).
User-agent: *
Disallow: /product-category/
You can check out some more examples here: http://www.seomoz.org/learn-seo/robotstxt
As for the multiple URLs linking to the same pages, you will just need to check all possible variants and make sure you have them covered in the robots.txt file.
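As a sketch, assuming the URL structures mentioned above plus WooCommerce's default /product/ permalink base (check your own permalink settings, as your paths may differ), the robots.txt file might look something like this:

```
User-agent: *
Disallow: /product-category/
Disallow: /all-products/
Disallow: /product/
```

Each Disallow rule is a path-prefix match, so one rule per URL structure covers every product page underneath that path.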
Google Webmaster Tools has a page you can use to check whether the robots.txt file is doing what you expect (under Health -> Blocked URLs).
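You can also sanity-check the rules locally before uploading the file, for instance with Python's standard urllib.robotparser. The store URL and product path below are hypothetical examples, not taken from your site:

```python
# Sketch: verify locally that a robots.txt rule blocks the intended URLs.
from urllib.robotparser import RobotFileParser

# The same rules you would place in www.mystore.com/robots.txt
rules = """\
User-agent: *
Disallow: /product-category/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A product URL under the blocked directory is disallowed...
print(parser.can_fetch("*", "https://www.mystore.com/product-category/red-shoes/"))  # False
# ...while other pages remain crawlable.
print(parser.can_fetch("*", "https://www.mystore.com/about/"))  # True
```

This mirrors the prefix matching that well-behaved crawlers apply, so it is a quick way to test all your URL variants in one go.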
It might be easier to block the pages with a meta tag, as described in the link above, if you are running a plugin that allows this; that should also take care of all the different URL structures.
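For reference, the meta tag approach places something like this in the head of each product page (a plugin would typically add it for you):

```html
<meta name="robots" content="noindex, follow">
```

One caveat: for crawlers to see the noindex tag, the pages must not also be blocked in robots.txt, since a page that is blocked from crawling never has its meta tags read.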
Hope that helps!
Related Questions
-
Removing navigation menu items/links on homepage
We are redesigning our website after a long stint with an SEO firm that also handled our design/dev. We want to clean up the links on our homepage but don't want to screw up our IA or SEO. We want to delete some navbar menu items and a whole bunch of random links to our evergreen content below the fold. Would we need to reposition those navbar items/content links to our footer or somewhere else on the homepage to maintain our internal linking structure? It would be great if you could take a look at our site and give us any suggestions or advice on the best way to go about this. Thanks!
On-Page Optimization | Lorne_Marr
-
H2s & H3s for Category Navigation
Hi all. I am wondering how best to format a category navigation menu. Currently I don't think we're using H2s correctly on our website. Am I right to think that the top-level category (e.g. Games) should be formatted as an H2 and the sub-categories underneath it as H3s (to show a hierarchy)? Is there a limit on how many H2s and H3s you should use? Obviously only one H1 per page. Thanks in advance, Paul
On-Page Optimization | kevinliao
-
Blocking Subdomain from Google Crawl and Index
Hey everybody, how is it going? I have a simple question that I need answered. I have a main domain, let's call it domain.com. Our company will soon launch a series of promotions for which we will use CNAME subdomains, e.g. try.domain.com or buy.domain.com. They will serve a commercial objective, nothing more. What is the best way to block such subdomains from being indexed in Google, and from counting as part of domain.com? Robots.txt, noindex, etc.? Hope to hear from you, Best Regards,
On-Page Optimization | JesusD
-
Canonical URL, cornerstone page and categories
If I want a cornerstone "page", can I substitute a category archive of posts (one that contains many posts targeting the key phrase) for an actual page? That way, if I write blog posts about a certain topic/key phrase (for example "beach weddings") and add a canonical URL pointing to the category archive page on the individual posts, am I right to assume Google will see the archive page as the cornerstone page (and won't see the individual posts with the same key phrase as competing)?
On-Page Optimization | | stephanwb0 -
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites which have the same content as the primary website. This leads to duplicate content errors. Because there are many URLs with duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh
-
Do Parent Categories Hurt SEO?
I have parent categories and subcategories. Will it be harder for the subcategories to rank well because they have a parent category? The URL is longer, for one. I am just wondering whether I should have parent categories at all. I have one category page doing really well and I am trying to boost the others (most of which are subcategories), and this is a concern for me. Thanks! Edit: I also have a category that has 2 parent categories. I want it automatically in those 2 categories and one of its own. By itself it is a very important keyword. Is this OK, or should I make it a parent category?
On-Page Optimization | 2bloggers
-
Right way to block Google robots from PPC landing pages
What is the right way to completely block search engine robots from my AdWords landing pages? Robots.txt does not work really well for that, as far as I know. Adding the meta tags noindex and nofollow, on the other hand, would block the AdWords robot as well, right? Thank you very much, Serge
On-Page Optimization | Kotkov
-
How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain was requested. My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is used by multiple domains? If I, for instance, put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
On-Page Optimization | nordicnetproducts