How to exclude URL filter searches in robots.txt
-
When I look through my Moz reports I can see they include 'pages' which they shouldn't have, i.e. URLs generated by filtering rules, such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in the robots.txt? I think it'll be:
Disallow: /*?color=$
Is that the correct syntax with the $ sign in it? Thanks!
-
Unless you're specifically calling out Bing or Baidu in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
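For illustration, a robots.txt that did call out a specific crawler might look like this (a hypothetical sketch; note that a bot matching its own group ignores the generic * group entirely, so any rules it still needs must be repeated there):

User-agent: Bingbot
Disallow: /*?color=
Disallow: /?*manufacturer=

User-agent: *
Disallow: /*?color=
Disallow: /?*manufacturer=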
-
Yes, but what about Bing and the rest of the search engines?
-
Adrian,
I agree that there certainly is a right answer to the question posted, as the question asks specifically about one way to manage the issue, namely a block of filters in the robots.txt file. What I was getting at is that this may or may not be the "best" way, and that I'd need to look at your site and your unique situation to figure out which would be the best solution for your needs.
It is very likely that with these parameters a robots.txt block is the best approach, assuming the parameters aren't added by default to category page or category pagination navigational links, as then the block would affect the bot's ability to crawl the site. Also, if people are linking to those URLs (highly unlikely, though), you may consider a robots meta noindex,follow tag instead so that PageRank can flow to other pages.
And I'm not entirely sure the code you provided above will work if the blocked parameter is the first one in the string (e.g. domain.com/category/?color=red), as there is the additional wildcard between the ? and the parameter. I would advise testing this in Google Webmaster Tools first (a rough offline sketch also follows the steps below):
- On the Webmaster Tools Home page, click the site you want.
- Under Crawl, click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, list the URLs you want to test against that file.
- In the User-agents list, select the user-agents you want (e.g. Googlebot).
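If you want a rough offline check as well, the wildcard matching can be approximated in a few lines of Python (a sketch only, not Google's actual implementation; the translation of * and $ into a regex is an assumption based on Google's documented matching rules, so Webmaster Tools remains the authoritative test):

import re

def robots_rule_to_regex(rule):
    # Approximate Googlebot's semantics: '*' matches any run of
    # characters, a trailing '$' anchors the end of the URL, and
    # everything else is matched literally from the start of the path.
    parts = []
    for ch in rule:
        if ch == "*":
            parts.append(".*")
        elif ch == "$":
            parts.append("$")
        else:
            parts.append(re.escape(ch))
    return re.compile("".join(parts))

rules = ["/*?color=", "/?*manufacturer="]
paths = [
    "/brands?color=364&manufacturer=505",
    "/category/?color=red",
    "/?manufacturer=505",
]

for rule in rules:
    rx = robots_rule_to_regex(rule)
    for path in paths:
        # Disallow patterns are prefix matches anchored at the start
        # of the path, hence re.match rather than re.search.
        verdict = "blocked" if rx.match(path) else "allowed"
        print(rule, path, verdict)

Under this approximation, /*?color= blocks both filtered example URLs, while /?*manufacturer= only matches when the ? immediately follows the root slash (e.g. /?manufacturer=505), which echoes the concern above about where the wildcard sits; /*manufacturer= would be the broader pattern to test if the goal is to catch the parameter anywhere.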
-
There certainly is a right answer to my question - I already posted it here earlier today:
Disallow: /*?color=
Disallow: /?*manufacturer=
Without the $ at the end, which would otherwise denote the end of the URL.
-
Hello Adrian,
The Moz reports are meant to help you uncover issues like this. If you're seeing non-canonical URLs in the Moz report then there is a potential issue for Google, Bing and other search engines as well.
Google does respect wildcards (*) in the robots.txt file, though they can easily be done wrong. There is no right or wrong answer to the issue of using filters or faceted navigation, as each circumstance is going to be different. However, I hope some of these articles will help you identify the best approach for your needs:
(Note: Faceted navigation is not exactly the same as category filters, but the issues and possible solutions are very similar.)
- Building Faceted Navigation That Doesn't Suck
- Faceted Navigation Whiteboard Friday
- Duplicate Content: Block, Redirect or Canonical
- Guide to eCommerce Facets, Filters and Categories
- Rel Canonical How To and Why Not
- Moz.com Guide to Duplicate Content
I don't know how your store handles these filters (e.g. does it add them automatically, or only when a user selects one?), so I can't give you the answer, but I promise that if you read the articles above you will have a very good understanding of all of the options and can choose which is best for you. That might end up being as simple as blocking the filters in your robots.txt file, or you may opt for rel canonical, a noindex meta tag, AJAX, Google parameter handling, etc.
Good luck!
-
It's not Google's index that I'm interested in in this case; it's the Moz reports. Moz was including over 10,000 'pages' because it was crawling these filtered URLs. Now that I know how to edit the robots.txt file, Moz will be prevented from picking them up again (we only have around 2,000 real pages, not 10,000).
-
I sought out the answer from a developer and got the following reply, so I'm posting it here in case it helps someone else:
To exclude pages with color or manufacturer in them you can use:
Disallow: /*?color=
Disallow: /?*manufacturer=
The dollar sign ($) in your attempt should be omitted, as it denotes the end of the URL.
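In other words, the final rules would sit under a user-agent group in the robots.txt file, for example (a sketch assuming they should apply to all crawlers):

User-agent: *
Disallow: /*?color=
Disallow: /?*manufacturer=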
-
Hi
I would recommend excluding these in Google Webmaster Tools. Once logged in to your account, under the "Crawl" menu you will find "URL Parameters". Find the relevant parameters in the list on this page and you can tell Google not to crawl these pages.
Hope this helps.
Steve