How to safely exclude search result pages from Google's index?
-
Hello everyone,
I'm wondering what's the best way to prevent/block search result pages from being indexed by Google. The way search works on my site, the search form generates URLs like:
/index.php?blah-blah-search-results-blah
I want to block everything of that sort, but how do I do it without blocking /index.php itself?
Thanks in advance and have a great day everyone!
-
Hi Louise,
If you can ID the parameters, you can also look at blocking these in Webmaster Tools. This page explains more. As with any blocking of URLs, of course, proceed with caution.
-
I agree that can be effective. The reason I suggested robots.txt is that Louise mentioned "blocking and preventing" as an objective. Robots.txt is particularly useful when the results come from a search bar or something of that nature. A NOINDEX, FOLLOW tag will not stop bots from crawling those pages (and getting tired and dizzy), whereas robots.txt can "block and prevent" bots from crawling certain parameters.
With all of that said, I think it is important to understand whether you need the bots to crawl but not index (in which case Spencer's answer is correct), or whether you need to prevent bots from crawling those parameters altogether.
Hope that makes it clearer.
-
I'm not sure that robots.txt is effective when URL parameters are involved.
I would just add a meta robots tag to the head section of the search results template.
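Something along these lines, assuming your results template has its own head section you can edit:

<meta name="robots" content="noindex, follow">

That keeps the results pages out of the index while still letting bots follow any links on them.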
-
If you are able to identify a URL parameter, you can exclude those URLs using robots.txt. Here is a great resource on robots.txt - http://a-moz.groupbuyseo.org/learn/seo/robotstxt
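As a rough sketch, assuming every results URL contains "search" somewhere in its query string (as in the /index.php?blah-blah-search-results-blah example - adjust the pattern to whatever your real URLs use), something like this blocks the results pages without touching /index.php itself:

User-agent: *
# Block parameterised search-result URLs; /index.php with no query string stays crawlable
Disallow: /index.php?*search

Bear in mind this only stops crawling - pages that are already indexed may need the noindex route (or a removal request in Webmaster Tools) before they actually drop out of the results.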