Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Why are pages still showing in SERPs, despite being NOINDEXed for months?
- We have thousands of pages we've been trying to get de-indexed in Google for months now. They've all got <meta name="robots" content="none">, but they simply will not go away in the SERPs. Here is just one example: http://bitly.com/VutCFi If you search this URL in Google, you will see that it is indexed, yet it has had the tag for many months. This is just one example of thousands of pages that will not get de-indexed. Am I missing something here? Does it have to do with using content="none" instead of content="noindex, follow"? Any help is very much appreciated.
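Before digging into crawl timing, it's worth confirming exactly what directive the bot sees on these pages. A minimal sketch (Python standard library only; the helper name and sample markup are made up for illustration) that pulls the meta robots directives out of a page's HTML:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content value of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def robots_directives(html):
    """Return the meta robots directives found in an HTML document."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives

# "none" is shorthand for "noindex, nofollow"
page = '<html><head><meta name="robots" content="none"></head><body></body></html>'
print(robots_directives(page))  # ['none']
```

Running this against the rendered HTML of a few of the affected URLs would rule out the tag being dropped or malformed on some templates.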
- Thanks for your reply. Let me know if you are able to get those pages de-indexed - I'll wait. Also, please share what you implemented to de-index them.
- A page can have a link pointing to it and still not be indexed, so I disagree with you on that. But thanks for using the domain name. That will teach me to use a URL shortener...
- Hm, that is interesting. So you're saying that it will get crawled, and thus will eventually be de-indexed (since noindex is part of the content="none" directive), but because it's a dead-end page, it just takes an extra long time for that particular page to be recrawled?
- Just to add to the other answers: you can also remove the URLs (or an entire directory, if necessary) via the URL removal tool in Webmaster Tools, although Google prefers that you reserve it for emergencies of sorts (I've had no problems with it). http://support.google.com/webmasters/bin/answer.py?hl=en&answer=164734
- No, nofollow only tells the bot that the page is a dead end - that it should not follow any links on the page. That means any links on those pages won't be visited by the bot, which slows down crawling of those pages overall. Blocking a page in robots.txt is different: if the page is already in the index, it will stay there, because the bot can no longer visit it and so will never see the noindex (or content="none") tag that would get it removed - it just won't be visited anymore.
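The robots.txt conflict described above can be checked mechanically. A small sketch using Python's stdlib urllib.robotparser (the example.com paths and the Disallow rule are hypothetical) showing how a robots.txt block stops the bot before it can ever read the meta noindex on the page:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the very pages we want de-indexed
robots_txt = """
User-agent: *
Disallow: /products/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The bot is never allowed to fetch the blocked page, so it can never
# see the meta noindex on it - the stale copy stays in the index.
print(rp.can_fetch("Googlebot", "http://example.com/products/old-page.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/about.html"))              # True
```

If a check like this returns False for the pages carrying the noindex tag, the robots.txt rule has to be lifted before the tag can take effect.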
- OK, so nofollow stops the page from being read at all? I thought nofollow just meant that the links on the page will not be followed. Is meta nofollow essentially the same as blocking a page in robots.txt?
- Hi Howard, the page is in Google's index because you are still linking to it from your website. Here is the page that links to it: http://www.2mcctv.com/product_print-productinfoVeiluxVS70CDNRDhtml.html Because you keep linking to the page, Google keeps it indexed - Google had already indexed it before it came to know about the "noindex" tag. (Sorry for my bad English.) Lindsay has written an awesome post about this here: http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts After reading that post, all my doubts about noindex, follow, and robots.txt were cleared up. Thanks, Lindsay.
- We always put the noindex directive in our robots.txt file.
- Hi, in order to de-index you should use noindex, because content="none" also means nofollow. Right now you actually need "follow", so the bot can reach all the other pages, see the noindex tag, and drop them from the index. Once they are all out of the index, you can set "none" again. This is the main reason the "none" value is not widely used: it's easy to shoot yourself in the foot with it. On the other hand, you need to check whether Googlebot is actually reaching those pages:
- check first that you don't have any robots.txt restrictions;
- see when Google's bot last hit any of the pages - that will give you a good idea and let you make a prediction.
 If those pages are in the supplemental index, you may need to wait some time for Googlebot to revisit. One last note: build XML sitemaps with all of those pages and submit them via WMT - that will definitely help to get them in front of the firing squad, and also let you monitor them better. Hope it helps.
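The sitemap suggestion in that last note can be sketched with the standard library alone. A minimal helper (the function name and URLs are hypothetical) that emits a sitemap.xml listing the noindexed pages, ready to submit via WMT so the bot revisits them sooner:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml body listing pages we want recrawled."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    # encoding="unicode" returns a str instead of bytes
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages still carrying the noindex tag
pages = [
    "http://www.example.com/product_print-1.html",
    "http://www.example.com/product_print-2.html",
]
print(build_sitemap(pages))
```

Submitting a dedicated sitemap like this also gives you a per-file indexed-URL count in Webmaster Tools, which makes it easy to watch the pages drop out of the index over time.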