Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Google Search Console says 'sitemap is blocked by robots.txt'?
Google Search Console is telling me "Sitemap contains URLs which are blocked by robots.txt." I don't understand why my sitemap is being blocked. My robots.txt looks like this:

`User-Agent: *
Disallow:`

It's a WordPress site with Yoast SEO installed. Is anyone else having this issue with Google Search Console? Does anyone know how I can fix it?
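For what it's worth, that robots.txt can be tested locally with Python's standard library. A minimal sketch ("example.com" is a placeholder, not the poster's real domain):

```python
# Minimal sketch: parse the robots.txt from the question locally
# (no network access needed) and test URLs against it.
from urllib.robotparser import RobotFileParser

robots_txt = """User-Agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# An empty Disallow blocks nothing, so every URL should come back allowed:
for url in ("https://example.com/post-sitemap.xml",
            "https://example.com/some-post/"):
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")
```

If this prints "allowed" for the flagged URLs, the live robots.txt Google fetched is not the one shown above, which would point to a caching or plugin issue rather than the file itself.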
Nice, happy to hear that. Do you work with Greg Reindel? He is a good friend. I looked at your IP, that is why I ask. Tom
I agree with David. Hey, is your dev Greg Reindel? If so, you can call me for help. PM me here for my info. Thomas Zickell
Hey guys, I ended up disabling the sitemap option in Yoast SEO, then installed the 'Google (XML) Sitemaps' plug-in. I re-submitted the sitemap to Google last night, and it came back with no issues. I'm glad to finally have this sorted out. Thanks for all the help!
 Hi Christian, The current robots.txt shouldn't be blocking those URLs. Did you or someone else recently change the robots.txt file? If so, give Google a few days to re-crawl your site. Also, can you check what happens when you do a fetch and render on one of the blocked posts in Search Console? Do you have issues there? Cheers, David 
I think you need to make your robots.txt file reference https if you are running https (see https://a-moz.groupbuyseo.org/blog/xml-sitemaps):

`User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://domain.com/index-sitemap.xml`

(that is an https sitemap). Can you send the sitemap URL, or run it through DeepCrawl? Hope this helps. Did you make a new robots.txt file?
Thanks for the response. Do you think this is a robots.txt issue, or could it be caused by the Yoast SEO plugin? Do you know if this plug-in works together with Yoast SEO, or will it cause issues?
Thank you for the response. I just scanned the site using Screaming Frog. Under Internal > Directives there were zero 'noindex' links. I also checked for 404 errors, server 5xx errors, and anything 'blocked by robots.txt'. Google Search Console is still showing me that there are URLs being blocked in my sitemap (I added a screenshot of this). When I click through, it tells me that the 'post sitemap' has over 300 warnings. I have just deleted the Yoast SEO plugin and am now re-installing it. Hopefully this fixes the issue.
No, you do not need to change plug-ins. What is happening is that Webmaster Tools is telling you there is a noindex, nofollow, or robots X-Tag somewhere on the URLs inside your sitemap. Run your site through Moz, Screaming Frog SEO Spider, or DeepCrawl and look for noindexed URLs. Webmaster Tools / Search Console is telling you that you have noindex URLs inside your XML sitemap, not that your robots.txt is blocking them. This would be set in the Yoast plugin. One way to correct it is to look for noindex URLs and filter them inside Yoast so they are not presented to the crawlers. If you would like, you can turn the sitemap off in Yoast and turn it back on. If that does not work, I recommend completely removing the plug-in and reinstalling it:

- https://kb.yoast.com/kb/how-can-i-uninstall-my-plugin/
- https://kinsta.com/blog/uninstall-wordpress-plugin/

Can you send a screenshot of what you're seeing? When you see it in Google Webmaster Tools, are you saying the XML sitemap itself is noindexed? All XML sitemaps are noindexed. Please add this to your robots.txt:

`User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: http://www.website.com/sitemap_index.xml`

I hope this is of help, Tom
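Tom's distinction between noindexed URLs inside the sitemap and a robots.txt block can be spot-checked by hand. A rough sketch (both helper functions are hypothetical, and the regex assumes `name=` appears before `content=` in the meta tag):

```python
import re
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_xml: str) -> list:
    """Pull the <loc> entries out of a (non-index) XML sitemap."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]

def has_noindex(page_html: str) -> bool:
    """True if the page carries a noindex robots meta tag
    (assumes name= comes before content= in the tag)."""
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        re.IGNORECASE)
    return bool(pattern.search(page_html))

# Each URL returned by sitemap_urls() would then be fetched and run
# through has_noindex(); any hits are the URLs the warning is about.
```

Crawlers like Screaming Frog do the same audit at scale, so this is only useful for quickly confirming a handful of suspect URLs.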
Hi, use this plugin: https://wordpress.org/plugins/wp-robots-txt/ It will remove the previous robots.txt and set a simple WordPress robots.txt; wait a day and the problem may be solved. Also watch this video on the same topic: https://www.youtube.com/watch?v=DZiyN07bbBM Thanks
Related Questions
- Role of Robots.txt and Search Console parameters settings
 Hi, wondering if anyone can point me to resources or explain the difference between these two. If a site has URL parameters disallowed in robots.txt, is it redundant to edit settings in Search Console parameters to anything other than "Let Googlebot Decide"? Technical SEO | LivDetrick
- Errors In Search Console
 Hi All, I am hoping someone might be able to help with this. Last week one of my sites dropped from mid page 1 to the bottom of page 1. We had not been link building as such, and it only seems to have affected a single search term and the ranking page (which happens to be the home page). When I was going through everything I went to Search Console, and in crawl errors there are 2 errors that showed up as detected 3 days before the drop. These are: wp-admin/admin-ajax.php showing as response code 400, and xmlrpc.php showing as response code 405. robots.txt is as follows:

`user-agent: *
disallow: /wp-admin/
allow: /wp-admin/admin-ajax.php`

 Any help with what is wrong here and how to fix it would be greatly appreciated. Many Thanks Technical SEO | DaleZon
- How to remove Parameters from Google Search Console?
 Hi All, following is a parameter configuration in Search Console:
 - Parameter: fl
 - Does this parameter change page content seen by the user? Yes: changes, reorders, or narrows page content.
 - How does this parameter affect page content? Narrows.
 - Which URLs with this parameter should Googlebot crawl? Let Googlebot decide (default).
 Query: actually it is a filter parameter. I have already set a canonical on the filter page. Now I am tracking filter pages via the data layer and Tag Manager, so in Google Analytics I am not able to see filter URLs because of this parameter. So I want to delete this parameter. Can anyone please help me? Thanks! Technical SEO | adamjack
- I accidentally blocked Google with robots.txt. What next?
 Last week I uploaded my site and forgot to remove the robots.txt file with this text:

`User-agent: *
Disallow: /`

 I dropped from page 11 on my main keywords to past page 50. I caught it 2-3 days later and have now fixed it. I re-imported my sitemap with Webmaster Tools and I also did a Fetch as Google through Webmaster Tools. I tweeted out my URL to hopefully get Google to crawl it faster too. Webmaster Tools no longer says that the site is experiencing outages, but when I look at my blocked URLs it still says 249 are blocked. That's actually gone up since I made the fix. In the Google search results, it still no longer has my page title, and the description still says "A description for this result is not available because of this site's robots.txt – learn more." How will this affect me long-term? When will I recover my rankings? Is there anything else I can do? Thanks for your input! www.decalsforthewall.com Technical SEO | Webmaster123
- HELP: Wrong domain showing up in Google Search
 So I have this domain (1) devicelock.com, and I also had this other domain (2) ntutility.com. The 2nd domain was an old domain and it is not in use anymore. But when I search for devicelock on Google, the homepage devicelock.com does not show up; only ntutility.com comes up. I asked one of the developers how the redirect is happening from the old domain to the new one, and he told me it's through a DNS forward, and there is no way to have an .htaccess file to set up a 301 instead. Please help! Technical SEO | Devicelock
- How to generate a visual sitemap using sitemap.xml
 Are there any tools (online preferably) which will take a sitemap.xml file and generate a visual site map? Seems like an obvious thing to do, but I can't find any simple tools for this. Technical SEO | k3nn3dy3
- How to push down outdated images in Google image search
 When you do a Google image search for one of my client's products, you see a lot of first-generation hardware (the product is now in its third generation). The client wants to know what they can do to push those images down so that current product images rise to the top. FYI: the client's own image files on their site aren't very well optimized with keywords. My thinking is to have the client optimize their own images and the ones they give to the media with relevant keywords in file names, alt text, etc. Eventually, this should help push down the outdated images. Any other suggestions? Thanks so much. Technical SEO | jimmartin_zoho.com
- Should I set up a disallow in the robots.txt for catalog search results?
 When the crawl diagnostics came back for my site, it showed around 3,000 pages of duplicate content. Almost all of them are of the catalog search results page. I also did a site search on Google, and they have most of the results pages in their index too. I think I should just disallow the bots in the /catalogsearch/ sub folder, but I'm not sure if this will have any negative effect. Technical SEO | JordanJudson
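For reference, the disallow described in that question is a two-line robots.txt fragment. A sketch, assuming the search results live under /catalogsearch/ as stated:

```
User-agent: *
Disallow: /catalogsearch/
```

Matching is prefix-based, so this stops crawling of every URL whose path starts with /catalogsearch/; pages that are already indexed will only drop out over time (or with a noindex), since a disallow blocks crawling rather than indexing.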