Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core code, so there are no sub-directories per language. The developer has created sitemaps, which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as:

- /sitemap/uk/sitemap.xml
- /sitemap/de/sitemap.xml

I want to add the sitemaps to robots.txt but can't figure out how to do it. Also, should they have placed the sitemaps in a single location, with the file name identifying each language?

- /sitemap/uk-sitemap.xml
- /sitemap/de-sitemap.xml

What is the cleanest way of handling these sitemaps, and can/should I reference them in robots.txt?
Adding the following lines to the bottom of your robots.txt should do it:

Sitemap: http://www.example.com/sitemap/uk/sitemap.xml
Sitemap: http://www.example.com/sitemap/de/sitemap.xml

Renaming the files wouldn't hurt, but I don't think you'll have any problems with how they are currently set up. If you have submitted them to WMT and they are being picked up OK, I think you are fine.
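For context, here is a minimal sketch of how those directives sit in a complete robots.txt; the example.com host and the permissive crawl rule are placeholders rather than the actual site configuration:

```
# Served at http://www.example.com/robots.txt
# Sitemap directives take absolute URLs and are independent of any User-agent group,
# so the per-language directories the developer chose work just as well as combined file names.
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap/uk/sitemap.xml
Sitemap: http://www.example.com/sitemap/de/sitemap.xml
```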
Related Questions
- Robots.txt allows wp-admin/admin-ajax.php
  Hello, Mozzers! I noticed something peculiar in the robots.txt used by one of my clients: Allow: /wp-admin/admin-ajax.php. What would be the purpose of allowing a search engine to crawl this file? Is it OK? Should I do something about it? Everything else on /wp-admin/ is disallowed. Thanks in advance for your help. -AK
  Technical SEO | AndyKubrin
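  For reference, a sketch of the common WordPress pattern this question describes; the surrounding rules are assumed here, but the Allow line typically exists so that front-end features calling admin-ajax.php keep working while the rest of /wp-admin/ stays blocked:

  ```
  # Illustrative WordPress-style robots.txt fragment, not the client's actual file
  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php
  ```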
- Robots.txt Syntax for Dynamic URLs
  I want to Disallow certain dynamic pages in robots.txt and am unsure of the proper syntax. The pages I want to disallow all include the string ?Page= Which is the proper syntax?
  Disallow: ?Page=
  Disallow: ?Page=*
  Disallow: ?Page=
  Or something else?
  Technical SEO | btreloar
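  As a point of reference, major crawlers such as Googlebot support the * wildcard in robots.txt paths, so a pattern along the following lines (an illustrative sketch, not a quote from the thread) would match any URL containing the ?Page= string:

  ```
  User-agent: *
  # Matches any URL containing the literal string "?Page=", e.g. /category?Page=2
  Disallow: /*?Page=
  ```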
- Do I need a separate robots.txt file for my shop subdomain?
  Hello Mozzers! Apologies if this question has been asked before, but I couldn't find an answer, so here goes... Currently I have one robots.txt file hosted at https://www.mysitename.org.uk/robots.txt We host our shop on a separate subdomain, https://shop.mysitename.org.uk Do I need a separate robots.txt file for my subdomain? (Some Google searches are telling me yes and some no, and I've become awfully confused!)
  Technical SEO | sjbridle0
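  Worth noting as background: robots.txt is resolved per host, so each subdomain is governed only by the file served at its own root; a minimal sketch using the hostnames from the question:

  ```
  # Each host serves its own file:
  #   https://www.mysitename.org.uk/robots.txt   -> rules for the www host only
  #   https://shop.mysitename.org.uk/robots.txt  -> a separate file governs the shop subdomain
  User-agent: *
  Disallow:
  ```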
- Remove sitemap, effect ranking?
  We are considering removing our sitemap because it doesn't display the right structure. Will it affect current rankings if we remove the sitemap and continue without one? Thanks
  Technical SEO | rijwielcashencarry0400
- Is there a maximum sitemap size?
  Hi all, Over the last month we've included all images, videos, etc. in our sitemap and now its loading time is rather high. (http://www.troteclaser.com/sitemap.xml) Is there any maximum sitemap size that is recommended by Google?
  Technical SEO | Troteclaser0
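  For background, the sitemaps.org protocol that Google follows currently caps each sitemap file at 50,000 URLs and 50 MB uncompressed; larger sets are normally split into several files referenced from a sitemap index. A hedged sketch with placeholder file names:

  ```
  <?xml version="1.0" encoding="UTF-8"?>
  <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <sitemap><loc>http://www.example.com/sitemap-pages.xml</loc></sitemap>
    <sitemap><loc>http://www.example.com/sitemap-media.xml</loc></sitemap>
  </sitemapindex>
  ```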
- Is it important to include image files in your sitemap?
  I run an ecommerce business that has over 4,000 product pages which, as you can imagine, branch off into thousands of image files. Is it necessary to include those in my sitemap for faster indexing? Thanks for your help! -Reed
  Technical SEO | IceIcebaby0
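  For reference, Google's image sitemap extension lets image URLs be listed alongside the page that displays them rather than as separate entries; a minimal sketch with placeholder URLs:

  ```
  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
          xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
    <url>
      <loc>http://www.example.com/product/widget</loc>
      <image:image>
        <image:loc>http://www.example.com/images/widget.jpg</image:loc>
      </image:image>
    </url>
  </urlset>
  ```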
- Empty Meta Robots Directive - Harmful?
  Hi, We had a coding update, and a side-effect was that our meta robots directive was emptied; in other words, it is now an empty tag across the whole site. I've since noticed that Google's cache date on all of the pages - at least, the ones I tested - is no later than 17 December '12, the Monday after the directive was removed en masse. So, A: does anyone have solid evidence of an empty directive causing problems? Past experience, a Matt Cutts or Fishkin quote, etc. And B: it seems fairly well correlated, but does my entire site's homogeneous cache date point to this tag removal? Or is it fairly normal to have a uniform cache date across a large site (we're a large ecommerce site)? Our site: http://www.zando.co.za/ I'm having the directive reinstated as soon as Dev permits. And then, for extra credit, is there a way with Google's API, or perhaps some other tool, to run an arbitrary list of URLs and retrieve their cache dates? I'd want to do this for diagnostic purposes and preferably in a way that is OK with Google. I'd avoid cURLing for the cached URL and scraping out the dates with Bash, or any such thing. Cheers,
  Technical SEO | RocketZando0
- HTML Sitemap Pagination?
  I'm creating an A-to-Z directory of internal pages within a site of mine; however, there are cases where there are over 500 links within the pages. I intend to use pagination (rel=next/prev) to avoid too many links on the page, but am worried about indexation issues. Should I be worried?
  Technical SEO | DMGoo0
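  As a reference point, the rel=next/prev hints mentioned here are plain link elements in the head of each paginated page; a sketch with hypothetical URLs:

  ```
  <!-- On page 2 of a hypothetical /directory/ listing -->
  <link rel="prev" href="http://www.example.com/directory/page/1/">
  <link rel="next" href="http://www.example.com/directory/page/3/">
  ```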