Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
2 sitemaps on my robots.txt?
-
Hi,
I thought I could only link one sitemap from my site's robots.txt, but... I may be wrong.
So I need to confirm whether this kind of implementation is right or wrong:
robots.txt for Magento Community and Enterprise
...
Sitemap: http://www.mysite.es/media/sitemap/es.xml
Sitemap: http://www.mysite.pt/media/sitemap/pt.xml
Thanks in advance,
-
We recently changed our protocol to https
Our robots.txt already includes our new https sitemap link.
Our agency is recommending that we add another sitemap reference to our robots.txt file, pointing to our insecure (http) sitemap, while Google is reindexing our secure protocol. They recommend this as a way for all search engines to pick up on the 301 redirects and swap out unsecured results in the index more efficiently.
Do you agree with this?
I am in the camp that we should have only our https sitemap and Google will figure it out; having 2 sitemaps in our robots.txt, one for our old http and one for our new https, is redundant and may be viewed as duplicate content, rather than as a positive that helps search engines see the 301s and reindex the secure links.
What's your thought? Let me know if I need to explain more.
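For clarity, the setup the agency is proposing would look roughly like this; a minimal sketch with a hypothetical domain and sitemap filenames:
# robots.txt on https://www.example.com/ (hypothetical migration setup)
# New, canonical https URLs:
Sitemap: https://www.example.com/sitemap.xml
# Old http URLs, kept temporarily so crawlers revisit them and discover the 301s:
Sitemap: https://www.example.com/sitemap-http.xml
The idea is that a sitemap of the old URLs nudges crawlers to re-request them and see the redirects; the http sitemap would be removed once the index has switched over.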
-
Well, if both sitemaps are for the same site then it's OK. But it's much better to implement hreflang, as explained here: https://support.google.com/webmasters/answer/2620865?hl=en
I'm not sure whether Magento can do this, but you can always hire a 3rd party dev to build a plugin/module for it.
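For illustration, hreflang annotations inside a sitemap look roughly like this; a minimal sketch reusing the asker's mysite.es / mysite.pt domains with a hypothetical page path:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Each URL entry lists itself plus every language alternate -->
  <url>
    <loc>http://www.mysite.es/pagina/</loc>
    <xhtml:link rel="alternate" hreflang="es" href="http://www.mysite.es/pagina/"/>
    <xhtml:link rel="alternate" hreflang="pt" href="http://www.mysite.pt/pagina/"/>
  </url>
  <url>
    <loc>http://www.mysite.pt/pagina/</loc>
    <xhtml:link rel="alternate" hreflang="es" href="http://www.mysite.es/pagina/"/>
    <xhtml:link rel="alternate" hreflang="pt" href="http://www.mysite.pt/pagina/"/>
  </url>
</urlset>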
-
OK, just one detail: these domains are for a multilanguage site.
I mean, both have roughly the same content: one in Spanish and the other in Portuguese.
Thanks a lot.
-
You can also have multiple sitemaps, even hosted on 3rd party sites. Look at Moz's robots.txt:
Sitemap: https://a-moz.groupbuyseo.org/blog-sitemap.xml
Sitemap: https://a-moz.groupbuyseo.org/ugc-sitemap.xml
Sitemap: https://a-moz.groupbuyseo.org/profiles-sitemap.xml
Sitemap: http://d2eeipcrcdle6.cloudfront.net/past-videos.xml
Sitemap: http://app.wistia.com/sitemaps/36357.xml
Also Google.com's robots.txt:
Sitemap: http://www.gstatic.com/culturalinstitute/sitemaps/www_google_com_culturalinstitute/sitemap-index.xml
Sitemap: http://www.gstatic.com/dictionary/static/sitemaps/sitemap_index.xml
Sitemap: http://www.gstatic.com/earth/gallery/sitemaps/sitemap.xml
Sitemap: http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml
Sitemap: http://www.gstatic.com/trends/websites/sitemaps/sitemapindex.xml
Sitemap: https://www.google.com/sitemap.xml
Also Bing.com's robots.txt:
Sitemap: http://cn.bing.com/dict/sitemap-index.xml
Sitemap: http://www.bing.com/offers/sitemap.xml
So using multiple sitemaps is OK, and they can also be hosted on a 3rd party server.
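As a side note, when the list grows long, the sitemaps.org protocol also lets you roll everything into a single sitemap index file so robots.txt needs only one line; a minimal sketch with hypothetical URLs:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points at one child sitemap -->
  <sitemap>
    <loc>https://www.example.com/blog-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/profiles-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
robots.txt would then carry just: Sitemap: https://www.example.com/sitemap-index.xml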
-
Hello,
Yes, multiple sitemaps are okay, and sometimes even advised!
You can read Google's official response here: "...it's fine for multiple Sitemaps to live in the same directory (as many as you want!)..."
And you can see a case study showing how multiple sitemaps have helped traffic here on Moz.
Hope this helps,
Don
Related Questions
-
Robots.txt: How to block a specific file type in several subdirectories?
Hello everyone! I need help setting up a robots.txt. I'm trying to block all PDF files in particular directories, so I'm using this kind of command. In the example below, the line blocks all .gif files across the entire site:
Block files of a specific file type (for example, .gif): Disallow: /*.gif$
2 questions:
Can I use this command to target one particular directory in which I want to block PDF files, and will this line be recognized by Googlebot? Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$
Then I realized that I would have to write as many lines as there are directories in which I want to block PDF files. Let's say I want to block PDF files in all 3 of these directories: /fileadmin/directory1, /fileadmin/directory1/sub1, /fileadmin/directory1/sub1/pdf. Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories, instead of writing the above line 3 times, once per subdirectory? For example: Disallow: /fileadmin/directory1*/
Many thanks in advance for any insight you may have.
Technical SEO | LabeliumUSA
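A minimal sketch of the pattern-matching approach, reusing the paths from the question: in the robots.txt syntax Googlebot supports, * matches any sequence of characters, including slashes, so one rule can cover a directory and all of its subdirectories:
User-agent: *
# Matches /fileadmin/directory1/a.pdf, /fileadmin/directory1/sub1/b.pdf, etc.
Disallow: /fileadmin/directory1/*.pdf$
Note that $ anchors the match at the end of the URL, so a PDF URL with a query string appended would not match this rule.
-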
Multiple robots.txt files on server
Hi! I previously hired a developer to put up my site and noticed afterwards that he did not know much about SEO. This led me to start learning myself and applying some changes step by step. One of the things I am currently doing is inserting a sitemap reference in the robots.txt file (which was not there before). But just now, when I wanted to upload the file via FTP to my server, I found multiple ones, in different sizes, and I don't know what to do with them. Can I remove them? I have downloaded and opened them and they seem to be 2 text files and 2 duplicates. Names:
robots.txt (original duplicate)
robots.txt-Original (original)
robots.txt-NEW (other content)
robots.txt-Working (other content duplicate)
Would really appreciate help and expertise suggestions. Thanks!
Technical SEO | mjukhud
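For what it's worth, crawlers only ever request the file named exactly robots.txt at the web root, so the renamed copies are ignored and act purely as backups. A minimal sketch of the single live file with the sitemap reference added (hypothetical domain):
# An empty Disallow means nothing is blocked
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
-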
Robots.txt on pages with a 301 redirect
We currently have a series of help pages that we would like to disallow in our robots.txt. The thing is that these help pages are located on our old website, which now has a 301 redirect to the current site. Which is the proper way to go about it?
1. Add the pages we want to disallow to the robots.txt of the new website?
2. Break the redirect momentarily and add the pages to the robots.txt of the old one?
Thanks
Technical SEO | Kilgray
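One common workaround, sketched here under the assumption that the /robots.txt URL itself can be exempted from the blanket 301, is to keep serving a dedicated file on the old host:
# Hypothetical robots.txt served directly on the OLD domain
User-agent: *
Disallow: /help/
# Caveat: disallowed URLs won't be recrawled, so crawlers will no longer see their 301s.
-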
Robots.txt on subdomains
Hi guys! I keep reading conflicting information on this and it's left me a little unsure. Am I right in thinking that a website with a subdomain of shop.sitetitle.com will share the same robots.txt file as the root domain?
Technical SEO | Whittie
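In short, robots.txt is fetched per host, so shop.sitetitle.com gets its own file rather than sharing the root domain's. A minimal sketch (sitetitle.com is the asker's placeholder; the paths are hypothetical):
# https://www.sitetitle.com/robots.txt -- applies only to the www host
User-agent: *
Disallow: /checkout/

# https://shop.sitetitle.com/robots.txt -- fetched separately; a 404 here leaves the subdomain fully crawlable
User-agent: *
Disallow: /cart/
-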
Which sitemap to keep - HTTP or HTTPS (or both)?
Hi, I just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). Fixed all links, CDN links are now secure, etc., and 301-redirected all pages from http to https. Changed the property in Google Analytics from http to https and added the https version in Webmaster Tools. So far, so good. Now the question is: should I add the https version of the sitemap for the new https site in Webmasters, or retain the existing http one? Ideally, switching over completely to the https version by adding a new sitemap would make more sense, as the http version of the sitemap would anyway now be redirected to https. But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department? If I should add the https sitemap version for the new site, should I delete the old http one, or is there no harm in retaining it?
Technical SEO | ashishb01
-
Googlebot does not obey robots.txt disallow
Hi Mozzers! We are trying to get Googlebot to steer away from our internal search results pages by adding a parameter "nocrawl=1" to facet/filter links and then disallowing all URLs containing that parameter in robots.txt. We implemented this in late August and, after that, the GWMT message "Googlebot found an extremely high number of URLs on your site" stopped coming. But today we received yet another one. The weird thing is that Google gives many of our now robots.txt-disallowed URLs as examples of URLs that may cause us problems. What could be the reason? Best regards, Martin
Technical SEO | TalkInThePark
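For reference, the disallow rule in question would look something like this minimal sketch (parameter name taken from the post):
User-agent: *
# Matches any URL containing nocrawl=1 after the first slash
Disallow: /*nocrawl=1
One hedged explanation for the message: robots.txt blocks crawling, not discovery, so Googlebot can still find and report large numbers of disallowed URLs that it encounters in links.
-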
Ror.xml vs sitemap.xml
Hey Mozzers, I've been reading some things lately, and some are saying that the top search engines do not use the ror.xml sitemap but focus just on sitemap.xml. Is that true? Do you use ROR? If so, for what purpose: products, "special articles", other uses? Can a sitemap be sufficient for all of those? Thank you, Vadim
Technical SEO | vijayvasu
-
Subdomain Removal in Robots.txt with Conditional Logic??
I would like to see if there is a way to add conditional logic to the robots.txt file, so that when we push from DEV to PRODUCTION and the robots.txt file is pushed, we don't have to remember to NOT push the robots.txt file OR edit it when it goes live. My specific situation is this: I have www.website.com, dev.website.com and new.website.com, and somehow Google has indexed DEV.website.com and NEW.website.com; I'd like these removed from Google's index, as they are causing duplicate content. Should I:
a) Add 2 new GWT entries for DEV.website.com and NEW.website.com and VERIFY ownership? If I do this, then when the files are pushed to LIVE, won't the files contain the VERIFY META CODE for the DEV version even though it's now LIVE? (hope that makes sense)
b) Write a robots.txt file that specifies "DISALLOW: DEV.website.com/"? Is that possible? I have only seen examples of DISALLOW with a "/" at the beginning...
Hope this makes sense; I could really use the help! I'm on a Windows Server 2008 box running ColdFusion websites.
Technical SEO | ErnieB
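robots.txt itself has no conditional logic; it is simply fetched per host, so the usual workaround is to serve a different file on the dev hosts, for example via a server rewrite rule keyed on the Host header. A minimal sketch of the file the dev hosts would serve (hostnames from the post):
# robots.txt served only on dev.website.com and new.website.com
# Note: Disallow rules are path-only; a hostname like DEV.website.com can't appear in a Disallow line
User-agent: *
Disallow: /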