How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi, and so on. All of these domains point to the same physical location on our web server, and we swap out the text returned to the client depending on which domain he or she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If, for instance, I put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help, René!
-
Yup.

-
Yes, I meant GWT, of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change its configuration.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the domains involved, cross-site submission in robots.txt is okay? I guess Google will consider it okay anyway.
-
Actually, Google has the answer, right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even if something else might work just as well, just to be on the safe side.
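To illustrate what that page describes (just a sketch, and it assumes you've verified ownership of both domains in GWT): since the same physical robots.txt is served on every domain, it also gets fetched as http://www.mysite.se/robots.txt, and a line like this in it is exactly the sanctioned cross-submission case:
# fetched as http://www.mysite.se/robots.txt (same physical file on every domain)
# the sitemap lives on www.mysite.net but lists www.mysite.se URLs
Sitemap: http://www.mysite.net/sitemapSe.xml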

-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.

Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
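For completeness, a full sitemap file just wraps entries like that in a urlset element, per the sitemaps.org schema (yoursite.com below is a placeholder):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one url entry per page on the site -->
  <url><loc>http://yoursite.com/somepage.html</loc></url>
</urlset>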
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with .htaccess routing each domain to the right folder. In each folder you'd have a robots.txt and a sitemap for that specific site. That way all your problems would be gone in a jiffy; it would be just like managing 3 different sites, even though it isn't.
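Something like this in the root .htaccess could do the domain-to-folder mapping (a rough sketch only, assuming Apache with mod_rewrite enabled; the /net/ and /se/ folder names are made up for the example):
# route each domain into its own folder; the extra conditions prevent rewrite loops
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.mysite\.net$ [NC]
RewriteCond %{REQUEST_URI} !^/net/
RewriteRule ^(.*)$ /net/$1 [L]
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]
Each domain then gets served the robots.txt and sitemap sitting in its own folder.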
I'm no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that made sense.

-
Thanks for your response, René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we're making some cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit a sitemap for each domain.
If you want to make sure your sitemaps aren't crawled by a bot for the wrong language, I would set up the .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser, so it has to obey the server; if the server tells it to go somewhere, it will.
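Something along these lines could hand each domain its own robots file (again only a sketch, assuming Apache mod_rewrite; robotsNet.txt and robotsSe.txt are invented names for the example):
# serve a per-domain robots file depending on the requested host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.mysite\.net$ [NC]
RewriteRule ^robots\.txt$ /robotsNet.txt [L]
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robotsSe.txt [L]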
robots.txt can't, by itself, do what you want; the server can. But in my opinion, using Bing and Google webmaster tools should do the trick.