Can you have a /sitemap.xml and /sitemap.html on the same site?
-
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community!
My question: Since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain?
For example, we've already put the html sitemap in place here: https://www.pioneermilitaryloans.com/sitemap
Now, we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but am wondering if this will cause conflicts.
I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this.
What do you think?
-
As all 3 of us have said here, Pioneer, there is no issue with setting things up the way you are proposing. Can't make it any clearer than that.
To answer your specific point - /sitemap and /sitemap.xml are categorically NOT seen as the same URL by search engines. They are absolutely considered two different pages. Your statement "...two items with the same url, but different file extensions..." is a contradiction in terms. If the URLs have different file extensions, they are by definition NOT the same URL. The file extension (or lack thereof) is an integral part of the URL.
Since 3 different people have given you the same answer and you still don't believe us, why not simply test for yourself?
- Implement the two files as above, then use Google Webmaster Tools to report your XML sitemap location, and confirm that it's finding and recognizing it correctly.
- Then use your browser to go to the URL of the regular sitemap and you'll see that it renders the html version of your sitemap just fine.
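If you want a quick check from the command line as well, here's a sketch using curl (the domain is a placeholder for wherever your two files end up):

curl -I https://www.example.com/sitemap
curl -I https://www.example.com/sitemap.xml

Each request gets its own response - typically text/html for the first and application/xml or text/xml for the second - confirming the two are entirely separate resources.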
Paul
-
So if I'm understanding you correctly, there are no technical issues with having two items at the same path, but with different file extensions, coexisting? I was unable to find any examples of other sites doing this, which made me question it.
I mean, what we're proposing is two separate pieces of content that resolve as:
https://www.pioneermilitaryloans.com/sitemap
https://www.pioneermilitaryloans.com/sitemap.xml
I want that to work, but it's just amazing to me that it doesn't cause any issues.
-
Just like Oleg & Paul, I agree 100%. Your site may have - and will probably benefit from having - both a sitemap in HTML format, which is a nice feature for visitors, and one in XML format, as they are not used for the same purpose by Google or by individuals. So you may safely create a regular webpage in HTML and call it whatever you like. If it ends in .xml, it is not a forward-facing webpage; it has a separate use, which is to tell Google's crawler where you would like it to go. Now, keep in mind Google does not always listen to what we want, but sitemaps can be helpful.
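For reference, a minimal XML sitemap in the sitemaps.org format looks something like this (example.com and the paths are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>

Each <url> entry is one page you'd like the crawler to discover; optional tags such as <lastmod> can be added per entry.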
I hope this was of help to you
sincerely,
Thomas
-
As Oleg says - not a problem at all. What you're proposing to do is a pretty standard implementation used by most websites out there.
XML sitemaps are a very specific configuration of data built to a standard that the Search Engines all agreed on - even the naming convention. Spiders are programmed to look for the whole filename (specifically including the .xml suffix), not just the first part of the file name. And yes, connecting to them inside your Webmaster Tools accounts is an extra signal for where the search engines should find them.
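That same agreed standard also lets you point crawlers at the sitemap from robots.txt, which is useful if the file ever lives somewhere non-standard. A minimal sketch, assuming the file sits at the root of example.com:

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml

The Sitemap directive takes a full absolute URL, so crawlers can find the file no matter what it's called.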
Paul
-
Nope, won't cause any problems. The XML sitemap is what you will submit to Google and the other search engines, while the HTML one is for your site visitors who want to see all your pages (although it will be crawled and indexed as well).
Related Questions
-
.xml sitemap showing in SERP
Our sitemap is showing in Google's SERP. While it's only for very specific queries that don't seem to have much value (it's a healthcare website, and when a doctor who isn't with us is searched with the brand name, so 'John Smith Brand,' it shows if there's a first or last name that matches the query), is there a way to keep the sitemap from being indexed so it's not showing in the SERP? I've seen the "x-robots-tag: noindex" as a possible option, but before taking any action wanted to see if this was still true and if it would work.
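For context, X-Robots-Tag is an HTTP response header set at the server level rather than inside the page. A minimal Apache sketch, assuming mod_headers is enabled:

<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>

This asks search engines not to show the file in results while still letting them fetch it for URL discovery.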
Technical SEO | Kyleroe950
-
Can anyone tell me why some of the top referrers to my site are porn site?
We noticed today that 4 of the top referring sites are actually porn sites. Does anyone know what that is all about? Thanks!
Technical SEO | thinkcreativegroup1
-
Disallow: /404/ - Best Practice?
Hello Moz Community, My developer has added this to my robots.txt file: Disallow: /404/ Is this considered good practice in the world of SEO? Would you do it with your clients? I feel he has great development knowledge but isn't too well versed in SEO. Thank you in advance, Nico.
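For reference, that rule sits in robots.txt like this (a sketch, assuming the error pages all live under a /404/ path):

User-agent: *
Disallow: /404/

Note it only blocks crawling of URLs under /404/; it has no effect on pages that simply return a 404 status code.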
Technical SEO | niconico1011
-
How to create a sitemap for a large site (ecommerce type) that has 1,000s if not 100,000s of pages
I know this is kind of a newbie question but I am having an amazing amount of trouble creating a sitemap for our site Bestride.com. We just did a complete redesign (look and feel, functionality, the works) and now I am trying to create a sitemap. Most of the generators I have used "break" after reaching some number of pages. I am at a loss as to how to create the sitemap. Any help would be greatly appreciated! Thanks
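For scale like this, the usual approach is to split the URL list into files of at most 50,000 entries (the per-file limit in the sitemaps.org protocol) and tie them together with a sitemap index. A minimal Python sketch - the domain, filenames, and URL source are all placeholders:

from xml.sax.saxutils import escape

MAX_URLS = 50000  # per-file limit in the sitemaps.org protocol

def write_sitemaps(urls, domain="https://www.example.com"):
    # Split the full URL list into protocol-sized chunks
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write('</urlset>\n')
    # The index file lists every chunk; this is the one you submit
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{domain}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write('</sitemapindex>\n')

write_sitemaps([f"https://www.example.com/page-{i}" for i in range(120000)])

You'd then submit just the index file to the search engines; they discover the chunk files from it.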
Technical SEO | BestRide0
-
Can dynamically translated pages hurt a site?
Hi all... looking for some insight, please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1000 pages and climbing, and about 50 of those pages are translated pages - static pages with unique URLs. I have had no problems here with duplicate content and that sort of thing, and all pages were manually translated, so no translation issues. We have been looking at software that can dynamically translate the complete site into a handful of languages - let's say about 5. My problem is that these pages get produced dynamically, and I have concerns that Google will take issue with this, as well as with the huge sudden influx of new URLs - we could now be looking at an increase of 5000 new URLs (which usually triggers an alarm). My feeling is that it could risk the stability of the site we have worked so hard for, and maybe we should just stick with the already translated static pages. I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, and also the risk of a review trigger period. These days it is hard to know what could get you in "trouble," and my gut says keep it simple as-is and don't shake it up. Am I being overly concerned? Would love to hear from others who have tried similar changes, and also those who have not due to similar "fear." Thanks
Technical SEO | nomad-2023230
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl css and javascript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the javascript on our site, we iterate the version number ?v=1.1... 1.2... 1.3... etc., and the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear - we are NOT doing any sneaky redirects or other dodgy javascript hacks. We're just trying to power our content and UX elegantly with javascript. What do you guys say: Obey Matt? Or run the javascript gauntlet?
Technical SEO | AndreVanKets
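For reference, the rule being debated is a one-liner in robots.txt (a sketch; the /js/ path comes from the question above):

User-agent: *
Disallow: /js/

Keep in mind this blocks crawling of everything under /js/, including files a search engine might want in order to render your pages.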
-
Sitemap Page - HTML and XML
Hi there, I have a domain which has a sitemap in HTML for regular users and a sitemap in XML for the spiders. I have a warning via SEOmoz saying that I have too many links on the HTML version. What do I do here? Regards, Stef
Technical SEO | stefanok0
-
/index.php in sitemap? take it out?
Hi Everyone, The following was automatically generated at xml-sitemaps.com. Should I get rid of the index.php url from my sitemap? If so, how do I go about redirecting it in my htaccess?
<url><loc>http://www.mydomain.ca/</loc></url>
<url><loc>http://www.mydomain.ca/index.php</loc></url>
Thank you in advance, Martin
Technical SEO | RogersSEO
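For what it's worth, one common .htaccess approach to the redirect asked about above is a 301 from /index.php to the root. A sketch assuming Apache with mod_rewrite enabled (test on a staging copy first):

RewriteEngine On
# Match only direct external requests for /index.php, so the server's
# internal DirectoryIndex lookup doesn't trigger a redirect loop
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/index\.php [NC]
RewriteRule ^index\.php$ / [R=301,L]

With the redirect in place, the index.php entry would also be removed from the sitemap so the two stay consistent.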