Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Can you have a /sitemap.xml and /sitemap.html on the same site?
-
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community!
My question: Since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain?
For example, we've already put the html sitemap in place here: https://www.pioneermilitaryloans.com/sitemap
Now, we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but am wondering if this will cause conflicts.
I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this.
What do you think?
-
As all 3 of us have said here, Pioneer, there is no issue with setting things up the way you are proposing. Can't make it any clearer than that.
To answer your specific point - /sitemap and /sitemap.xml are categorically NOT seen as the same URL by search engines. They are absolutely considered two different pages. Your statement "...two items with the same url, but different file extensions..." is a contradiction in terms. If the URLs have different file extensions, they are by definition NOT the same URL. The file extension (or lack thereof) is an integral part of the URL.
Since 3 different people have given you the same answer and you still don't believe us, why not simply test for yourself?
- Implement the two files as above, then use Google Webmaster Tools to submit your XML sitemap location, and confirm that it's being found and recognized correctly.
- Then use your browser to go to the URL of the regular sitemap and you'll see that it renders the HTML version of your sitemap just fine. (A quick scripted version of this check is sketched below.)
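If you'd rather script that second check, here's a minimal sketch in Python using only the standard library. It assumes the two sitemaps live at www.example.com (a placeholder domain - substitute your own) and simply confirms that the two paths resolve as two distinct resources:

    # Fetch both sitemap URLs and confirm each resolves as its own resource.
    # www.example.com is a placeholder - substitute your real domain.
    from urllib.request import Request, urlopen

    for path in ("/sitemap", "/sitemap.xml"):
        req = Request("https://www.example.com" + path, method="HEAD")
        with urlopen(req) as resp:
            # Expect a 200 status for both, but with different Content-Types:
            # text/html for the page, application/xml (or text/xml) for the feed.
            print(path, resp.status, resp.headers.get("Content-Type"))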
Paul
-
So if I'm understanding you correctly, there are no technical issues with having two items with the same url, but different file extensions, coexisting? I was unable to find any examples of other sites doing this, which is making me question it.
I mean, what we're proposing is two separate pieces of content that resolve as:
https://www.pioneermilitaryloans.com/sitemap (the HTML version)
https://www.pioneermilitaryloans.com/sitemap.xml (the XML version)
I want that to work, but it's just amazing to me that it doesn't cause any issues.
-
Just like Oleg and Paul, I agree 100%. Your site may have - and will probably benefit from having - both a sitemap in HTML format, which is a nice feature for visitors, and one in XML format, as they are not used for the same purpose by Google or by individuals. So you may safely create a regular webpage in HTML and call it whatever you like. If a file ends in .xml, it is not a forward-facing webpage; it has a separate use, which is to tell Google's crawler where you would like it to go. Keep in mind that Google does not always listen to what we want, but sitemaps can be helpful.
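To make that distinction concrete: the HTML sitemap is just an ordinary page of links for human visitors. A bare-bones sketch (the page titles and paths below are placeholders, not taken from this thread):

    <!-- Minimal HTML sitemap: a plain page of links for visitors. -->
    <!-- All paths below are placeholder examples. -->
    <!DOCTYPE html>
    <html>
      <head><title>Sitemap</title></head>
      <body>
        <h1>Sitemap</h1>
        <ul>
          <li><a href="/">Home</a></li>
          <li><a href="/about">About Us</a></li>
          <li><a href="/contact">Contact</a></li>
        </ul>
      </body>
    </html>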
I hope this was of help to you.
Sincerely,
Thomas
-
As Oleg says - not a problem at all. What you're proposing to do is a pretty standard implementation used by most websites out there.
XML sitemaps are a very specific configuration of data built to a standard that the search engines all agreed on - even the naming convention. Spiders are programmed to look for the whole filename (specifically including the .xml suffix), not just the first part of the file name. And yes, registering them inside your Webmaster Tools accounts is an extra signal for where the search engines should find them.
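For reference, that agreed standard is the sitemaps.org protocol. A minimal XML sitemap looks like the sketch below - the URL and date are placeholders, and only the <loc> element is actually required for each entry:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page; only <loc> is required. -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2013-01-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>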
Paul
-
Nope, won't cause any problems. The XML sitemap is what you will submit to Google and the other search engines, while the HTML one is for your site visitors who want to see all your pages (although it will be crawled and indexed as well).
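One optional extra beyond submitting in Webmaster Tools: the sitemaps.org protocol also lets you advertise the XML sitemap's location in robots.txt, so crawlers can discover it on their own. A minimal sketch, with a placeholder domain:

    # robots.txt - the Sitemap directive must use the full, absolute URL.
    # www.example.com is a placeholder domain.
    User-agent: *
    Disallow:

    Sitemap: https://www.example.com/sitemap.xml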