Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Should XML sitemaps include *all* pages or just the deeper ones?
- Hi guys, OK, this is a bit of a sitemap 101 question but I can't find a definitive answer: when we're putting out XML sitemaps for Google to chew on (we're talking ecommerce and directory sites with many pages inside sub-categories here), is there any point in mentioning the homepage or even the second-level pages? We know Google is crawling and indexing those, and we're thinking we should trim the fat and just send a map of the bottom-level pages. What do you think?
- It is correct that DA, PA, page depth, etc. are all factors in determining which pages get indexed. If your site offers good navigation, reasonable backlinks, anchor text, etc., then you can get close to all pages indexed even on a very large site. Your sitemap should naturally include a date on every link indicating when content was added or changed. Even if you submit a list of 10k links, Google can evaluate the dates on each link and determine which content has been added or modified since your site was last crawled.
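To make the dates point above concrete, here is a minimal Python sketch (the URLs, dates and output file name are hypothetical) that writes a sitemap in which every deep-level URL carries a `<lastmod>` value, the signal Google can use to spot content added or changed since its last crawl:

```python
# Minimal sketch: build a sitemap whose entries each carry a <lastmod> date.
# The URLs, dates and output file name are made up for illustration.
import xml.etree.ElementTree as ET

# (url, date-last-modified) pairs for the deep-level pages you want crawled
pages = [
    ("https://www.example.com/category/widgets/blue-widget", "2024-11-02"),
    ("https://www.example.com/category/widgets/red-widget", "2024-12-01"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = loc
    ET.SubElement(url_el, "lastmod").text = lastmod  # lets crawlers spot changed content

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```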
- Well yes, that's kinda my point. We do have sensible, crawlable navigation, so there will be no problems there; the sitemap then really becomes an indicator of what needs to be crawled (new and updated pages). But the same question stands: with other sites we've managed, with thousands of pages, we've found it detrimental to give Google hundreds of pages to crawl in a sitemap that we don't feel are important. We're pretty sure (and SEOmoz staff have supported this) that domain authority and the number of pages you can get into the index are closely related.
- Tim, we always include ALL pages... the help tip on Google's XML sitemaps also suggests including all pages of your site in the XML sitemap.
- Your sitemap should include every page of your site that you wish to be indexed. The idea is that if your site does not provide crawlable navigation, Google can use your sitemap to crawl your site. Some sites use Flash, and when a crawler lands on a page there is absolutely nowhere for the crawler to go. If your site navigation is solid, then a sitemap doesn't offer Google any value other than an indication of when content is updated or added.
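One practical note for the "include every page you wish to be indexed" approach on large ecommerce or directory sites: the sitemaps protocol caps each sitemap file at 50,000 URLs, so a full listing usually means several sitemap files tied together by a sitemap index. A rough Python sketch of that chunking, with hypothetical file names and base URL:

```python
# Rough sketch: split a large URL list into <=50,000-URL sitemap files
# plus a sitemap index referencing them. Names and URLs are illustrative only.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS_PER_FILE = 50_000  # limit defined by the sitemaps protocol

def write_sitemaps(urls, base="https://www.example.com/sitemaps/"):
    index = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for i in range(0, len(urls), MAX_URLS_PER_FILE):
        chunk = urls[i:i + MAX_URLS_PER_FILE]
        filename = f"sitemap-{i // MAX_URLS_PER_FILE + 1}.xml"

        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for loc in chunk:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = loc
        ET.ElementTree(urlset).write(filename, encoding="utf-8", xml_declaration=True)

        # register this chunk in the index file
        sm = ET.SubElement(index, "sitemap")
        ET.SubElement(sm, "loc").text = base + filename

    ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)
```

Only the index file then needs to be submitted; search engines discover the individual sitemap files from it.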
Related Questions
- Should search pages be indexed?
  Hey guys, I've always believed that search pages should be no-indexed, but now I'm wondering if there is an argument to index them? Appreciate any thoughts! (Technical SEO | RebekahVP)
- When creating parent and child pages, should keywords be repeated in the URL and page title?
  We are in the direct mail advertising business: PrintLabelAndMail.com. Example: Parent: Postcard Direct Mail. Children: Postcard Mailings, Postcard Design, Postcard Samples, Postcard Pricing, Postcard Advantages. Should "postcard" be repeated in the URL and page title? And in this example, should each of the 5 children link back directly to the parent, or would it be better to "daisy chain" them, using each as parent for the next? (Technical SEO | JimDirectMailCoach)
- Should all pagination pages be included in sitemaps?
  How important is it for a sitemap to include all the individual URLs for paginated content? Assuming the rel="next" and rel="prev" tags are set up, would it be OK to just have page 1 in the sitemap? (Technical SEO | Saijo.George)
- Can you noindex a page, but still index an image on that page?
  If a blog is centered around visual images, and we have specific pages with high-quality content that we plan to index and drive our traffic, but we have many pages with our images... what is the best way to go about getting these images indexed? We want to noindex all the pages with just images because they are thin content. Can you noindex,follow a page, but still index the images on that page? Please explain how to go about this concept. (Technical SEO | WebServiceConsulting.com)
- Host sitemaps on S3?
  Hey guys, I run a dynamic web service and I will start building static sitemaps for it pretty soon. The fact that my app lives on a multitude of servers doesn't make it easy to distribute frequently updated static files across those servers. My idea was to host the files in AWS S3 and point my robots.txt sitemap directive there. I'll use a sitemap index, so every other sitemap will be hosted on S3 as well. I could dynamically mirror the content of the S3 files through my app, but that would be a little more resource-intensive than just serving the static files from a common place. Any ideas? Thanks! (Technical SEO | tanlup)
- Splitting Page Authority with two URLs for the same page.
  Hello guys, my website currently has two different URLs for the same page, and I am under the impression such a setup is dividing my Page Authority and link juice. We currently have the following page with both URLs: www.wbresearch.com/soldiertechnologyusa/home.aspx and www.wbresearch.com/soldiertechnologyusa/. Analysing the page authority and backlinks, I identified that we are splitting the backlinks (links from sites, social media and therefore authority): "/home.aspx" has PA 67, 52 linking root domains and 272 total links; "/" has PA 64, 29 linking root domains and 128 total links. I am under the impression that if the URLs were the same we would maximise our backlinks and therefore page authority. My question: how can I fix this? Should I have a 301 redirect from "/" to "/home.aspx", thereby passing the authority and link juice of "/" directly to "/home.aspx"? Trying to gather thoughts and ideas on this; suggestions are much appreciated. Thanks! (Technical SEO | JoaoPdaCosta-WBR)
- Do I need an XML sitemap?
  I have an established website that ranks well in Google. However, I have just noticed that no XML sitemap has been registered in Google Webmaster Tools, so the likelihood is that it hasn't been registered with the other search engines either. However, there is an HTML sitemap listed on the website. Seeing as the website is already ranking well, do I still need to generate and submit an XML sitemap? Could there be any detriment to current rankings in doing so? (Technical SEO | pugh)
- Sitemap for dynamic website with over 10,000 pages
  If I have a website with thousands of products, is it a good idea to create a sitemap for this website for the search engines, where you show maybe 250 products per page, so it makes it easy for the search engine to find each part and also puts that part closer to the home page? It seems like Google likes pages that are closest to the home page (the fewer clicks the better). (Technical SEO | roundbrix)