Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How many links can you have on sitemap.html
- We have a lot of pages that we want to create crawlable paths to. How many links can be crawled on one page of a sitemap.html?
- From Google's perspective, sitemaps are limited to 50MB (uncompressed) and 50,000 URLs, and that limit applies to every sitemap format. If you have a larger file or more URLs, you will have to break it into multiple sitemaps. You can optionally create a sitemap index file (a file that points to a list of sitemaps) and submit that single index file to Google. You can also submit multiple sitemaps and/or sitemap index files to Google. For everyone's reference, there is also a good list of 20 sitemap limits that you may not know about.
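To make the splitting concrete, here is a minimal Python sketch of that approach - chunk the URL list into files of at most 50,000 entries and point a sitemap index at them. The file names and example domain are placeholders, not anything from this thread.

```python
# Minimal sketch: split a large URL list into multiple XML sitemaps plus a
# sitemap index, staying under the 50,000-URLs-per-file limit described above.
# The file names and example domain are placeholders, not taken from the thread.
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000
BASE = "https://www.example.com"  # assumed site root, for illustration only

def write_sitemaps(urls, prefix="sitemap"):
    """Write sitemap-1.xml, sitemap-2.xml, ... and a sitemap_index.xml."""
    sitemap_files = []
    for i in range(0, len(urls), MAX_URLS_PER_SITEMAP):
        chunk = urls[i:i + MAX_URLS_PER_SITEMAP]
        name = f"{prefix}-{i // MAX_URLS_PER_SITEMAP + 1}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
        sitemap_files.append(name)

    # The single index file is what you would submit to Google.
    with open("sitemap_index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in sitemap_files:
            f.write(f"  <sitemap><loc>{BASE}/{name}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
    return sitemap_files

if __name__ == "__main__":
    # 120,000 URLs -> sitemap-1.xml, sitemap-2.xml, sitemap-3.xml + index
    write_sitemaps([f"{BASE}/page-{n}" for n in range(120_000)])
```

Submitting just the index file then covers all of the chunks, as the answer above describes.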
- Hi Imjonny, As you know, Google crawls all pages without you having to create a sitemap, so you don't strictly need an HTML sitemap; an XML sitemap is sufficient to get all pages crawled. If you have millions of pages, create an HTML sitemap organised by category and keep up to 1,000 links on each page. Remember that an HTML sitemap is created for users, not for Google, so you don't need to worry about it too much. Thanks, Rajesh
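As a rough illustration of that advice (not something from Rajesh's answer), the Python sketch below groups URLs by category and writes HTML sitemap pages of at most 1,000 links each; the input data structure and file naming are assumptions.

```python
# Rough sketch: paginate an HTML sitemap by category, max 1,000 links per page.
# The input data structure and output file names are illustrative assumptions.
from html import escape

MAX_LINKS_PER_PAGE = 1_000

def write_html_sitemap(links_by_category):
    """links_by_category: dict mapping a category name to a list of (url, title) pairs."""
    for category, links in links_by_category.items():
        for page_no, start in enumerate(range(0, len(links), MAX_LINKS_PER_PAGE), start=1):
            chunk = links[start:start + MAX_LINKS_PER_PAGE]
            filename = f"sitemap-{category.lower().replace(' ', '-')}-{page_no}.html"
            items = "\n".join(
                f'    <li><a href="{escape(url)}">{escape(title)}</a></li>'
                for url, title in chunk
            )
            with open(filename, "w", encoding="utf-8") as f:
                f.write("<!doctype html>\n")
                f.write(f"<title>Sitemap: {escape(category)} (page {page_no})</title>\n")
                f.write(f"<h1>{escape(category)}</h1>\n<ul>\n{items}\n</ul>\n")

if __name__ == "__main__":
    # 2,500 blog URLs -> sitemap-blog-1.html, sitemap-blog-2.html, sitemap-blog-3.html
    write_html_sitemap({
        "Blog": [(f"https://www.example.com/blog/post-{n}", f"Post {n}") for n in range(2_500)],
    })
```

The 1,000-links-per-page figure is just the rule of thumb from this answer, not a documented hard limit.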
- We break ours down to 1,000 per page - a simple setting in Yoast SEO, if you decide to use their sitemap tool. It's worked well for us, though I may bump that number up a bit.
- Well, rather the number of links each page of the sitemap.html is allowed to have. For example, if I have a huge site, I don't want to place all the links on one page; I would probably break them out to give the crawlers some breathing room between different links.
- Hello! I get that you are referring to the maximum size and/or the number of URLs the sitemap file can have. That is answered in the sitemaps.org FAQ: Q: How big can my Sitemap be? A: Sitemaps should be no larger than 50MB (52,428,800 bytes) and can contain a maximum of 50,000 URLs. Best of luck! GR
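If you want to sanity-check an existing file against those limits, here is a small, illustrative Python script (not part of the answer); the sitemap file name is an assumption.

```python
# Illustrative check (not from the answer): verify a local XML sitemap stays
# under the 50MB / 50,000-URL limits quoted from the sitemaps.org FAQ.
import os
import xml.etree.ElementTree as ET

MAX_BYTES = 52_428_800  # 50MB uncompressed
MAX_URLS = 50_000
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(path):
    size = os.path.getsize(path)
    url_count = sum(1 for _ in ET.parse(path).getroot().iter(f"{NS}url"))
    ok = size <= MAX_BYTES and url_count <= MAX_URLS
    print(f"{path}: {url_count} URLs, {size} bytes -> {'OK' if ok else 'over the limit'}")
    return ok

if __name__ == "__main__":
    check_sitemap("sitemap-1.xml")  # assumed file name
```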