How many links can you have on sitemap.html
-
We have a lot of pages that we want to create crawlable paths to. How many links can be crawled on a single page of a sitemap.html?
-
Sitemaps are limited to 50MB (uncompressed) and 50,000 URLs, from Google's perspective.
All formats limit a single sitemap to 50MB (uncompressed) and 50,000 URLs. If you have a larger file or more URLs, you will have to break it into multiple sitemaps. You can optionally create a sitemap index file (a file that points to a list of sitemaps) and submit that single index file to Google. You can submit multiple sitemaps and/or sitemap index files to Google.
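If you need to automate that split, here is a rough sketch in Python; the file names, base URL, and helper function are illustrative assumptions, not an official tool:

```python
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000  # Google's per-file limit (50MB uncompressed also applies)

def write_sitemaps(urls, base_url="https://www.example.com"):
    """Split `urls` into sitemap files of at most 50,000 entries each,
    then write a sitemap index file that points at all of them."""
    names = []
    for i in range(0, len(urls), MAX_URLS_PER_SITEMAP):
        name = f"sitemap-{i // MAX_URLS_PER_SITEMAP + 1}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in urls[i:i + MAX_URLS_PER_SITEMAP]:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
        names.append(name)

    # Submit this single index file to Google Search Console.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in names:
            f.write(f"  <sitemap><loc>{base_url}/{name}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```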
Just for everyone's reference, here is a great list of 20 limits that you may not know about.
-
Hi Imjonny,
As you know, Google crawls all pages without any sitemap, so you don't strictly need to create an HTML sitemap; an XML sitemap is sufficient to get all pages crawled. If you have millions of pages, create an HTML sitemap organized by category and keep up to 1,000 links on each page. An HTML sitemap is created for users, not for Google, so you don't need to worry about it too much.
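If it helps, here is a minimal sketch of that kind of pagination in Python, assuming hypothetical file names and a simple list of (url, anchor text) pairs:

```python
import html

LINKS_PER_PAGE = 1_000  # the per-page figure suggested above

def write_html_sitemap(links):
    """`links` is a list of (url, anchor_text) pairs; writes
    sitemap-1.html, sitemap-2.html, ... with up to 1,000 links each."""
    pages = [links[i:i + LINKS_PER_PAGE]
             for i in range(0, len(links), LINKS_PER_PAGE)]
    for n, page in enumerate(pages, start=1):
        with open(f"sitemap-{n}.html", "w", encoding="utf-8") as f:
            f.write("<!DOCTYPE html>\n<html><body>\n<ul>\n")
            for url, text in page:
                f.write(f'  <li><a href="{html.escape(url)}">{html.escape(text)}</a></li>\n')
            f.write("</ul>\n")
            # Chain the pages so every page is reachable from the first.
            if n < len(pages):
                f.write(f'<p><a href="sitemap-{n + 1}.html">Next page</a></p>\n')
            f.write("</body></html>\n")
```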
Thanks
Rajesh -
We break ours down to 1,000 per page, a simple setting in Yoast SEO if you decide to use their sitemap tool. It's worked well for us, though I may bump that number up a bit.
-
Well, rather, the number of links each page of the sitemap.html is allowed to have. For example, if I have a huge site, I don't want to place all the links on one page; I would probably break them out to give the crawlers some breathing room between the different links.
-
Hello!
I take it you are referring to the maximum size and/or the maximum number of URLs a sitemap file can have. That is answered in the FAQ on sitemaps.org: (link here)
Q: How big can my Sitemap be?
Sitemaps should be no larger than 50MB (52,428,800 bytes) and can contain a maximum of 50,000 URLs.
Best of luck!
GR
Related Questions
-
Do links from subdomains pass the authority and link juice of the main domain?
Hi, there is a subdomain whose root domain has a DA of 90. I can earn a backlink from that subdomain. The subdomain is fresh, with no traffic yet. Would I get a ranking boost and authority from the subdomain? Example: I can earn a do-follow link from https://what-is-crm.netlify.app/ but not from https://netlify.app
White Hat / Black Hat SEO | teamtc -
Site Footer Links Used for Keyword Spam
I was on the phone with a proposed web relaunch firm for one of my clients, listening to them talk about their deep SEO knowledge. I cannot believe this wouldn't be considered black hat, or at least very spammy, in which case a client could be in trouble. On this vendor's site I noticed that they stack the footer sitemap with about 50 links that are basically keywords they are trying to rank for. But here's the kicker, shown by way of example from one of the themes in the footer. The nine footer links:
Top PR Firms
Best PR Firms
Leading PR Firms
CyberSecurity PR Firms
Cyber Security PR Firms
Technology PR Firms
PR Firm
Government PR Firms
Public Sector PR Firms
Each link goes to a unique URL that is basically a knock-off of the homepage, with a few words or at most one sentence swapped out to include the footer link's keyword phrase. Sometimes there is a different title attribute, but generally the pages are close matches of each other. The canonical for each page links back to itself. I simply can't believe Google doesn't consider this spammy. Interested in your view.
White Hat / Black Hat SEO | RosemaryB -
Dealing with links to your domain that the previous owner set up
Hey everyone, I rebranded my company at the end of last year from a name that was fairly unique but sounded like I cleaned headstones instead of building websites. I opted for a name that I liked and that reflected my heritage; however, it also seems to be quite common.
Anyway, I registered the domain name, as it was available after the previous owner's company had been wound up. It's only been in the last week or two that I've had a website on that domain, and I've been tracking its progress through Moz and Google & Bing Webmaster Tools. Both webmaster tools report that my site triggers 404 errors for some specific links. However, I don't have and have never used those links. I think the previous owner might have created them before he went bust.
My question is in two parts. First, how do I find out which websites are linking to me with these broken URLs? Second, will these 404ing links affect my SEO? Thanks!
White Hat / Black Hat SEO | mickburkesnr -
How do you change the 6 links under your website in Google?
Hello everyone, I have no idea how to ask this question, so I'm going to give it a shot and hopefully someone can help me! My company is called Eteach, so when you type Eteach into Google, we come up in the top position (phew!), but there are six links that appear underneath our listing (I've added a picture to show what I mean). How do you change these links? I don't even know what to call them, so if there is a particular name for them, please let me know! They seem to be an organic result rather than PPC, but if I'm wrong, do correct me! Thanks!
White Hat / Black Hat SEO | Eteach_Marketing -
Cross-linking websites of the same company: is it a good idea?
As a user, I think it is beneficial because those websites are segmented to answer each customer's needs, so I wonder whether I should continue doing it, or avoid it as much as possible if it damages rankings...
White Hat / Black Hat SEO | mcany -
Is it worth getting links from .blogspot.com and .wordpress.com?
Our niche ecommerce site has only one thing going for it: we have numerous opportunities, on a weekly basis, to get reviews from "mom bloggers". We need links; our domain authority is depressing. My concern is that these "mom bloggers" tend to have blogs ending in .blogspot.com or .wordpress.com. How do I screen for reviewers that are worth getting links from, and how can I make the most of the community we have available to us?
White Hat / Black Hat SEO | Wilkerson -
Disavow - Broken links
I have a client who dealt with an SEO who created poor-quality links for their site: http://www.golfamigos.co.uk/ When I drilled down in Open Site Explorer, there are quite a few links from sites that no longer exist, so I thought I could test out Disavow on them, maybe just about 6, while we build good-quality links to tackle this problem with a more positive approach. I just wondered what the consensus was?
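For reference, the file Google's disavow tool accepts is plain UTF-8 text with one URL or domain: rule per line and # comment lines. A minimal sketch of generating one (the domains are hypothetical placeholders):

```python
# Minimal sketch: build a disavow file for Google's disavow links tool.
dead_domains = [
    "dead-directory-one.example",  # hypothetical placeholder
    "dead-directory-two.example",  # hypothetical placeholder
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Defunct sites linking to golfamigos.co.uk\n")
    for domain in dead_domains:
        # A "domain:" rule disavows every link from that host, not just one URL.
        f.write(f"domain:{domain}\n")
```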
White Hat / Black Hat SEO | lauratagdigital -
Why would links that were deleted by me 3 months ago still show up in reports?
I inadvertently created a mini link farm some time back by linking all of my parked domains (2,000 plus) to some of my live websites (I was green and didn't think linking between sites and domains with the same owner was an issue). These websites were doing well until Penguin, and although I did not get any 'bad link' notices from Google, I figure I was hit by Penguin.
So about 3 or 4 months ago I painstakingly deleted ALL links from all of those domains that I still own (only 500 or so; the others were allowed to lapse). None of those domains have any outgoing links at all, but old links from those domains are still showing up in WMT, in SEOmoz, and in every other link-tracking report I have run.
So why would these links still be reported? How long do old links stay in the internet archives? This may sound like a strange question, but do links 'remain with a domain' for a given period of time regardless? Are links archived before being 'thrown out' of the web? I know Google keeps archives of data that has expired, been deleted, websites that have closed, etc., for about 3 years or so (?). In an effort to correct the situation I have spent countless hours manually deleting thousands of links, but they won't go away. Looking for some insight here please.
cheers, Mike
White Hat / Black Hat SEO | shags380