Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Too many on page links
-
Hi
I know previously it was recommended to stick to under 100 links on the page, but I've run a crawl and mine are over this now with 130+
How important is this now? I've read a few articles to say it's not as crucial as before.
Thanks!
-
Hi Becky!
First, I would like to say it is great that you are being proactive in making sure your webpage doesn't have too many links on it! But luckily for you, this is not something you need to worry about. 100 is a suggested number, not a threshold that will get you penalized if you go over it.
Google's Matt Cutts posted a video explaining why Google no longer has the 100-links-per-page Webmaster guideline, so be sure to check that out! It's commonly thought that having too many links will negatively impact your SEO, but that hasn't been the case since 2008. However, Google has said that if a site looks spammy and has far too many links on a single page, it reserves the right to take action on the site. So avoid links that could be seen as spammy and you should be fine.
Check out this Moz blog post that discusses how many links are too many for more information!
-
Thank you for the advice, I'll take a look at the articles

Brilliant, the round table sounds great - I'll sign up for this
-
I honestly wouldn't worry, Becky. The page looks fine, the links look fine, and it is certainly not what you would call spammy.
Link sculpting was a 'thing' a number of years ago, but today Google pretty much ignores it, as has been shown many times in testing.
However, you can benefit from internal links, but that is a different discussion. Read this if you are interested.
If you are interested, there is a round-table discussion on eCommerce SEO hosted by SEMrush on Thursday that could be useful to you. Two others and I will be speaking on a number of topics.
-Andy
-
Thanks for the advice, I've looked into this before.
We have menu links and product links because it's an ecommerce site, so I wouldn't be able to remove any of those.
I've found it hard to reduce the link count further on primary pages. For example, http://www.key.co.uk/en/key/aluminium-sack-truck has 130 links.
Any advice would be appreciated
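If it helps to sanity-check what the crawl report is counting, here is a minimal sketch using only Python's standard library. The sample HTML is invented for illustration; in practice you would feed in your own page's source.

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href, which is what most crawl reports tally."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# Illustrative snippet standing in for a real product page's HTML:
sample = """
<nav><a href="/">Home</a> <a href="/trucks">Sack trucks</a></nav>
<main>
  <a href="/en/key/aluminium-sack-truck">Aluminium sack truck</a>
  <a id="anchor-only">no href, so not counted</a>
</main>
"""

counter = LinkCounter()
counter.feed(sample)
print(counter.count)  # → 3
```

Running this over the rendered HTML of a page gives the same kind of raw tally a crawler produces, which makes it easier to see where the 130 is coming from (menus, footers, product grids).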

-
Confirmation from Google here that you should limit the links on a page to around 3,000:
https://www.deepcrawl.com/knowledge/news/google-webmaster-hangout-notes-friday-8th-july-2016/
I would consider that a lot, though.

-Andy
-
Brilliant thank you!
-
In the "old days" (yup, I go back that far), Google's search index crawler wasn't all that powerful. So it would ration itself on each page and simply quit trying to process all the content on the page after a certain number of links and certain character count. (That's also why it used to be VERY important that your content was close to the top of your page code, not buried at the bottom of the code).
The crawler has been beefed up to the point where this hasn't been a limiting factor per page for a long time, so it will traverse pretty much any links you feed it. But I +1 both Andy's and Mike's advice about considering the usability and link-power dilution of having extensive numbers of links on a page. (This is especially important for your site's primary pages, since one of their main jobs is to help flow their ranking authority down to important/valuable second-level pages.)
Paul
-
Hi Becky,
Beyond the hypothetical limit, there is the consideration that a really large number of links divides the page's link authority, thereby decreasing the relative value each of those links passes to the pages it points to.
Depending on the page holding all these links, the user experience, the purpose of the linked-to pages, and so on, this may or may not be a concern, but it's worth thinking about.
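To put rough numbers on the dilution point, here is a toy sketch assuming the classic simplified PageRank model, where a page's distributable authority splits evenly across its outlinks. Real ranking is far more nuanced, so treat this purely as an illustration:

```python
def per_link_share(authority: float, outlinks: int) -> float:
    """Toy model: each outlink receives an equal share of the page's authority."""
    return authority / outlinks

# The same page with 100 vs. 300 links: each link passes a third as much.
for n in (100, 130, 300):
    print(f"{n} links -> {per_link_share(1.0, n):.4f} of the page's authority per link")
```

The takeaway is only directional: going from 100 to 130 links barely moves the per-link share, while multiplying the link count several times over does.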
Good luck!
- Mike
-
Hi Becky,
If the links are justified, don't worry. I have clients with 300-400 links and no problems with their positions in Google.
That doesn't mean it will be the same for everyone, though; each site is different and sometimes you can have too many. Just think it through, and if you conclude that most of the links aren't needed and are there to stuff in keywords, then look to make changes.
But on the whole, it doesn't sound like an issue to me - there are no hard and fast rules around this.
-Andy