Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Backlink quality vs quantity: Should I keep spammy backlinks?
-
Regarding backlinks, I'm wondering which is more advantageous for domain authority and Google reputation:
- Option 1: More backlinks including a lot of spammy links
- Option 2: Fewer backlinks but only reliable, non-spam links
I've researched this topic around the web a bit and understand that the answer is somewhere in the middle, but given my site's specific backlink volume, the answer might lean one way or the other.
For context, my site has a spam score of 2%, and when I did a quick backlink audit, roughly 20% are ones I want to disavow. However, I don't want to eliminate so many backlinks that my DA goes down. As always, we are working to build quality backlinks, but I'm interested in whether eliminating 20% of backlinks will hurt my DA.
Thank you!
-
Backlinks are always about quality, not quantity. Google does not like an excess of backlinks, and especially not spammy ones. I would suggest you go with quality backlinks if you want long-term, sustainable results; otherwise there will always be a threat of getting penalized by Google if you rely on spammy backlinks.
-
It's a myth that your DA drops because you put links in a disavow file. Disavow is a Google-only (or Bing) tool for cases where you get spammy links from a rogue domain and there's no way to get them removed.
Moz can't read the disavow file you submit to Google, so I'm not sure how the two are being connected here. Moz, like any other tool, simply counts the incoming followed links it can see and derives your DA from that; that's all there is to it. Again, PA/DA has nothing in common with Google's rankings, as Google maintains its own algorithm.
-
Hello again,
Thanks for the clarification and the link. I've read through that and a few other sources across the web, but none of them answered my question the way you did, so thanks! Our backlink profile is a fairly balanced mix of spammy and clearly legitimate links, so I'm not overly concerned about it, but I appreciate the reminder.
-
I should also clarify: these links may hurt you if they are your only links. If you have very few quality links, spammy ones may cause Google and other search engines to falsely flag you as spam. So be careful and stay on the lookout for especially suspicious spam links. The balanced approach is the best approach: don't worry, but stay aware!
Here is a more technical write-up from Moz that I recommend: https://a-moz.groupbuyseo.org/help/link-explorer/link-building/spam-score
-
No problem Liana.
- That is correct. Google understands that you don't have control over third-party sites, so instead of penalizing you, it minimizes or removes the effect those spammy links have.
- Yes, but only sort of. It may or may not increase PA/DA, but according to Google it shouldn't hurt you.
But yeah, that's the gist of it! Instead of taking the time to investigate and disavow links, you could spend that time cultivating relationships with other websites and businesses that could give you nice quality links.
Hope this answer works for you.
-
Hello Advanced Air Ambulance SEO!
Thanks for the quick and thorough response. Please confirm if I understand you correctly:
- I can leave spammy backlinks alone (not spend time disavowing them) *unless* I see a manual action in Search Console, which would indicate that Google sees an issue and is penalizing my site until I disavow the links. Without this manual action, there's no indication that the spam links are hurting my rankings or DA.
- Leaving spammy backlinks that don't incur a manual action may actually increase DA since leaving them maintains a higher volume of backlinks (albeit some spammy), and backlink quantity is a contributor to DA.
Thank you!
-
Hi Liana,
As far as spammy links go, Google has gotten good at detecting whether they are intentional, i.e., black hat. If they aren't, Google does not penalize you for these links, so it's best to leave them.
As far as a strategy for generating links to your website, you should always focus on high quality over quantity. High quality links give you exponentially more return than high quantity of bad links.
I recommend this article Google wrote for us to understand when and how to disavow links.
https://support.google.com/webmasters/answer/2648487?hl=en
In short, you rarely ever need to disavow links, even ones with a high spam score. You are only hurt when Google suspects you are gaming the system; if it detects or suspects unethical backlinking, you will be penalized with a "manual action". You can check whether you were penalized, as well as disavow flagged backlinks, in Google Search Console.
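For reference, if you do decide to disavow, the file you upload in Search Console is a plain UTF-8 text file, one entry per line, per Google's disavow documentation. A minimal sketch (the domains and URL below are placeholders, not real sites from your audit):

```text
# Spammy domains found in the backlink audit (placeholder names)
domain:spammy-directory-example.com
domain:link-farm-example.net

# A single page can also be disavowed by listing its full URL
http://blog-example.org/spam-comment-page.html
```

The `domain:` prefix disavows every link from that domain, which is usually what you want for a rogue site; bare URL lines disavow only that specific page.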