Is removing poor domain authority backlinks worth it?
-
Hey Moz,
I am working with a client on more advanced SEO tactics. The client has a solid domain authority of 67 and 50,000+ backlinks.
We want to continue our SEO efforts and stay on top of any bad backlinks that arise.
Would it be worth asking websites (below 20 domain authority) to remove our links, and then using the disavow tool if they don't respond?
Is this a common SEO practice for continued advanced efforts? Also, what would your domain authority benchmark be? I used 20 just as an example.
Thanks so much for your help.
Cole
-
Awesome responses, guys. Anyone else have any other insight?
-
I updated my response while you were writing yours.
I don't doubt your insight. But The Googles doesn't sleep.
When you're doing a local campaign with strictly above-board links, you should move as fast as possible.
-
That would be bad.
You should follow the rough 10-80-10 rule, whether you are building 10 links or 10,000 links. And you should always do it slowly.
I agree there are no specific percentages. You have to look at the big picture over a long period of time.
-
Let's say someone reads this and decides to get their first 10% in the crappy category. That would not be good for them. Further, there aren't any specific percentages that I'm aware of.
Yes, The Googles does have to pick the best of the worst. I'm not in doubt of that.
Yes, sometimes you inherit a mess that seems to work. But manual reviews happen.
-
Big picture: What a good "problem" to have!
Without taking a close look at your specific URL...
...my first instinct is that the answer to your question is almost certainly a giant...
**No. DO THE HARD THING: NOTHING!** There is a real danger of overthinking this stuff and neglecting the fundamentals.
I faced the same issue with a DA72 site for a leading SME in his field, with 450,000+ backlinks: some from major media outlets and universities, but most from "nobodies" in the field. This is good!
What you want is a classic inverted U-shaped curve in terms of DA:
-
- 10% crappy links
- 80% middling links
- 10% super-high-quality links
You mess with this at your peril! Beware: "bad" links are not necessarily bad in the grand scheme of the universe. Every credible and authoritative site should have some. They are part of a natural link profile.
Getting rid of the sub-20 DA links could hurt... badly.
Focusing excessively on tweaking or sculpting the middling 80% of your links is probably a mistake. You could shoot yourself in the foot.
Less is more.
It might be better to just keep doing what you're doing.
This is hard...and requires great discipline!
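By the way, if you want to see where a profile actually falls against that rough 10-80-10 shape, a quick bucket count is enough. Here is a minimal sketch in Python, assuming a CSV export with a `domain_authority` column; the column name and the tier thresholds (20 and 60) are placeholders of mine, not anything official:

```python
# Bucket a backlink export into rough 10-80-10 tiers by Domain Authority.
# The "domain_authority" column name and the 20/60 thresholds are
# illustrative assumptions -- tune them to your own data.
import csv
from collections import Counter

def da_distribution(path: str, low: float = 20, high: float = 60) -> Counter:
    buckets = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            da = float(row["domain_authority"])
            if da < low:
                buckets["crappy"] += 1
            elif da < high:
                buckets["middling"] += 1
            else:
                buckets["high"] += 1
    return buckets

if __name__ == "__main__":
    counts = da_distribution("backlinks.csv")
    total = sum(counts.values())
    for tier, n in counts.items():
        print(f"{tier}: {n} ({n / total:.0%})")
```

If the split is wildly different from 10-80-10, that still doesn't mean you should prune; it's just a way to see the shape before you touch anything.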
-
-
Happy to be contrary. Another good thing about Link Detox is that the service has been trained - mostly for the good - by users manually reviewing the quality of their links. If easylinkseodirectory4u.com has been flagged enough, it's more likely to get caught by the machine.
Once you have uploaded your list and reviewed the links, you will get a pretty accurate risk rating, scaling from low to high. I don't think Link Detox has ever given me a false Toxic rating on an individual link, either.
I'm not a client scalper, so if you would like to PM me the domain name, I can take a look.
-
Excellent, quality response. Thanks so much.
I would love to hear from any disavow experts, and maybe even what their services cost (of course, I don't want to break any applicable Moz rules).
Cole
-
Setting a DA cut-off from the outset is a bit too arbitrary. What if it's a link from a site with low DA and a low PA now, but later the site becomes the next New York Times? You don't want to disavow the next New York Times, but that's what an arbitrary number would have you do.
Further, DA and PA can be gamed to a certain extent. I'm sure Rap Genius has a pretty solid DA, but they were penalized all the same. So it would appear that using DA as a cut-off would be less than ideal.
There's no real easy way to do a disavow. You have to think about characteristics, context, and intent. If you have links that pass juice but were obviously paid, those may be candidates. If there's a vast preponderance of links from seemingly low-quality directories with exact-match anchor text, those are candidates for closer scrutiny as well. Dead giveaways are usually 'sponsored' links that pass juice.
Low-quality directories usually let everyone in. You will know them by their viagra and casino anchor text. They're usually pretty safe disavow candidates.
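For what it's worth, when you do pull the trigger, the disavow file itself is simple: a plain text file, one URL or domain per line, a `domain:` prefix to disavow an entire domain, and `#` for comments. The domains below are made up, just to show the shape:

```text
# Disavow file for example-client.com
# Directories that never responded to removal requests
domain:easylinkseodirectory4u.com
domain:free-casino-links.example

# Individual spam pages (URL-level, leaves the rest of the domain alone)
http://spam-blog.example/viagra-casino-roundup/comment-page-3
```

Prefer `domain:` for the truly hopeless sites; URL-level disavows are easy to miss when the spammer mirrors the page elsewhere on the same domain.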
Does the site have a lot of links from spam blog comments on sites that are obviously unrelated? Has there been some guest blogging on free-for-all blogs? Those links would require some review as well.
Definitely prioritize your exact-match anchor-text links for review.
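If you want to pull those exact-match anchors out of a 50,000-row export quickly, a few lines of Python will do it. A minimal sketch; the column names and the keyword set are assumptions you'd swap for your own:

```python
# Flag backlinks whose anchor text exactly matches a money keyword.
# The "anchor_text"/"source_url" column names and the keyword set are
# hypothetical -- match them to whatever your link tool exports.
import csv

TARGET_KEYWORDS = {"best widgets", "cheap widgets online"}  # hypothetical money terms

def flag_exact_match(path: str) -> list[dict]:
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row.get("anchor_text", "").strip().lower()
            if anchor in TARGET_KEYWORDS:
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    for row in flag_exact_match("backlinks.csv"):
        print(row["source_url"], "->", row["anchor_text"])
```

Flagged does not mean disavow; it just means a human should look at that link first.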
I would suggest you start with gathering link data from numerous sources:
- Google Webmaster Tools
- Bing Webmaster Tools
- Ahrefs
- Majestic SEO
- Etc.
Then filter out the duplicates via spreadsheet voodoo. After that, drop the list into a service like Link Detox. But be careful: it still throws false positives and false negatives. So again, there's no real way of getting out of a manual review. But Link Detox will speed up the process.
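If the spreadsheet voodoo gets unwieldy at that volume, the same merge-and-dedupe is a few lines of Python. Again just a sketch, assuming each tool's CSV has a `source_url` column (real exports name it differently, so map each one accordingly):

```python
# Merge backlink exports from several tools and drop duplicate linking URLs.
import csv
from urllib.parse import urlsplit

def normalize(url: str) -> str:
    """Lowercase the host and strip trailing slashes so near-identical
    URLs from different tools collapse to one entry."""
    parts = urlsplit(url.strip())
    return f"{parts.scheme}://{parts.netloc.lower()}{parts.path.rstrip('/')}"

def merge_exports(paths: list[str], column: str = "source_url") -> list[str]:
    seen: set[str] = set()
    merged: list[str] = []
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                key = normalize(row[column])
                if key not in seen:
                    seen.add(key)
                    merged.append(row[column])
    return merged

if __name__ == "__main__":
    links = merge_exports(["gwt.csv", "bing.csv", "ahrefs.csv", "majestic.csv"])
    print(f"{len(links)} unique linking URLs")
```

The deduped list is what you'd feed into Link Detox (or hand to a human) for the actual review.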
Are there plenty of disavow services out there? Sure, but I've never used them. I'm far too paranoid. A disavow is a delicate and lengthy process.
Are there some great disavow pros/individuals out there? Definitely. I would be far more likely to trust them. In fact, a couple will likely chime in here. Though they may be a little bit outside the budget. I don't know.
One final, important point: a disavow is not a panacea. It takes as long as it takes. Though it is good that you appear to be proactive. You never know when the next Penguin filter will land. The site may be right with The Googles now, but it might not be later.