Should I disavow links from pages that don't exist anymore?
-
Hi. I'm doing a backlink audit for two sites, one with 48k and the other with 2M backlinks. Both are very old sites, and both have tons of backlinks from old pages and websites that don't exist anymore, but these backlinks still show up in the Majestic Historic index. I cleaned up the obviously useless links and passed the rest through Screaming Frog to check whether those old pages/sites even exist.
There are tons of link-sending pages that return 0, 301, 302, 307, 404, etc. status codes. Should I consider all of these pages to be bad backlinks and add them to the disavow file?
Just a clarification: I'm not talking about 301-ing a backlink to a new target page. I'm talking about the origin page generating an error when pinged, e.g. originpage.com/page-gone sends me a link to mysite.com/product1. Screaming Frog pings originpage.com/page-gone and returns a status error. Do I add originpage.com/page-gone to the disavow file or not?
Hope I'm making sense.
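For illustration, here's a rough sketch of the check I'm running - the file names are made up, and it assumes one source URL per line from the Majestic export:

```python
import requests

# Hypothetical input: one backlink source URL per line (Majestic export).
with open("backlink_sources.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

flagged = []
for url in urls:
    try:
        # HEAD keeps the check light; allow_redirects=False shows the original
        # status code, the same way Screaming Frog reports it.
        status = requests.head(url, timeout=10, allow_redirects=False).status_code
    except requests.RequestException:
        status = 0  # connection failure - the "0" status I mentioned above

    if status != 200:
        flagged.append((status, url))

# Disavow files take one URL per line; lines starting with "#" are comments.
with open("disavow_candidates.txt", "w") as out:
    for status, url in flagged:
        out.write(f"# returned status {status}\n{url}\n")
```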
-
Sounds like a plan. Thanks for your help, bud - much appreciated.
-
My take: I'd just go ahead and start doing other things to improve its current rankings. I could assign someone to go over the links if another team member is available.
If you see improvements within the next month, that's already a good sign that you should continue and not worry about the dead links.
It takes Google a long time to actually forget about those links pointing to your site. So if they are dead AND you didn't notice any increases or drops in analytics, then they are pretty much ineffective, so they shouldn't be a major obstacle. I think someone coined a term for it - ghost links or something. LOL.
-
Hi. I did go through GA several years back, going back to 2011, but didn't really see dramatic changes in traffic other than a general trend of low organic traffic throughout. Keep in mind that it's an engineering site, so no thousands of visits per day... the keywords that are important for the site get below 1,000 searches per month (data from the days when the Google Keyword Tool shared this info with us mortals).
That said, I do notice that roughly 60% of the links show absolutely no regard for anchors: some are www.domain.com/index.php or Company Name, some are Visit Site, some are Website, etc. Some anchors are entire generic sentences like "your company provided great service, your entire team should be commended blah blah blah". And there are tons of backlinks from http://jennifers.tempdomainname.com... a domain that's a weird animal, as there's not much data on who they are, what they do, and what the deal is with the domain name itself. Weird.
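To give a feel for the spread, this is roughly how I bucketed them - a quick sketch where the category list and brand string are just examples, not a complete taxonomy:

```python
# Rough anchor-text bucketing; the keyword lists are illustrative only.
GENERIC = {"visit site", "website", "click here", "here", "more info"}

def classify_anchor(anchor: str, brand: str = "company name") -> str:
    a = anchor.strip().lower()
    if a.startswith(("http://", "https://", "www.")) or "/" in a:
        return "naked URL"
    if brand in a:
        return "branded"
    if a in GENERIC:
        return "generic"
    if len(a.split()) > 6:
        return "sentence"  # testimonial-style anchors like the one above
    return "other"

samples = ["www.domain.com/index.php", "Company Name", "Visit Site",
           "your company provided great service, your entire team should be commended"]
for s in samples:
    print(f"{classify_anchor(s):10} <- {s}")
```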
In all honesty, nothing in WMT or GA suggests that the site got hit by either Penguin or Panda... BUT, with a ton of links that originate from non-existent pages, pages with no thematic proximity to the client's site, and anchors as generic as "Great Service"... is it better to err on the side of caution and get them disavowed, or to wait for a reason from Google and then do the link hygiene?
-
Hi Igor,
Seeing ezinearticles in there is definitely a red flag that suggests it probably also has web directories, article networks, blog networks, Pliggs, guestbooks, and other links from that era.
Maybe you can dig up some old analytics data and check when the traffic dropped.
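If digging through it manually is a pain, a rough sketch like this can flag sharp weekly drops near the well-known update dates (the CSV layout here is an assumption - adjust it to whatever your GA export actually looks like):

```python
import csv
from datetime import date

# A few well-known update dates (not an exhaustive list).
UPDATES = {"Panda 1.0": date(2011, 2, 23),
           "Penguin 1.0": date(2012, 4, 24),
           "Penguin 2.0": date(2013, 5, 22)}

# Hypothetical GA export with "week_start" (YYYY-MM-DD) and "organic_visits".
with open("organic_traffic.csv") as f:
    weeks = sorted((date.fromisoformat(r["week_start"]), int(r["organic_visits"]))
                   for r in csv.DictReader(f))

for (_, prev), (week, curr) in zip(weeks, weeks[1:]):
    if prev and curr < prev * 0.7:  # a >30% week-over-week drop
        for name, d in UPDATES.items():
            if abs((week - d).days) <= 14:
                print(f"{week}: {prev} -> {curr} visits, near {name} ({d})")
```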
If you did not see any heavy anchor text usage, then the site probably escaped a sitewide penalty; I would assume it's just a few (or many, but not all) of the keywords that got hit. Either way, you'll need to clean up and disavow the links if they are indeed like that. So that's probably a reason for its low organic rankings.
That, and since it's old, it might have been affected by Panda too.
-
Thanks for your response. I'm about done cleaning up the link list in very broad strokes, eliminating obviously poor-quality links, so in a few hours I could have a big list for disavowing.
The site is very specific - a mechanical engineering thing - and they sell technology and consulting to GM, GE, Intel, NASA... so backlinks from sites for rental properties and resorts do look shady... even if they do return a 200 status.
But... how vigilant is Google now, with all the Penguin updates, about backlinks from unrelated sites? My client's site has tons of them. And if Majestic reports them as having zero Trust Flow, is there any benefit to having them at all?
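For what it's worth, pulling the zero-Trust-Flow sources out of the Majestic export is quick to script - a sketch, with column names that are assumptions (check the headers in your actual export):

```python
import csv

# Column names are assumptions - verify against your Majestic CSV headers.
with open("majestic_backlinks.csv") as f, open("zero_tf_sources.txt", "w") as out:
    for row in csv.DictReader(f):
        if int(row.get("SourceTrustFlow") or 0) == 0:
            out.write(row["SourceURL"] + "\n")
```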
Thanks.
-
Hi. Thanks for responding. WMT actually shows just a fraction of the links - about a few thousand for the site where Majestic Historic reports 48k. But I don't have any notifications of issues. I'm guessing that with all the Penguin updates, most sites won't get any notifications, and it's up to us SEO guys to figure out why rankings are so low.
About the quality of the links: many do come from weird sites, and I've noticed ezinearticles too. The problem is that the 48k portfolio was built by non-SEO experts, and now, a few years after the fact, I'm stuck with a site that doesn't rank well and has no notifications in WMT. But can I take the lack of notifications as evidence that the site has no backlink problem, or should I infer the problem from the poor organic rankings?
-
If I were in a similar situation, I wouldn't really worry about it, but if it didn't take too much of my time, I would include all of these in the disavow file too.
But if the page is not returning a 200 status, this shouldn't really be a problem.
Hope this helps!
-
Hi Igor,
Do they still show up in Webmaster Tools? Do you have a penalty because of those links that used to point to the site? If not, I wouldn't really worry about it - just prioritize other things and make this a side task.
Are the majority of them on bad-looking domains? If you check the link URLs on archive.org, were they spammy links? If so, go ahead and include them in the disavow list.
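If there are too many to eyeball, the Wayback Machine has an availability endpoint you can script against - a minimal sketch:

```python
import requests

def wayback_snapshot(url: str):
    """Return the closest archived snapshot URL from archive.org, or None."""
    r = requests.get("https://archive.org/wayback/available",
                     params={"url": url}, timeout=10)
    closest = r.json().get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest else None

# Example: check what a dead linking page used to look like.
print(wayback_snapshot("originpage.com/page-gone"))  # None if never archived
```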