Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Disavowing a sitewide link that has thousands of subdomains. What do we tell Google?
-
Hello,
I have a hosting company that partnered with a Blogger template developer, letting users download blog templates that placed my footer links sitewide on their websites. I know sitewide links are frowned upon, which is why I went through a rigorous link audit months ago and emailed every webmaster who made a "WEBSITENAME.blogspot.com" blog, three times each, asking them to remove the links.
I'm at a point where I have 1,000 subdomain users left on "blogspot.com". I used to have 3,000!
Question: When I disavow these links in Webmaster Tools for Google and Bing, should I upload all 1,000 subdomains of "blogspot.com" individually and show Google proof that I emailed each of them, or is it wiser to include just one domain name (www.blogspot.com) so Google sees ONE big mistake instead of 1,000?
This has been on my mind for a year now and I'm open to hearing your intelligent responses.
-
Google does allow root domains in disavow, but I'm honestly not sure how they would handle this with a mega-site with unique sub-domains like Blogspot. Typically, Google treats these sub-domains as stand-alone sites (isolating their PageRank, penalties, etc.). I tend to agree with the consensus that the best bet is to disavow the individual blogs, and not the entire root domain. If you're really in bad shape and you have much more to lose from Blogspot links than gain, you could disavow the root domain, but I'm not sure if anyone has good data on the potential impact.
-
I would disavow the blogspot subdomains individually. So you'd have 1000 lines that say:
domain:subdomain-name.blogspot.com
The documentation for the disavow tool (http://googlewebmastercentral.blogspot.ca/2012/10/a-new-tool-to-disavow-links.html) says the following:
Q: Can I disavow something.example.com to ignore only links from that subdomain?
A: For the most part, yes. For most well-known freehosts (e.g. wordpress.com, blogspot.com, tumblr.com, and many others), disavowing "domain:something.example.com" will disavow links only from that subdomain. If a freehost is very new or rare, we may interpret this as a request to disavow all links from the entire domain. But if you list a subdomain, most of the time we will be able to ignore links only from that subdomain.
What we don't know, however, is whether a domain:blogspot.com entry would get everything from Blogspot. I wouldn't trust it to do that, and I would definitely disavow each individual subdomain.
If you don't have a manual penalty then there is no way to upload anything other than your disavow file to Google. Your disavow file is not read by a human. It is machine processed. You simply need to trust that you have done a thorough job and then, when Penguin refreshes, if you've got a good base of good links you should see an improvement.
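If you end up listing a thousand subdomains, it's worth scripting the file rather than typing it. Here is a minimal sketch in Python, assuming you already have a plain text list of the offending subdomains (the file names below are placeholders, not anything from this thread); it writes one domain: entry per subdomain in the format described above:

# Minimal sketch: build a disavow file from a plain list of subdomains.
# "remaining_blogspot_subdomains.txt" is a hypothetical input file with one
# subdomain per line, e.g. someblog.blogspot.com
with open("remaining_blogspot_subdomains.txt") as f:
    subdomains = [line.strip() for line in f if line.strip()]

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Sitewide footer links from partner blog templates\n")
    for sub in sorted(set(subdomains)):
        # One entry per subdomain, e.g. domain:someblog.blogspot.com
        out.write("domain:" + sub + "\n")

The resulting disavow.txt is a plain UTF-8 text file, which is what Webmaster Tools expects when you upload it.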
-
I remember G saying that you should include links you've already removed in the disavow file as well. You can add a comment before you list the removed links, but I don't think G manually reads disavow files anyway.
Since it's algorithmic, you just need to disavow or remove all of those sitewide footer links and fix your anchor text profile. Check out this case study, as it is very similar to your situation.
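To make that concrete, comment lines in a disavow file start with "#" and are ignored when the file is processed, so they are only there for your own records. A hypothetical excerpt (the subdomain names are made up) covering both the removed and the still-live blogs might look like this:

# Sitewide footer links from partner blog templates
# Removed after three outreach emails each:
domain:example-removed-blog.blogspot.com
domain:another-removed-blog.blogspot.com
# Still live as of the last audit:
domain:example-live-blog.blogspot.com
domain:another-live-blog.blogspot.com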
-
In my opinion, if you use the main domain (blogspot.com), it will probably disavow every link coming from blogspot.com, which means that if you later earn a good link from a Blogspot blog, it won't give you any help either!
In my case, I used the sub-domains and it worked fine for me!
Hope this helps!
-
We didn't receive a penalty letter, but our traffic and search query impressions went down when there was an algorithm update targeting footer links.
I don't have the original list of subdomains that removed our footer links; is it really necessary for Google? I mean, can't they see that there aren't SO MANY links coming from Blogspot anymore? And is there a section in the disavow tool where I can upload a list of removed links to show Google? Or do I just state that I removed so many, with a list of the subdomains, in a written notice when doing a disavow? (I've never done a disavow, so this is new to me.)
This problem won't continue either, because we ended our partnership with the blog template developer two years ago, so we are not part of any new consumer blogs created since then.
Looking forward to your reply and other suggestions.
-
I'm assuming you received a manual penalty letter.
I would do the separate subdomains (if this is a complete list and new ones aren't being created) since it shows more effort and won't discredit any links you get from legit .blogspot blogs. Be sure to include the domains you've successfully removed in your disavow file as well.
If this is a problem that will continue (more people will create new sites with your footer link), you might have to disavow the whole domain.
 