Disavowing a sitewide link that appears on thousands of subdomains. What do we tell Google?
-
Hello,
I have a hosting company that partnered with a Blogger template developer: users could download blog templates that placed my footer links sitewide on their blogs. I know sitewide links are frowned upon, which is why I went through a rigorous link audit months ago and emailed every webmaster running a "WEBSITENAME.blogspot.com" blog three times each, asking them to remove the links.
I'm at the point where I have 1,000 "blogspot.com" subdomains left that still carry the links. I used to have 3,000!
Question: When I disavow these links in Webmaster Tools for Google and Bing, should I upload all 1,000 "blogspot.com" subdomains individually and show Google proof that I emailed each of them, or is it wiser to include just one domain name (www.blogspot.com) so Google sees ONE big mistake instead of 1,000?
This has been on my mind for a year now and I'm open to hearing your intelligent responses.
-
Google does allow root domains in disavow files, but I'm honestly not sure how they would handle this for a mega-site with unique subdomains like Blogspot. Typically, Google treats these subdomains as stand-alone sites (isolating their PageRank, penalties, etc.). I tend to agree with the consensus that the best bet is to disavow the individual blogs, not the entire root domain. If you're really in bad shape and have much more to lose from Blogspot links than to gain, you could disavow the root domain, but I'm not sure anyone has good data on the potential impact.
-
I would disavow the blogspot subdomains individually. So you'd have 1000 lines that say:
domain:subdomain-name.blogspot.com
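A minimal sketch of what that file could look like (the subdomain names below are placeholders, and lines starting with # are comments that Google ignores when processing the file). Disavowing the entire freehost instead would just be a single domain:blogspot.com line, which is what most of the replies here advise against:

```
# Sitewide footer links from partner blog templates.
# Each webmaster was emailed three times before being listed here.
domain:example-blog-one.blogspot.com
domain:example-blog-two.blogspot.com
domain:example-blog-three.blogspot.com
```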
The documentation for the disavow tool (http://googlewebmastercentral.blogspot.ca/2012/10/a-new-tool-to-disavow-links.html) says the following:
Q: Can I disavow something.example.com to ignore only links from that subdomain?
A: For the most part, yes. For most well-known freehosts (e.g. wordpress.com, blogspot.com, tumblr.com, and many others), disavowing "domain:something.example.com" will disavow links only from that subdomain. If a freehost is very new or rare, we may interpret this as a request to disavow all links from the entire domain. But if you list a subdomain, most of the time we will be able to ignore links only from that subdomain.
What we don't know, however, is whether a single domain:blogspot.com entry would catch everything from Blogspot. I wouldn't trust it to do that, and I would definitely disavow each individual subdomain.
If you don't have a manual penalty, there is no way to send Google anything other than your disavow file. Your disavow file is not read by a human; it is machine processed. You simply need to trust that you have done a thorough job, and then, when Penguin refreshes, if you've got a good base of quality links you should see an improvement.
-
I remember Google saying that you should include links you've already removed in the disavow file as well. You can add a comment before you list the removed links, but I don't think Google manually reads disavow files anyway.
Since it's algorithmic, you just need to disavow or remove all those sitewide footer links and fix your anchor text profile. Check out this case study, as it is very similar to your situation.
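If you do go the route of listing each subdomain, plus the ones already removed, a short script keeps the file and its comments tidy. This is just a minimal sketch, assuming two hypothetical plain-text lists of subdomains; it only emits the standard domain: lines and # comment lines the disavow tool accepts:

```python
from pathlib import Path


def read_subdomains(path: Path) -> list[str]:
    """Return non-empty, non-comment lines from a plain-text list of subdomains."""
    lines = path.read_text(encoding="utf-8").splitlines()
    return [line.strip() for line in lines if line.strip() and not line.lstrip().startswith("#")]


# Hypothetical input files, one subdomain per line (e.g. "example-blog.blogspot.com").
remaining = read_subdomains(Path("remaining_subdomains.txt"))  # links still live
removed = read_subdomains(Path("removed_subdomains.txt"))      # links already taken down

output = ["# Sitewide footer links from partner blog templates."]
output.append("# Still live after three removal requests per webmaster:")
output += [f"domain:{sub}" for sub in remaining]
output.append("# Already removed by the webmaster; disavowed as a precaution:")
output += [f"domain:{sub}" for sub in removed]

Path("disavow.txt").write_text("\n".join(output) + "\n", encoding="utf-8")
print(f"Wrote {len(remaining) + len(removed)} domain entries to disavow.txt")
```

As noted above, the file is machine-processed, so the comments are purely for your own records.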
-
In my opinion, if you use the main domain, blogspot.com, it will probably disavow every link coming from blogspot.com. That means if you later earn a good link from a Blogspot blog, it won't give you any benefit!
In my case, I used the subdomains and it worked fine for me!
Hope this helps!
-
We didn't receive a penalty letter, but our traffic and search impressions went down when there was an algorithm update targeting footer links.
I don't have the original list of subdomains that removed our footer links. Is that really necessary for Google? I mean, can't they see that there aren't SO MANY links coming from Blogspot anymore? Is there a section in the disavow tool where I can upload a list of removed links to show Google, or do I just state that I removed so many, with a list of the subdomains, in a written notice when submitting the disavow? (I've never done a disavow, so this is new to me.)
This problem won't continue, either, because we ended our partnership with the blog template developer about two years ago, so our links aren't included in any blogs created since then.
Looking forward to your reply and other suggestions.
-
I'm assuming you received a manual penalty letter.
I would do the separate subdomains (if this is a complete list and new ones aren't being created), since it shows more effort and won't discredit any links you get from legitimate Blogspot blogs. Be sure to include the domains you've successfully removed in your disavow file as well.
If this is a problem that will continue (more people will create new sites with your footer link), you might have to disavow the whole domain.
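If the original list of which webmasters actually removed the links is gone, one way to rebuild it is to re-check each blog for the footer link. Below is a minimal sketch under some assumptions: the file names are hypothetical, "www.example-hosting-company.com" stands in for your real footer-link target, and it writes the same two list files used in the script sketch further up.

```python
import urllib.request
from pathlib import Path

TARGET = "www.example-hosting-company.com"  # placeholder: the footer-link URL or domain to look for

# One subdomain per line, e.g. "example-blog.blogspot.com" (hypothetical file name).
subdomains = [
    line.strip()
    for line in Path("all_subdomains.txt").read_text(encoding="utf-8").splitlines()
    if line.strip()
]

still_linking, removed = [], []
for sub in subdomains:
    try:
        with urllib.request.urlopen(f"http://{sub}/", timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        (still_linking if TARGET in html else removed).append(sub)
    except Exception:
        removed.append(sub)  # unreachable or deleted blogs can't pass links either

# These feed the disavow-file sketch earlier in the thread.
Path("remaining_subdomains.txt").write_text("\n".join(still_linking) + "\n", encoding="utf-8")
Path("removed_subdomains.txt").write_text("\n".join(removed) + "\n", encoding="utf-8")
print(f"{len(still_linking)} still linking, {len(removed)} removed or unreachable")
```

Treating unreachable blogs as "removed" is a judgment call; you may prefer to log errors separately and re-check them later.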