Old subdomains - what to do SEO-wise?
-
Hello,
I wanted the community's advice on how to handle old subdomains.
We have https://www.yoursite.org. We also have two subdomains directly related to the main website: https://www.archive.yoursite.org and https://www.blog.yoursite.org.
As these pages are not actively updated, they are triggering lots and lots of errors in the site crawl (missing meta descriptions, and much more). We have no particular intention of keeping them up to date in terms of SEO. What do you guys think is the best option for handling these?
I considered de-indexing, but the content of these pages is still relevant and may be useful - yet it is not up to date and never will be again.
Many thanks in advance.
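For reference, "de-indexing" here would typically mean leaving the pages live but serving a noindex signal from the subdomains, rather than deleting anything. A minimal sketch - either directive on its own is enough, and the header variant assumes you can edit the web server or application configuration:

<meta name="robots" content="noindex">   (in the head of each page)

X-Robots-Tag: noindex   (as an HTTP response header)

Search engines drop pages carrying either directive from their indexes the next time they recrawl them, while visitors can still reach the content directly.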
-
Thanks for replying Will.
You have mentioned a few ways to deal with this - and they all seem to point to the fact that this should not really be a high-priority issue for us at the moment, especially if you think the sub-domains do not really have a major effect on the main site. (I don't think it's even worth us de-indexing, to be honest, as the content may still be relevant to some people and we can just allow Google to continue indexing it as it is.)
All of this points to one conclusion: we won't be doing any SEO-related work on these pages.
So, how do I set up Moz to ignore these two sub-domains and only show crawl errors related to the main site? We just don't want these pages to be crawled by Moz at all, given we won't be doing any work on them.
Thanks
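One low-effort way to do that - a minimal sketch, assuming the crawl errors are coming from Moz's Site Crawl bot (rogerbot) and that you are happy for it to skip those sub-domains entirely - is to block rogerbot in a robots.txt file served from each sub-domain (e.g. at https://www.archive.yoursite.org/robots.txt and https://www.blog.yoursite.org/robots.txt):

User-agent: rogerbot
Disallow: /

This only affects Moz's crawler; Google and other search engines will carry on crawling and indexing the sub-domains exactly as before.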
-
Hi there. Sorry for the slow follow-up on this - there was an issue that meant I didn't get the email alert when it was assigned to me.
There is increasing evidence that culling old / poor-performing content from your site can have a positive effect, though I wouldn't be particularly confident that this would transfer across sub-domains to benefit the main site.
In general, I suspect that most effort expended here would be better placed elsewhere, and so I would angle towards the least-effort option.
I think that the "rightest" long-term answer though would be to move the best content to the main domain (with accompanying 301 redirects) and remove the remainder with 410 status codes. This should enable you to focus on the most valuable content and get the most benefit from the stuff that is valuable, while avoiding having to continue expending effort on the stuff that is no longer useful. The harder this is, though, the less I'd be inclined to do it - and would be more likely to consider just deindexing the lowest quality stuff and getting whatever benefit remains from the better content for as long as it is a net positive, with an eye to eventually removing it all.
Hope that helps - I don't think it's a super clear-cut situation unfortunately.
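For anyone weighing up the "move the best content, remove the rest" option above, here is a minimal sketch of what that can look like at the web server level. It assumes nginx and uses purely hypothetical paths - working out the real mapping of old URLs to new ones is the part that takes the effort:

server {
    server_name www.archive.yoursite.org;

    # The handful of pages worth keeping move to the main site with permanent redirects
    location = /guides/some-still-useful-guide/ {
        return 301 https://www.yoursite.org/guides/some-still-useful-guide/;
    }

    # Everything else on the sub-domain is gone for good
    location / {
        return 410;
    }
}

The same pattern works in Apache or at a CDN; the key points are that the kept pages redirect permanently (301) to their new homes and that everything removed returns 410 (or at least 404) rather than being blanket-redirected to the homepage.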
Related Questions
-
What is the best strategy to SEO Discontinued Products on Ecommerce Sites?
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for more than 3 years. We want to clean up the catalog and remove all the old listings that are older than 2 years and do not generate any sales. What is the best practice for removing thousands of listings from an e-commerce site? Do we 404 these products and show similar items? Your help and thoughts are much appreciated.
White Hat / Black Hat SEO | JimJ3 -
Negative SEO - Spammy Backlinks By Competitor
Hi everyone, someone has generated more than 22k spam backlinks (on bad keywords) pointing at my domain. Will this hurt my website's SEO rankings? It is already ranking at the top. How can I remove all the spammy backlinks, and how can I find out which competitor did this?
White Hat / Black Hat SEO | HuptechWebseo0 -
Opinion on Gotch SEO methods & services
I would love to get your take on Gotch SEO. I am gearing up to build links for a site in the next several months and have been reading up from sources other than Moz in preparation. (I need to re-read Moz's guide too, but I already read it last year.) I'm reading Gotch SEO's main link-building method articles right now and am wondering what you all think. Do you think they have a good approach and are generally reliable? Likewise, has anyone used their service for getting a link? What was your experience? Or if you haven't used the service, any quick takes on it?
White Hat / Black Hat SEO | scienceisrad0 -
The use of a ghost site for SEO purposes
Hi guys, we have just taken on a new client (.co.uk domain) and during our research we identified that they also have a .com domain which is a replica of the existing site, but all links lead to the .co.uk domain. As a result, the .com replica is pushing 5,000,000+ links to the .co.uk site. After speaking to the client, it appears they were approached by a company who said they could get the .com site ranking for local search queries and then push all that traffic to the .co.uk. From analytics we can see that very little referral traffic is coming from the .com. It sounds remarkably dodgy to us - surely the duplicate site is an issue anyway for obvious reasons, and these links could also be deemed to have been created for SEO gain? Does anyone have any experience of this as a tactic? Thanks, Dan
White Hat / Black Hat SEO | SEOBirmingham810 -
href="#" and href="javascript.void()" links. Is there a difference SEO wise?
I am currently working a site re-design and we are looking at if href="#" and href="javascript.void()" have an impact on the site? We were initially looking at getting the links per page down but I am thinking that rel=nofollow is the best method for this. Anyone had any experience with this? Thanks in advanced
White Hat / Black Hat SEO | | clickermediainc0 -
Does posting on Craigslist damage our SEO or reputation?
We have a website for a single-person barbershop. She has been promoting on Craigslist, and that is outranking the website in the SERPs. However, the Craigslist results showing up are actually expired and don't link to anything; they just seem to be cached by Craigslist. My question is: is Craigslist generally considered a poor avenue for directing inbound links to services on your site, or is it a good strategy to use Craigslist to build links and traffic for service businesses? I get mixed responses when I search for this. Thanks
White Hat / Black Hat SEO | smallpotatoes0 -
Recovering From Black Hat SEO Tactics
A client recently engaged my services to deliver foundational white hat SEO. During the site audit, I discovered a tremendous amount of black hat SEO tactics employed by their former SEO company. I'm concerned that the efforts of the old company - including forum spamming, irrelevant backlink development, exploiting code vulnerabilities on bulletin boards, and other messy practices - could negatively influence the target site's campaigns for years to come. The site owner handed over hundreds of pages of paperwork from the old company detailing their black hat SEO efforts. The sheer amount of data is insurmountable. I took just one week of reports and tracked back the links, only to find that 10% of the accounts were banned, 20% were tagged as abusive, some of the sites had been shut down completely, and there were WOT reports of abusive practices and mentions of the site being blacklisted by bulletin-board control programs. My question is simple: how does one mitigate the negative effects of old black hat SEO efforts and move forward with white hat solutions when faced with hundreds of hours of black gunk to clean up? Is there a clean way to eliminate the old efforts without contacting every site administrator and requesting removal of content/profiles? This seems daunting, but my client is a wonderful person who got in over her head, paying for a service that she did not understand. I'd really like to help her succeed. Craig Cook
http://seoptimization.pro
[email protected]
White Hat / Black Hat SEO | SEOptPro0 -
Ever seen a black hat SEO hack this sneaky?
A friend pointed out to me that a university site had been hacked and used to gain top Google rankings. But it was cloaked so that most users wouldn't notice the hack - only Googlebot and visitors coming from Google SERPs for the spam keywords would see the hacked version. See http://www.rypmarketing.com/blog/122-how-hackers-gained-an-easy-1-google-ranking-using-a-university-website.whtml (my blog) for a screenshot and specifics. I've dealt with hacks before, but nothing this evil and sneaky. Ever seen anything like this? This is not our client; I was just curious whether others had seen a hack like this before.
White Hat / Black Hat SEO | AdamThompson0