Is it a good idea to remove old blogs?
-
So I have a site right now that isn't ranking well, and we are trying everything to help it out. One of my areas of concern is that we have A LOT of old blogs that were not well written and honestly are not overly relevant. None of them rank for anything, and they could be causing a lot of duplicate content issues. Our newer posts, written in more of a Q&A format, are doing better.
So my thought is to basically wipe out all the blog posts from 2010-2012 -- probably 450+ posts.
What do you guys think?
-
You may find this case study helpful, from a blog that decided to do exactly that:
http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
-
It depends on what you mean by "remove."
If the content of all those old blogs truly is poor, I'd strongly consider going through them one by one and seeing how you can re-write, expand upon, and improve each post. Can you tackle the subject from another angle? Are there images, videos, or other visual assets you can add to the post to make it more intriguing and shareable?
Then, you can seek out some credible places to strategically place your blog content for additional exposure and maybe even a link. Be careful here, however. I'm not talking about forum and comment spam, but there may be some active communities that are open to unique and valuable content. Do your research first.
When going through each post one by one, you'll undoubtedly find blog posts that are simply "too far gone" or not relevant enough to keep. Essentially, it wouldn't even be worth your time to re-write them. In this case, find another page on your website that's MOST SIMILAR to the blog post. That may be a match in topic, but it could also be an author page, another valuable blog post, a contact page, etc. Then 301 redirect the crap blog posts to those pages.
Not only are you salvaging what little value those blog posts may have had, but you're also preventing crawl and index issues by telling the search engine bots where that content now lives (assuming it was indexed in the first place).
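If you end up redirecting a few hundred posts, it's worth keeping the old-to-new mapping in a spreadsheet and generating the server rules from it rather than typing them by hand. Here's a minimal sketch in Python, assuming an Apache server and a hand-built redirect_map.csv; the file names and the two-column format are just placeholders for illustration:

```python
import csv

# Assumed input: a hand-built CSV mapping each retired post to its closest
# remaining page, one "old_path,new_path" pair per row, e.g.
#   /blog/2010/thin-post/,/blog/keyword-research-guide/
MAPPING_FILE = "redirect_map.csv"    # placeholder file name
OUTPUT_FILE = "redirects.htaccess"   # rules to paste into your Apache config

with open(MAPPING_FILE, newline="") as src, open(OUTPUT_FILE, "w") as out:
    for row in csv.reader(src):
        if len(row) < 2:
            continue
        old_path, new_path = row[0].strip(), row[1].strip()
        if not old_path.startswith("/"):
            continue  # skip a header row or malformed lines
        if old_path == new_path:
            continue  # avoid accidental self-redirects
        # One permanent (301) redirect per retired post
        out.write(f"Redirect 301 {old_path} {new_path}\n")
```

The same mapping works just as well for nginx rewrite rules or a CMS redirect plugin; only the output format changes.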
This is an incredibly long content process and will likely take you months, especially if there's a lot of content that's good enough to be re-written, expanded upon, and added to. However, making that content relevant and useful is the best thing you can do. It's a long process, but if your best content writers need a project, this would be it.
To recap: 1) Go through each blog post one by one and determine what's good enough to edit and what's "too far gone." 2) Re-write, edit, add to them (content and images/videos), and re-promote them socially and to appropriate audiences and communities. 3) For the posts that were "too far gone," 301 redirect them to the most relevant posts and pages that remain live.
Again, I can say firsthand that this is a LONG process. I've done it for a client in the past. However, the return was well worth the work. And by doing it this way rather than just deleting posts, you're saving yourself a lot of crawl/index headaches with the search engines.
-
we have A LOT of old blogs that were not well written and honestly are not overly relevant.
Wow.... it is great to hear someone looking at their content and deciding they can kick it up a notch. I have seen a lot of people who would never, ever, pull the kill switch on an old blog post. In fact, they are still out there hiring people to write stuff that is really crappy.
If this were my site, I would first check to be sure that I don't have a Penguin or unnatural-links problem. If you think you are OK there, here is what I would do.
-
I would look at those blog posts to see if any of them have any traffic, link, or revenue value. Value is defined as... A) traffic from any search engine or other quality source, B) valuable links, C) views by current website visitors, D) visitors who enter through those pages and generate income through ads or purchases.
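If you want to run that value test across 450+ posts without opening each one by hand, you can join a couple of exports. A rough sketch, assuming you've exported organic landing-page sessions from your analytics tool and referring-domain counts per URL from a link tool; every file and column name below is a placeholder you'd swap for whatever your exports actually contain:

```python
import csv

# Placeholder file and column names; adjust them to match your exports.
sessions = {}      # url -> organic sessions (e.g. over the last 12 months)
with open("analytics_landing_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        sessions[row["landing_page"]] = int(row["sessions"] or 0)

ref_domains = {}   # url -> count of referring domains
with open("link_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        ref_domains[row["url"]] = int(row["referring_domains"] or 0)

with open("old_posts.txt") as f:   # one old post URL per line
    old_posts = [line.strip() for line in f if line.strip()]

for url in old_posts:
    traffic = sessions.get(url, 0)
    links = ref_domains.get(url, 0)
    if traffic == 0 and links == 0:
        verdict = "no measurable value: rewrite, redirect, or remove"
    else:
        verdict = "has value: keep and improve"
    print(f"{url}\t{traffic} sessions\t{links} ref. domains\t{verdict}")
```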
-
If any of them pass the value test above, then I would improve that page. I would put a nice amount of work into it.
-
Next I would look at each of those blog posts and see if any have content value. That means an idea that could be developed into valuable content... or valuable content that could be simply rewritten to a higher standard. Valuable content is defined as a topic that might pull traffic from search or be consumed by current site visitors.
-
If any pass the valuable content test then I would improve them. I would make them kickass.
-
After you have done the above, I would pull the plug on everything else.... or if I was feeling charitable I would offer them to a competitor.

Salutes to you for having the courage to clean some slates.
-
I would run them through Copyscape to check for plagiarism/duplicate content issues. After that, I would check for referral traffic. If some pages draw enough traffic, you might not want to remove them. Finally, round it off with a page-level link audit; Majestic can give you a pretty good idea of where they stand.
The pages that don't make the cut should be set to return 410 status codes. If you still don't like the content on pages with good links and/or referral traffic, 301 those to better content on the same subject.
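Once the 410s and 301s are live, it's worth spot-checking that each retired URL actually returns the status you intended. A quick sketch using the third-party requests library (an assumption on my part; any HTTP client will do), reading a plain text file of the retired URLs, one per line:

```python
import requests  # third-party HTTP client (pip install requests)

with open("old_posts.txt") as f:   # placeholder file name: one retired URL per line
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # Don't follow redirects, so we see the 301 itself rather than its target page
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code == 410:
        print(f"{url} -> 410 Gone, as intended")
    elif resp.status_code in (301, 308):
        print(f"{url} -> {resp.status_code} to {resp.headers.get('Location')}")
    else:
        print(f"{url} -> {resp.status_code} (check this one)")
```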