Should I delete 100s of weak posts from my website?
-
I run this website: http://knowledgeweighsnothing.com/
It was initially built to get traffic from Facebook. The vast majority of the 1300+ posts are shorter, curation-style posts: basically, I would find excellent sources of information, do a short post highlighting the information, and link to the original source (then post to FB and, hey presto, thousands of visitors coming through my website). Traffic from FB was so amazing at the time that, really stupidly, these posts were written with no regard for search engine rankings.
When Facebook reach etc dropped right off, I started writing full original content posts to gain more traffic from search engines. I am starting to get more and more traffic now from Google etc, but there's still lots to improve.
I am concerned that the shortest/weakest posts on the website are holding things back to some degree. I am considering going through the website and deleting the very weakest older posts based on their quality, backlinks, and PA. This will probably run into the hundreds of posts. Is it detrimental to delete so many weak posts from a website?
Any and all advice on how to proceed would be greatly received.
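For what it's worth, here is the rough kind of script I have in mind for building the cull list. The file names, column headings and thresholds below are just placeholders for a WordPress post export and an analytics landing-page export, so treat it as a sketch rather than anything definitive:

```python
import csv

# Placeholder inputs (not real export formats):
#   posts.csv     -> url, word_count, backlinks, page_authority
#   analytics.csv -> landing_page, organic_sessions_12m
MIN_WORDS = 300      # anything shorter is treated as a thin curation post
MAX_SESSIONS = 10    # effectively no organic traffic over the past year
MAX_BACKLINKS = 0    # only flag posts that nothing links to

def load_sessions(path):
    """Map each landing page to its organic sessions over the past year."""
    with open(path, newline="") as f:
        return {row["landing_page"]: int(row["organic_sessions_12m"])
                for row in csv.DictReader(f)}

def cull_candidates(posts_path, analytics_path):
    """Return URLs that are thin, unvisited and unlinked, i.e. the safest to remove."""
    sessions = load_sessions(analytics_path)
    candidates = []
    with open(posts_path, newline="") as f:
        for row in csv.DictReader(f):
            thin = int(row["word_count"]) < MIN_WORDS
            unvisited = sessions.get(row["url"], 0) <= MAX_SESSIONS
            unlinked = int(row["backlinks"]) <= MAX_BACKLINKS
            if thin and unvisited and unlinked:
                candidates.append(row["url"])
    return candidates

if __name__ == "__main__":
    for url in cull_candidates("posts.csv", "analytics.csv"):
        print(url)
```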
-
This is a very valid question, in my opinion, and one that I have thought about a lot. I even did this before on a site, in a UGC section where there were about 30k empty questions, many of which were a reputation nightmare for the site. We used the parameters of:
- Over a year old
- Has not received an organic visit in the past year
We 410d all of them as they did not have any inbound links and we just wanted them out of the index. I believe they were later 301d, and that section of the site has now been killed off.
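Purely as an illustration of the 410 step (not what we actually ran, which was a rule at the server level), a minimal sketch might look like this, assuming a small Flask app and a hypothetical hard-coded list of retired URLs:

```python
from flask import Flask, abort, request

app = Flask(__name__)

# Hypothetical set of retired URLs; in practice this would be loaded from the
# "over a year old and no organic visits in the past year" export.
RETIRED = {
    "/questions/old-empty-question-1",
    "/questions/old-empty-question-2",
}

@app.before_request
def gone_check():
    # 410 (Gone) tells crawlers the page was removed on purpose and is not
    # coming back, which tends to get it dropped from the index faster than a 404.
    if request.path in RETIRED:
        abort(410)

if __name__ == "__main__":
    app.run()
```

On most stacks the equivalent is a one-line rule in the web server or CMS rather than application code; the point is only that the retired URLs return a 410 status instead of a soft 404 or a blanket redirect.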
Directly after the pages were removed, we saw a lift of ~20% in organic traffic to that section of the site. That maintained, and over time that section of the site started getting more visits from organic as well.
I saw it as a win and went through with it because:
- They were low quality
- They already didn't receive traffic
- By removing them, more of the pages we actually wanted crawled would get crawled.
I think Gary's answer of "create more high quality content" is too simplistic. Yes, keep moving forward in the direction you are, but if you have the time or can hire someone else to do it, and those pages are not getting traffic, then I'd say remove them. If they are getting traffic, maybe do a test of going back and making them high quality to see if they drive more traffic.
Good luck!
-
Too many people are going to gloss over the "In general" part of what Gary is saying.
Things not addressed in that thread:
- If a URL isn't performing for you but has a few good backlinks, you're probably still better off 301ing the page to better content to lend it additional strength (see the sketch after this list).
- The value of consistency across the site; wildly uneven content can undermine your brand.
- Consolidating information to provide a single authoritative page rather than multiple thin and weak pages.
- The pointlessness of keeping non-performing pages when you don't have the resources to maintain them.
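To make the first point concrete, here is a rough sketch of the triage that decision implies. The CSV columns, thresholds and file name are hypothetical; the point is just that backlinks and traffic feed a keep / 301 / 410 decision:

```python
import csv

# Hypothetical columns: url, linking_domains, organic_sessions_12m, redirect_target
# (redirect_target = the stronger page the content could be folded into, if any).
def triage(path):
    decisions = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            links = int(row["linking_domains"])
            visits = int(row["organic_sessions_12m"])
            target = row["redirect_target"].strip()
            if visits > 0:
                decisions.append((row["url"], "keep and improve"))
            elif links > 0 and target:
                decisions.append((row["url"], "301 -> " + target))  # preserve link equity
            else:
                decisions.append((row["url"], "410"))  # nothing worth keeping
    return decisions

if __name__ == "__main__":
    for url, action in triage("thin_pages.csv"):
        print(action + "\t" + url)
```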
-
Haha I read this question earlier, saw the post come across feedly and knew what I needed to do with it. Just a matter of minutes.
You're right though - I would've probably said remove earlier as well. It's a toss up but usually when they clarify, I try to follow. (Sometimes they talk nonsense of course, but you just have to filter that out.)
-
Just pipped me to it
-
Hi Xpers.
I was reading a very timely article today, on essentially this same issue, from Barry Schwartz over at Search Engine Roundtable. He has been following comments from Gary Illyes at Google, who apparently does not recommend removing content from a site to help you recover from a Panda issue, but rather recommends increasing the number of higher-quality pages.
If you are continuing to get more traffic by adding your new larger higher quality articles, I would simply continue in the same vein. There is no reason why you cannot still continue to share your content on social platforms too.
In the past I might have suggested removing some thin/outdated content and repointing it to a newer, more relevant piece, but in light of this article I may now start to think a tad differently. Hopefully some of the other Mozzers will have more thoughts on Barry's post too.
Here is the article fresh off the press today - https://www.seroundtable.com/google-panda-fix-content-21006.html
-
Google's Gary Illyes basically just answered this on Twitter: https://www.seroundtable.com/google-panda-fix-content-21006.html
"We don't recommend removing content in general for Panda, rather add more highQ stuff"
So rather than spend a lot of time on old work, move forward and improve. If there's terrible stuff, I'd of course remove it. But if it's just not super-high quality, I would do as Gary says in this instance and work on new things.
Truthfully, getting Google to recrawl content that's a year, two, or five years old can be tough. If they don't recrawl it, you don't even get the benefit until they do, if there is a benefit at all. Moving forward seems to make more sense to me.
Related Questions
-
If my website does not have a robots.txt file, does it hurt my website's ranking?
After a site audit, I found out that my website doesn't have a robots.txt file. Does this hurt my website's rankings? One more thing: when I type mywebsite.com/robots.txt, it automatically redirects to the homepage. Please help!
Intermediate & Advanced SEO | binhlai0 -
Lazy Loading of Blog Posts and Crawl Depths
Hi Moz Fans, we are looking at our blog and improving the content as much as we can for SEO purposes, but we have hit a bit of a blank in terms of lazy loading implications and issues with crawl depths. We initially introduced lazy loading onto the blog home page to increase site speed, and it works well with infinite scroll, but we were wondering whether this would cause any issues regarding SEO. A lot of the resources online seem to be conflicting and some are very outdated, so some clarification on what is best in terms of lazy loading and crawl depths for blogs would be fantastic! I hope someone can help and give us some up-to-date insights. If you need any more information, I'll reply ASAP.
Intermediate & Advanced SEO | Victoria_0 -
Why does my website disappear for its ranked keywords, then reappear, and so on?
Hello to everyone. In the last 2 weeks my website emorroidi.imieirimedinaturali.it has shown strange behavior in the SERPs: it disappears for its ranked keywords and then reappears, and so on. Here's the chronicle of the last few days:
- 12/6: message in GWT about improvement of the visibility of the website in search
- 12/6: the website disappears for all ranked keywords
- 16/6: the website reappears for all ranked keywords, with some keywords ranking higher
- 18/6: the website disappears for all ranked keywords
- 22/6: the website reappears for all ranked keywords
- 24/6: the website disappears for all ranked keywords...
I can't explain this situation. Could it be a penalty? What kind? Thank you.
Intermediate & Advanced SEO | emarketer0 -
Website completely delisted - reasons?
Hi, I got a request from a potential client because he does not understand why his website cannot be found on Google. I checked and found that the complete website is not listed at all (a complete delisting), except for just one PDF file.
I've checked his robots.txt, but this is OK. I've checked the META robots tags, but they are set to index,follow, so OK so far. I've checked his backlinks but could not find any massive linking from bad pages: just 6 backlinks, and only four of them from designdomains.com, which looks like a link list or so. I've requested access to their GWT account, if available, in the hope of finding more info. Does anyone have a quick idea what else it could be? What could be the issue? I think they got delisted for some bad reason... Let me know your ideas 🙂 THANX 🙂 Sebi
Intermediate & Advanced SEO | TheHecksler
-
Delete or not delete old/unanswered forum threads?
Hello everyone, here is another question for you: I have several forum postings on my websites that are pretty old, so they are sort of "dead discussion" threads. Some of those old discussion threads are still getting good views (but no new postings), and so I presume they may be valuable for some users. But most of them are just answers to personal questions that I doubt anyone else would be interested in. Besides that, many postings are single, unanswered questions still waiting for an answer; forgotten, they just sit there and will probably stay unanswered for years... I don't think this is good for SEO, am I right? How do you suggest approaching this kind of issue on forums or discussion sections of a website? I am eager to know your thoughts on all this. Thank you in advance! All the best, Fab.
Intermediate & Advanced SEO | fablau0 -
Archiving a festival website - subdomain or directory?
Hi guys, I look after a festival website whose program changes year in and year out. There are a handful of mainstay events in the festival which remain each year, but there are a bunch of other events which change each year around the mainstay programming. This often results in us redoing the website each year (a frustrating experience indeed!). We don't archive our past festivals online, but I'd like to start doing so for a number of reasons:
1. These past festivals have historical value: they happened, and they contribute to telling the story of the festival over the years. They can also be used as useful windows into the upcoming festival.
2. The old events (while no longer running) often get many social shares and high-quality links, and in some instances still drive traffic. We try our best to 301 redirect these high-value pages to the new festival website, but it's not always possible to find a similar alternative (so these redirects often go to the homepage).
Anyway, I've noticed some festivals archive their content into a subdirectory, i.e. www.event.com/2012. However, I'm thinking it would actually be easier for my team to archive via a subdomain like 2012.event.com and always use the www.event.com URL for the current year's event. I'm thinking universally redirecting the content would be easier, as would cloning the site/database etc. My question is: is one approach (i.e. directory vs. subdomain) better than the other? Do I need to be mindful of anything when using a subdomain for archival purposes? Hope this all makes sense. Many thanks!
Intermediate & Advanced SEO | cos20300
-
Moving Content To Another Website With No Redirect?
I've got a website that has lots of valuable content and tools, but it's been hit too hard by both Panda and Penguin. I came to the conclusion that I'd be better off with a new website, as this one is going to hell no matter how much time and money I put into it. Had I started a new website the first time it got hit by Penguin, I'd be profitable today. I'd like to move some of that content to this other domain, but I don't want to do 301 redirects as I don't want to pass bad link juice. I know I'll lose all links and visitors to the original website, but I don't care. My only concern is duplicate content. I was thinking of setting the pages to noindex on the original website and waiting until they don't appear in Google's index, then moving them over to the new domain to be indexed again. Do you see any problem with this? Should I rewrite everything instead? I hate spinning content...!
Intermediate & Advanced SEO | sbrault741 -
Redirecting Canonical 301s and Magento Website
I have an issue with a client's website: it has 3700+ pages, but roughly half of them are duplicates. Thankfully, the only difference between the originals and the duplicates is the "?print" at the end of each URL (I suppose this is Magento's way of making a printable version of the same page; I don't know, I didn't build it). My question is, how can I get all the pages like http://www.mycompany.com/blah.html?print to redirect to pages like http://www.mycompany.com/blah.html? Also, do they NEED to be canonical, or will a 301 redirect be sufficient? And after having done this, if anybody knows, is there a way I can turn that feature off in Magento? We're expanding our product line, and I don't want to have to keep chasing after these "?print" pages after the fact.
Intermediate & Advanced SEO | ClifThompson0