Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
What should be done with old news articles?
-
Hello,
We have a portal website that provides information about the industry we work in. This website includes various articles, tips, info, reviews and more about the industry. We also have a news section that was previously indexed in Google News but has not been for the past few months.

The site was hit by Panda over a year ago, and one of the things we have been thinking of doing is removing pages that are irrelevant or do not provide added value to the site. Some of these pages are old news articles posted over 3-4 years ago that have had hardly any traffic. All the news articles on the site are under an /archive/ folder sorted by month and year, so for example a URL for a news item from April 2010 would be /archive/042010/article-name.

My question is: do you think removing such news articles would benefit the site and help it get out of Panda (many other things have been done on the site as well)? If not, what is the best suggested way to keep these articles on the site so that Google indexes them and treats them well?

Thanks
-
Basically I don't see a reason to remove old news articles from a site, as it makes sense to still have an archive present. The only reason I could think of to remove them is if they are duplicate versions of texts that have originally been published somewhere else, or if the quality is really poor.
-
If the articles are good, then there just might be value to the user. Depending on the niche/industry, those old articles could be very important.
Google doesn't like those pages when you get a lot of impressions but no clicks (so essentially no traffic), or when the "score" is bad (bounce rate: not the Google Analytics bounce rate, but Google's own measure of whether users bounce back to the SERPs).
Since you got hit by Panda, in my opinion, you have two options:
1. Noindex those old pages. Users can still get to them through navigation, site search, etc., but Google won't index them. Google is fine with old, poor, or thin content as long as it's not in the index. I work with a site that has several million pages, 80% of which are noindexed, and everything is fine now (they were also hit by Panda).
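A minimal sketch of option 1: a robots meta tag placed in the head of each old archive page. The `follow` value is optional (it is the default) but is shown here to make explicit that links on the page can still pass value even though the page itself is kept out of the index:

```html
<!-- In the <head> of each old archive page: tells search engines not to
     index the page while still crawling and following its links -->
<meta name="robots" content="noindex, follow">
```

The same directive can also be sent as an `X-Robots-Tag: noindex, follow` HTTP header if editing the page templates is difficult.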
2. Merge those pages into rich, fresh topic pages (see the New York Times topic pages as an example; search for it, and I think there is also an SEOmoz Whiteboard Friday post about it). This is a good approach, and if you manage to merge those old pages with some new content you will be fine. Topic pages are a great anti-Panda tool!
If you merge the pages into topic pages, follow a simple flow:
1. Identify a group of pages that cover the same topic.
2. Identify the page with the highest authority among them.
3. Turn that page into the topic page, keeping its URL.
4. Merge the others into this page (based on your new topic page structure and flow).
5. 301 redirect the others to this one.
6. Build a separate XML sitemap with all those topic pages and upload it to WMT. Monitor it.
7. Build some links to some of those landing pages and get at least minimal social signals to a few of them (depending on the number). Build an index-type page listing those topic pages (or some of them, in a user-friendly way) and use it as a target for link building to send the "love".
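Step 6 above could look like this: a standalone sitemap file listing only the topic pages (the URLs here are hypothetical examples, not from the original site), submitted as its own sitemap in Webmaster Tools so the indexation of just these pages can be monitored in isolation:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per topic page; lastmod is optional -->
  <url>
    <loc>http://www.example.com/topics/industry-regulation</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/topics/product-reviews</loc>
  </url>
</urlset>
```

Keeping the topic pages out of the main sitemap and in this dedicated file means the "indexed vs. submitted" count in Webmaster Tools reports on exactly this group of pages.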
Hope it helps - just some ideas.
-
I do think that any site should remove pages that are not valuable to users.
I would look for the articles that have external links pointing at them and 301 those to something relevant. The rest you could simply remove and let them return a 404 status. Just make sure all internal links pointing at them are removed; you don't want to lead people to a 404 page.
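On an Apache server, those 301s could be added to the .htaccess file with the `Redirect` directive from mod_alias. The paths below are hypothetical examples in the /archive/ URL pattern the question describes:

```apache
# 301 old news articles that have earned external links to the most
# relevant live page, so the link value is passed on
Redirect 301 /archive/042010/article-name /guides/relevant-topic-page
Redirect 301 /archive/052010/another-article /guides/relevant-topic-page

# Anything else under /archive/ that is simply deleted (and not
# redirected) will return a 404 by default, which is fine here
```

Each rule maps one old URL to its single most relevant replacement; redirecting everything to the homepage is generally treated like a soft 404.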
You could consider putting /archive/ in your robots.txt file if you think the pages have some value to users but not to the engines, or putting a noindex tag on each page in that section.
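If you go the robots.txt route, the rule for the archive folder described in the question would be:

```
User-agent: *
Disallow: /archive/
```

One caveat: this blocks crawling but does not by itself remove URLs that are already indexed from the results (Google can still list a blocked URL based on links to it). For getting already-indexed pages out of the index, the noindex tag is the more reliable of the two, and it only works if the page is not also blocked in robots.txt, since Google has to crawl the page to see the tag.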
If you want to keep the articles on the site, available to both Google and users, you have to make sure they meet these basic criteria:
- Mostly unique content
- Moderate length
- Good content-to-ad ratio
- Content is the focus of the page (top/center)