Unpublish and republish old articles
-
This might be a dumb question, but we had an incident where a new SEO guy thought it would be a good idea to unpublish and republish all of our 200+ blog posts, which we had carefully scheduled over the last 6 months. He did not update the content and did not change anything; his intention was to send Google a signal to recheck the site, or something like that. Now the entire blog looks like it went live in one day, which I don't think is good. Should we load a backup and get our old publishing dates back, or should we keep the new publishing dates? What are the consequences? Will it affect our SEO?
-
You guys are awesome!! Thank you so much! That is exactly what we thought! Thank you for confirming; we loaded a backup.
-
From a user standpoint, you should definitely roll back to the old dates. Otherwise it's going to raise suspicion as to why all the posts were published on the same date, and for current-event articles, having the wrong date is not helpful at all for readers looking at the relevant time frame.
If the content had been updated, on the other hand, then it would be worth changing the publish date on each article to the date it was updated. And the article would probably be all the better for it, in terms of SEO!
However, as things stand, I don't think it will have any noticeable effect on SEO whether the dates are as they were or as they are now. But think of the users.
Related Questions
-
Google displays multiple titles for same article. What does this mean?
I've linked to some screenshots so that what I'm talking about makes more sense. Sometimes, when I perform a search, I see an article with the correct article title listed as the page title in the SERPs. Other times, I see the wrong page title: a generic somethin' or other written by my client's web design company with a bunch of keywords thrown in. The latter (not the correct article title) also appears at the top of the browser tab for every article on my client's site. I know this is bad, but what can be done about it? This would never happen if my client used WordPress or some easily modifiable CMS, but they're using a proprietary one maintained by the group that designed the website.
Technical SEO | Greenery0 -
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects and, in some cases, simply removed the content because there was no place to redirect to. Unfortunately, all these pages still appear in Google's SERPs (not Bing's), both the 301'd pages and the pages we removed without redirecting. When you click on the redirected pages in the SERPs you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to on our domain. We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input: 1) remove the 301 redirects entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress), which means anyone visiting those URLs won't be forwarded along, but Google may not drop the redirects from the SERPs otherwise; 2) request that Google temporarily block those pages (done via GWMT), which lasts for 90 days; 3) update robots.txt to block access to the redirecting directories. Thank you. Rosemary
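As an illustration of the third option only (the directory names below are placeholders, not taken from the original post), a robots.txt rule blocking crawler access to the old directories might look like this:

    # Sketch only: hypothetical directory names standing in for the removed
    # non-corporate sections; blocks all crawlers from requesting those paths.
    User-agent: *
    Disallow: /old-campaigns/
    Disallow: /legacy-microsite/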
Technical SEO | RosemaryB3 -
Redirecting old Sitemaps to a new XML
I've discovered a ton of 404s from Google's WMT crawler looking for mydomain.com/sitemap_archive_MONTH_YEAR. There are tons of these monthly archive XMLs. I've used a plugin that for some reason created individual monthly archive XML sitemaps, and now I get 404s. Creating rules for each archive seems a bad solution. My current sitemap plugin creates a single clean one, mydomain.com/sitemap_index.xml. How can I create a redirect rule in the Redirection WP plugin that will redirect any URL containing the 'sitemap' and 'xml' strings to my current XML sitemap? I've tried using wildcards like mysite.com/sitemap*.*, mysite.com/sitemap ., mysite.com/sitemap(.), and mysite.com/sitemap (.), but none of the wildcard attempts got the general redirect to work. Is there a way to make this happen with the WP Redirection plugin? If not, is there an htaccess rule, and what would the code be for it? I'm not very fluent with general redirects in htaccess, unfortunately. Thanks!
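Not an answer from the thread, but a minimal sketch of the kind of htaccess rule being asked about, assuming Apache with mod_alias available and archive URLs that begin with /sitemap_archive_ as described above (the exact filename pattern is an assumption):

    # Sketch only: 301 any monthly archive sitemap URL to the current sitemap index.
    # Assumes the archive URLs start with /sitemap_archive_; adjust the pattern if not.
    RedirectMatch 301 ^/sitemap_archive_.* /sitemap_index.xml

The Redirection plugin should accept a similar pattern such as ^/sitemap_archive_.* as the source URL when its regex matching option is enabled, though that is an assumption about the plugin rather than something stated in the thread.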
Technical SEO | IgorMateski0 -
302 redirect used, submit old sitemap?
The website of a partner of mine was recently migrated to a new platform. Even though the content on the pages mostly stayed the same, both the HTML source (divs, metadata, headers, etc.) and the URLs (removed index.php, removed capitalization, etc.) changed heavily. Unfortunately, the URLs of ALL forum posts (150K+) were redirected using a 302 redirect, which was only recently discovered and swiftly changed to a 301. Several other important content pages (150+) weren't redirected at all at first, but most now have a 301 redirect as well. The 302 redirects and 404 content pages had been live for over 2 weeks at that point, and judging by the consistent day-over-day drop in organic traffic, I'm guessing Google didn't like the way this migration went. My best guess would be that Google is currently treating all these content pages as 'new' (after all, the source code changed 50%+, most of the metadata changed, the URL changed, and a 302 redirect was used). On top of that, the large number of 404s it has encountered (40K+) probably also fueled the impression that the site is no longer worthy of traffic. Given that some of these pages had been online for almost a decade, I would love Google to see that these pages are actually new versions of the old pages, and therefore pass on any link juice & authority. I had the idea of submitting a sitemap containing the most important URLs of the old website (as harvested from the Top Visited Pages in Google Analytics, because no old sitemap was ever generated...), thereby re-pointing Google to all these old pages, but presenting them with a nice 301 redirect this time instead, hopefully causing them to regain their rankings. To the best of your knowledge, would that help the problems I've outlined above? Could it hurt? Any other tips are welcome as well.
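To illustrate the idea (the URLs below are invented placeholders, not the partner's actual pages), a one-off sitemap listing the old URLs so Google re-crawls them and discovers the 301s might look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch only: a sitemap of OLD URLs (hypothetical paths) submitted once so
         the crawler revisits them and follows the new 301 redirects -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://www.example.com/Forum/index.php?topic=12345</loc></url>
      <url><loc>http://www.example.com/About/index.php</loc></url>
    </urlset>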
Technical SEO | Theo-NL0 -
How to Stop Google from Indexing Old Pages
We moved from a .php site to a Java site on April 10th. It's almost 2 months later and Google continues to crawl old pages that no longer exist (225,430 Not Found errors, to be exact). These pages no longer exist on the site and there are no internal or external links pointing to them. Google has crawled the site since the go-live, but continues to try and crawl these pages. What are my next steps?
Technical SEO | rhoadesjohn0 -
Is it worth adding schema markup to articles?
I know things like location, pagination, breadcrumbs, video, products, etc. have value when using schema markup. What about things like articles, though? Is it worth all the work involved in having the pages marked up automatically? How does this affect SEO, and is it worthwhile? Thanks, Spencer
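For context on what "marking up an article" usually involves (the headline, dates, and author below are placeholders, not from the question), a JSON-LD Article block embedded in the page might look like this:

    <!-- Sketch only: illustrative Article markup with placeholder values -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example article headline",
      "datePublished": "2015-01-15",
      "dateModified": "2015-02-01",
      "author": { "@type": "Person", "name": "Example Author" }
    }
    </script>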
Technical SEO | MarloSchneider0 -
Does posting an article on multiple sites hurt SEO?
A client of mine creates thought leadership articles and pitches multiple sites to host the article on their sites to reach different audiences. The sites that pick it up are places such as AdAge and MarketingProfs, and we do get link juice from these sources most of the time. Does having the same article on these sites as well as your own hurt your SEO efforts in any way? Could it be recognized as duplicate content? I know the links are great; I'm just wondering if there are any other side effects, especially when no links are provided! Thank you!
Technical SEO | Scratch_MM0 -
Is it worth setting up 301 redirects from old products to new products?
This year we are using a new supplier and they have provided us with a product database of approx. 5k products. About 80% of these products were in our existing database, but once we have installed the new database all the URLs will have changed. There is no quick way to match the old products with the new products, so we would have to manually match all 5k products if we were to set up 301 rules pointing the old products to the new products. Of course this would take a lot of time. So the options are: 1. Is it worth putting in this effort to create the 301 rules? 2. Or are we okay just to delete the old product pages, let the search engines see the 404s and wait for them to index the new pages? 3. Or, as a compromise, should we 301 each old product page to the new category page, as this is a lot quicker for us to do than redirecting to the new product page?
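Purely as an illustration (the product and category paths below are made up, not from the original question), options 1 and 3 might look like this as Apache redirect rules:

    # Option 1 sketch: one 301 per manually matched product (hypothetical URLs)
    Redirect 301 /products/old-blue-widget /products/new-blue-widget

    # Option 3 sketch: send an old product page to its new category page instead
    Redirect 301 /products/old-red-widget /categories/widgets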
Technical SEO | indigoclothing0